MeriTalk – Improving the Outcomes of Government IT

How NASA Is Using Ubiquitous Computing to Bring Mars to Earth

When the term “ubiquitous computing” was first coined in the late 1980s, it was envisioned as a sea of electronic devices “connected by wires, radio waves, and infrared” that are so widely used, nobody notices their presence. By that definition, the era of ubiquitous computing arrived some time ago. Everything from movie theater projectors to plumbing systems has gone digital, paving the way to meaningful, data-driven change in our daily lives.

A lot of people remain skeptical of the Internet of Things (IoT), yet they may not realize that they already depend upon it. While IoT often conjures images of futuristic devices – senselessly superpowered refrigerators or embarrassingly smart toilets – there are good reasons why Americans are buying thousands of programmable appliances every month. Smart devices provide better service to consumers, and the data we can collect from them are invaluable, with the potential to revolutionize everything from traffic management to agriculture.

It also means that the idea of a computer – as some sort of electronic box typically used at a desk – is woefully outdated. In fact, the new supercomputer is the one we have in our pockets: the smartphone, which is rapidly becoming our ever-present intelligent digital assistant. Ubiquitous computing is changing the way we interact with everyday objects. We don’t flip on light switches; we instruct a digital assistant to turn our lights on for us, or, in some cases, we just push a button on our phones. If a smoke alarm goes off, we don’t need to get out the ladder and manually turn it off – we can see the source of the smoke on our phones and silence the alarm with a mere swipe.

The Curiosity rover, the most technologically advanced rover ever built, has 17 cameras and a robotic arm with a suite of specialized lab tools and instruments. Designing and constructing it took 7,000 people nationwide over the course of five years.

At JPL and NASA, one of countless ways ubiquitous computing has woven its way into our work is through augmented reality (AR). Today, if anyone wants an up-close look at Curiosity, they need only use their phones or a pair of smart goggles. JPL built an augmented reality app that allows you to bring Curiosity anywhere – into your own home, conference room, or even backyard. The app lets you walk around the rover and examine it from any angle, as if it were actually there. In addition, scientists can don AR glasses and meet on the surface of Mars to discuss rocks or features – all from their own homes or conference rooms scattered across Earth.

AR may feel like magic to the end user, but it’s not. It’s the culmination of decades of technological advancements. It requires an assortment of sensors (light and motion), cameras, and substantial data processing power – power that only became affordable and available via mobile devices in recent years. In fact, we are now seeing the initial swells of a major ubiquitous computing wave that will hit our shores within the next few years. The entire wireless networking industry is being revolutionized to meet the needs of exponentially more devices communicating with each other and with us all the time. That’s when we all become IoT magicians in our daily lives and when that second brain (the smartphone) fires on all neurons. (More about that in a future blog.)

Now we can use AR for more than just finding Pokémon in the wild – we use it to review and build spacecraft. We can get a detailed look at a vehicle’s hardware without actually taking it apart, or we can see if a hand might fit in a tight space to manually turn a screw. Among the many advantages of augmented reality: It’s cost-efficient for multiple people to work together over a virtual network, and it could easily be used for hands-free safety training, testing, or maintenance.

While AR is dependent on a slew of technologies, perhaps the most critical piece is the cloud. Many AR applications would be cost prohibitive without the supercomputing power available over the cloud. Based on our experience at JPL, we estimate that serverless computing can be up to 100 times less expensive than N-tier server-based computing. Not surprisingly, we’re now starting to use serverless computing as often as we can. What it really means is that we don’t have to worry about how a problem is solved; we just have to worry about what problems we’re solving. And that’s a powerful position to be in.
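To illustrate what “not worrying about how” looks like in practice, here is a minimal, hypothetical sketch of a serverless function in the AWS Lambda style; the event fields and filtering logic are invented for illustration and are not part of JPL’s actual AR pipeline.

```python
import json

def handler(event, context):
    """Hypothetical Lambda-style entry point: receives a batch of telemetry
    frames and returns only the ones worth rendering in an AR client.
    There are no servers to provision -- the cloud provider scales
    invocations on demand."""
    frames = event.get("frames", [])
    # Keep frames whose change score exceeds a threshold; treat the rest as noise.
    interesting = [f for f in frames if f.get("change_score", 0.0) > 0.5]
    return {
        "statusCode": 200,
        "body": json.dumps({"kept": len(interesting), "total": len(frames)}),
    }

if __name__ == "__main__":
    # Local smoke test with a made-up event payload.
    sample = {"frames": [{"id": 1, "change_score": 0.9}, {"id": 2, "change_score": 0.1}]}
    print(handler(sample, context=None))
```

The developer writes only the function body; capacity planning, patching, and scaling are the provider’s problem, which is the cost and focus advantage the paragraph describes.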

Tom Soderstrom is the IT Chief Technology and Innovation Officer at the Jet Propulsion Laboratory (JPL). He leads a collaborative, practical, and hands-on approach with JPL and industry to investigate and rapidly infuse emerging IT technology trends that are relevant to JPL, NASA, and enterprises.

The Rise of Intelligent Digital Assistants

Before we can have a rational discussion about artificial intelligence (AI), we should probably cut through the hysteria surrounding it. Contrary to popular opinion, we’re not talking about machines that are going to rise up, steal our jobs, and render humans obsolete.

We already rely on AI in countless daily tasks. The AI embedded in your email, for example, identifies and blocks spam; it reminds you to include an attachment if you’ve forgotten it; it learns which sorts of emails are most important to you and it pesters you to respond to an email that’s been neglected in your inbox for a few days. Chatbots, robot vacuums, movie recommendation engines, and self-driving cars already have AI built into them. Right now, its impact may be subtle. But, in the future, AI could save lives by, for example, detecting cancer much earlier so treatments can be that much more effective.

Why is AI exploding right now? A unique mix of conditions makes this the right time to surf the AI technology wave. They include the availability of supercomputing power, venture capital, oceans of data, open source software, and programming skills. In this sort of environment, you can train any number of algorithms, prep them and deploy them on smart computers in a very short time. The implications for most major industries – everything from medicine to transportation – are tremendous.

At NASA JPL, we’ve integrated AI into our mission in a variety of ways. We use it to help find interesting features on Mars, predict maintenance needs for our antennas, and anticipate problems with our spacecraft. We’ve also come to depend upon intelligent digital assistants in our day-to-day operations. After a brainstorming session about how to experiment with applied intelligent assistance, we decided to throw AI at a mundane – but time-consuming – daily challenge: finding an available conference room. Later that night, we built a chatbot.

Chatbots are fast and easy to build, and they deliver value almost immediately. We build ours to be easy to access, with natural user interfaces. We then add the deeper (and more complex) AI on the back end, so the chatbots get smarter and smarter over time. You can text them, type to them, or speak to them through Amazon Alexa or Lex, and we collect user feedback to constantly improve them. Every “thumbs up” or “thumbs down” helps improve the next iteration, and we can mine the thumbs-down ratings to see which areas need the most work. We now have upwards of 10 emerging chatbots used for functions like acquisitions, AV controls, and even human resources. By thinking of this as a “system of intelligence,” we can extend the life of – and get more value from – legacy systems by teaching them how to respond to deeper questions. While applied artificial intelligence can conquer any number of menial tasks, it’s bound to have a significant effect on some of our bigger challenges, such as space exploration, creating new 3D-printable materials, medicine, manufacturing, security, and education.
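To make the feedback loop concrete, here is a minimal sketch of how thumbs-up/thumbs-down ratings might be tallied per chatbot intent so the weakest areas surface first; the intent names and in-memory store are hypothetical, not JPL’s actual implementation.

```python
from collections import defaultdict

# Hypothetical in-memory feedback store: intent name -> [thumbs_up, thumbs_down]
feedback = defaultdict(lambda: [0, 0])

def record_feedback(intent: str, thumbs_up: bool) -> None:
    """Log a single user rating for the answer a chatbot intent produced."""
    feedback[intent][0 if thumbs_up else 1] += 1

def weakest_intents(min_votes: int = 5):
    """Return intents sorted by thumbs-down rate, so the team knows what to fix next."""
    scored = []
    for intent, (up, down) in feedback.items():
        total = up + down
        if total >= min_votes:
            scored.append((down / total, intent))
    return [intent for rate, intent in sorted(scored, reverse=True)]

if __name__ == "__main__":
    for _ in range(4):
        record_feedback("find_conference_room", thumbs_up=True)
    record_feedback("find_conference_room", thumbs_up=False)
    for _ in range(5):
        record_feedback("av_controls", thumbs_up=False)
    print(weakest_intents())  # ['av_controls', 'find_conference_room']
```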

AI has especially rich potential in the federal government, where one small operational improvement can have an exponential effect. If you work in a large agency and you’re unsure of how to approach AI, you can start experimenting in the cloud with any number of machine learning tools and services (such as TensorFlow and Amazon SageMaker). Chatbots are a natural starting point – they deliver value right away, and the longer you use them, the smarter and more effective they become. In the cloud, experimentation is inexpensive and somewhat effortless. The goal, after all, is to have AI work for you, not the other way around.
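As a hedged illustration of how small such a first experiment can be, the sketch below trains a tiny TensorFlow/Keras classifier on made-up data; the dataset and model are placeholders for whatever an agency would actually explore in a cloud notebook.

```python
import numpy as np
import tensorflow as tf

# Toy dataset: 1,000 samples with 8 features and a binary label.
rng = np.random.default_rng(seed=0)
x = rng.normal(size=(1000, 8)).astype("float32")
y = (x.sum(axis=1) > 0).astype("float32")

# A small feed-forward classifier -- the kind of experiment that costs
# very little to run on a cloud notebook instance.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(8,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(x, y, epochs=5, batch_size=32, verbose=0)
print(model.evaluate(x, y, verbose=0))  # [loss, accuracy]
```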

Tom Soderstrom is the IT Chief Technology and Innovation Officer at the Jet Propulsion Laboratory (JPL). He leads a collaborative, practical, and hands-on approach with JPL and industry to investigate and rapidly infuse emerging IT technology trends that are relevant to JPL, NASA, and enterprises.

 

The Next Tech Tsunami Is Here: How to Ride the Wave

Our lives are very different today than they were even 15 years ago. We walk around with computers in our pockets and do things like share our locations with loved ones via satellite technology. Still, the changes we’ve experienced so far are nothing compared to the tech tidal wave that’s approaching.

A convergence of computing advancements ranging from supercomputers to machine learning will transform our daily lives. The way we learn, communicate, work, play, and govern in the future will be very different.

The question most IT decision makers are asking themselves right now is what they should do to prepare for the technology waves that are barreling at them. The short answer: Jump in. The only way to stay in the game is to play in the game. A certain amount of failure is inevitable, which is why it’s critical to experiment now and adapt quickly. Given the rate at which technology is evolving, anyone who takes a wait-and-see approach in hopes of finding a perfect, neatly wrapped, risk-free solution will be left out entirely. Or, alternatively, they will be left with a hefty bill for the dubious pleasure of playing catch-up.

The good news: Anyone with a cloud strategy is part-way there. The future of technology is the cloud. Every significant emerging advancement is predicated on it. Without it, the tech revolution would cost too much or take too long. The only practical and affordable way to store, collect, and analyze the massive volumes of data we’ll be bombarded with–data that’s generated by everything from smart watches to self-driving cars–will be through a cloud-based computing system. The tools we will use to make sense of our modern lives will be deployed in the cloud, because it’s the simplest, fastest, and most affordable way to use them.

In this series, we’re going to highlight the six technology waves that we expect will make the biggest impacts in the coming years. They are: Accelerated Computing, Applied Artificial Intelligence, Cybersecurity, Software-Defined Everything, Ubiquitous Computing, and New Habits. Tech waves are plentiful, but not every wave is meant to be surfed by every organization. This series aims to help you distinguish which ones are right for you.

Tom Soderstrom is the IT Chief Technology and Innovation Officer at the Jet Propulsion Laboratory (JPL). He leads a collaborative, practical, and hands-on approach with JPL and industry to investigate and rapidly infuse emerging IT technology trends that are relevant to JPL, NASA, and enterprises.

 

CDM is Evolving. What’s Next?

OMB’s Risk Report raised the cyber flag again – 71 of 96 Federal agencies are missing “fundamental cybersecurity policies” or have “significant gaps” in their cybersecurity programs. There is no silver bullet, but the Continuous Diagnostics and Mitigation (CDM) program provides an opportunity to continually improve cybersecurity risk postures across the federal government.

During CDM’s first phases, agencies identified the assets and users on the network. Coming up next, the third and fourth phases will focus on proactively identifying cybersecurity risks. The goal is to close the gaps and ultimately enable agencies to prevent attacks.

As agencies work to secure on-premises, cloud, and hybrid environments, CDM is evolving – drawing on lessons learned around procurement, risk scoring, and the importance of visibility into cybersecurity maturity.

DEFEND and AWARE Move Toward Perfecting the Process

DEFEND – Dynamic Evolving Federal Enterprise Network Defense – aims to improve acquisition practices across all CDM phases. DEFEND will expand task orders, increase contract ceilings, and require integrators to consider data quality from the start of each contract to minimize inconsistencies among vendors. Importantly, the new acquisition strategy uses a cost-plus method, encouraging vendors to achieve all requirements.

DHS is also developing the Agency-Wide Adaptive Risk Enumeration (AWARE) scoring algorithm. AWARE will help agencies assess key cyber risk issues, including patching and network configuration, to determine the most critical vulnerabilities. The goal is to enable agencies to address the highest-priority threats first, track progress, and manage mitigation. In the future, AWARE will serve as a consistent, objective risk measurement tool for monitoring and comparing cyber exposure, providing strategic insight across agencies and departments.
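The AWARE formula itself is not spelled out here, but the general idea of scoring and ranking vulnerabilities can be sketched as follows; the weights, fields, and host names are purely illustrative assumptions, not the DHS algorithm.

```python
from dataclasses import dataclass

@dataclass
class Finding:
    host: str
    cvss: float          # severity score, 0-10
    days_unpatched: int
    internet_facing: bool

def risk_score(f: Finding) -> float:
    """Toy risk score: severity, amplified by how long the flaw has sat unpatched
    and whether the host is reachable from the internet. NOT the AWARE formula."""
    age_factor = 1.0 + min(f.days_unpatched, 90) / 30.0   # caps at 4x after 90 days
    exposure = 2.0 if f.internet_facing else 1.0
    return f.cvss * age_factor * exposure

findings = [
    Finding("web-01", cvss=9.8, days_unpatched=45, internet_facing=True),
    Finding("db-07", cvss=7.5, days_unpatched=120, internet_facing=False),
    Finding("dev-03", cvss=4.3, days_unpatched=10, internet_facing=False),
]

# Highest score first = patch first.
for f in sorted(findings, key=risk_score, reverse=True):
    print(f"{f.host:8s} {risk_score(f):6.1f}")
```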

Raising the Grades

The CDM program is designed to shine a light on cyber best practices and help agencies make better decisions by identifying reliable solutions, taking the guesswork out of acquisitions. Leveraging these tools, agencies can meet FISMA cyber requirements – key to improving grades in the newest category of the FITARA scorecard. For example, 18 out of 24 agencies received a D or F in this category on the most recent scorecard, so there is plenty of work ahead.

One Part of a Larger Conversation

Highlighting the importance of these efforts, over the summer Rep. John Ratcliffe (R-Texas) introduced new CDM legislation “to advance and modernize” the program. If passed, the bill would ensure agency tools remain current as technology advances.

At Dell EMC, we believe CDM efforts must be a top priority for every agency. We view CDM as a permanent capability, not just a program that ends when the phases are complete. CDM is not a compliance checklist, but part of the larger cybersecurity conversation that includes technology, strategies, and best practices. Taking advantage of CDM is a vital step toward shifting the stance from reactive to proactive – preventing cyber breaches and data loss.

As a leader across the full cybersecurity ecosystem, Dell EMC supports federal customers within and beyond the CDM phases. We deliver comprehensive data protection solutions, and we are focused on all aspects of the cyber landscape, including the full supply chain and the importance of building NIST- and ISO-compliant hardware.

We know the key word is “continuous.” And continuous cyber success drives continuous mission success. As we move forward together, there is a real opportunity to strengthen our cyber defenses and progress within, and beyond, the CDM program.

Read more about our data protection solutions.

By: Jean Edwards, Managing Director, Business Development, Civilian Agencies, Federal Strategic Programs, Dell EMC

A Golden Age of Government Innovation, Courtesy of The Cloud

Federal agencies are not often known as cradles of innovation, but the adoption of the cloud has helped usher in a new era of government–one that improves transparency, cost efficiency and public engagement.

Technology changes constantly, but many bureaucracies do not. In some cases, government organizations that have adopted the cloud still mainly use it as a glorified filing cabinet. Others, however, are finding new and innovative uses for the virtual infrastructure that have made meaningful changes in people’s lives.

Case in point: In the Virginia Beach metropolitan area, the average elevation hovers around 12 feet above sea level. The region, home to the world’s largest naval base, Naval Station Norfolk, is expected to see sea levels rise another foot by 2050, according to one report, and the estimated cost of flooding has already surpassed $4 billion. With the sea level up an astonishing 14 inches since 1950, flooding is a persistent problem.

These days, it doesn’t take much rain to cause nuisance flooding. While researchers and engineers are exploring long-term solutions, the City of Virginia Beach helped roll out StormSense, a cloud-based platform that uses a network of sensors to help forecast floods up to 36 hours in advance of major weather events.

A related public initiative, dubbed “Catch the King,” was no minor undertaking. It was meant to encourage “citizen scientists” to use their GPS-enabled devices to help map the King Tide’s maximum inundation extents. The data allows researchers to create more accurate forecast models for future floods, determine new emergency routes, reduce property damage, and identify future flood-mitigation projects.

Volunteers–all 500 of them–were directed to different flooding zones along the coast during the King Tide on Nov. 5, 2017. (A “King Tide” is a particularly high tide that tends to occur twice a year, driven by the combined gravitational pull of the sun and moon.) As the King Tide came in, volunteers walked along designated points in 12 Virginia cities, saving data in a mobile phone app every few steps. Ultimately, more than 53,000 time-stamped GPS flooding-extent measurements were taken, along with 1,126 geotagged photographs of the King Tide flooding.

Not only was the initiative a scientific success, it was incredibly cost effective. Without the help of the hundreds of volunteers, it might have been cost prohibitive to tackle.

“It’s really, really amazing to see all these people out there mapping all over the region,” Skip Stiles, executive director of Wetlands Watch, a Norfolk, Virginia-based organization, told The Virginian-Pilot. “It would take numerous tide gauges costing thousands of dollars each to gather the data collected for free by so many volunteers.”

The initiative, which was backed by seven cities, a county, and a handful of academic partners, began when Virginia Beach’s Public Works department decided to fund a USGS installation of 10 new water-level sensors and weather stations to measure water level, air pressure, wind speed and rainfall. Later, as part of the StormSense initiative, an additional 24 water-level sensors were installed, bringing the total to 40, up from the original 6 sensors that were in place in 2016.

The benefits of the data collection have not been entirely realized yet, but the region is already better prepared for future floods because StormSense operates on a cloud platform, which allows emergency responders to access flooding data from any mobile device. The City of Virginia Beach has also been working to develop an Amazon Alexa-based Application Programming Interface (API) so residents can ask an Amazon Alexa-enabled device for timely and accurate flooding forecasts or emergency routes.
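As a rough illustration of what such a skill backend might look like, here is a minimal sketch of an Alexa-style handler that maps a requested neighborhood to a canned forecast; the intent structure, neighborhood names, and forecast table are hypothetical stand-ins for calls to the real StormSense data.

```python
import json

# Hypothetical lookup -- a real backend would query StormSense's cloud-hosted
# data rather than a hard-coded table.
FORECASTS = {
    "ghent": "Minor street flooding is expected near Ghent after 6 p.m.",
    "oceanfront": "No flooding is forecast at the Oceanfront in the next 36 hours.",
}

def lambda_handler(event, context):
    """Sketch of an Alexa skill backend: pull the neighborhood slot out of the
    request and answer with the matching forecast."""
    slots = event["request"]["intent"]["slots"]
    neighborhood = slots["neighborhood"]["value"].lower()
    text = FORECASTS.get(neighborhood, "I don't have a forecast for that area yet.")
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": text},
            "shouldEndSession": True,
        },
    }

if __name__ == "__main__":
    sample = {"request": {"intent": {"slots": {"neighborhood": {"value": "Ghent"}}}}}
    print(json.dumps(lambda_handler(sample, None), indent=2))
```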

Not long ago, the notion that the government could use the internet to communicate directly with citizens, in real-time, with pertinent, useful or critical information, might have seemed like a pipe dream. These days, thanks to innovative new initiatives, it’s a reality.

HCI – The Key to Unlocking Data Center Modernization?

The most recent FITARA scorecard – Scorecard 6.0 – is in, and the results aren’t pretty. Since the last scorecard in November 2017, 11 agencies’ grades have declined while only six showed improvement. The Department of Defense received its third straight “F,” while eight other agencies were perilously close to failing, with “D” grades.

But it’s not all bad news, as nine of the 24 rated agencies received either an “A” or a “B” on the Data Center Optimization Initiative (DCOI) portion of the scorecard, which examines agencies’ progress against savings goals, as well as performance in five areas: energy metering, power usage effectiveness, virtualization, server utilization/automation, and facility utilization. There were five “F’s”, but that’s a big step forward from the first scorecard, released in November 2015, when 15 agencies received failing grades.

How can agencies continue to progress? It’s a complex challenge, as they must find a way to bridge the gap between today’s aging, multi-component infrastructures and the modern environments of tomorrow. Hyper-converged infrastructure (HCI) might provide a key.

Hyper-Speed to Hyper-Converged

HCI integrates compute, storage, networking, and virtualization technologies – helping agencies move away from siloed systems toward more consolidated, modern, and secure data center infrastructures. Not only does this approach allow agencies to achieve a smaller data center footprint, it also consolidates IT management to a single pane of glass, eliminating the need to manage multiple components within the data center. This is a key benefit in a market with an aging workforce and stiff competition with the private sector for the best IT talent.

HCI also provides true flexibility and scalability – agencies can expand only when needed, rather than making heavy up-front investments in hardware that goes unused at the beginning of a project – enabling them to move from a reactive to a proactive stance by creating an environment with more repeatable and predictable outcomes.

Additionally, HCI adoption could accelerate other modernization efforts – such as automation, advanced analytics, and more – making it possible for federal agencies to truly transform their data centers, resulting in improved security, greater efficiency, and more opportunity for innovation.

For the first time, FITARA Scorecard 6.0 added a new category to evaluate agencies’ progress with the Modernizing Government Technology (MGT) Act, which was signed into law in December 2017. MGT’s centralized revolving capital fund ($100 million allocated this fiscal year) allows agencies to compete for funding for their transformation initiatives. Even more significant, MGT enables agencies to establish their own working capital funds so they can reinvest savings generated in one year across three-year IT modernization initiatives. Previously, money saved in one year was money lost in the next year’s budget.

Agencies with an MGT-specific working capital fund and a CIO in charge of decision-making would receive an “A.” No one is there yet, but three agencies earned a “B” for demonstrating that their efforts to implement a working capital fund in 2018 or 2019 are sincere and in progress. As agencies set up these funds, they would be wise to funnel some of the money into HCI initiatives, and then channel the savings achieved from those programs back into the fund for future transformation initiatives. Our recent research with IDC has shown that organizations can achieve a 6x ROI over five years with HCI solutions.

Full Speed Ahead – A Team Effort

As agencies begin their journey toward HCI adoption, they may face some challenges or concerns around the network requirements. Some federal environments may have very strict, outdated networking guidelines, which can be too restrictive for quick, easy adoption.

To overcome this challenge, agencies must engage the networking teams from the very beginning of the process and work together to identify any networking roadblocks before implementation. By identifying and addressing these issues at the outset, agencies lay the groundwork for successful adoption.

Early successful HCI implementations have centered around virtual desktop deployments, but agencies are just beginning to scratch the surface of the opportunity as they start to move mission-critical systems to HCI environments. Overall, HCI will vastly improve government IT efficiency – freeing up time previously spent configuring individual compute, storage, and network elements and making sure all the pieces worked together, so that time can now be spent on strategic programs that directly drive innovation in the data center and beyond.

By: Jeremy Merrill, Global VxRail & VxRack SDDC Product Technologist, Dell EMC | Server and Infrastructure Systems

Modernizing Service Delivery with Multi-Cloud

FITARA, MGT, the latest IT Executive Order: all of these mandates underscore the same theme – our current approach to government IT isn’t working. Agencies are spending 80% of their IT budgets maintaining legacy systems. As data volumes skyrocket, cybersecurity threats proliferate, and employees and constituents demand near real-time access to information, this model is simply not sustainable.

As agency CIOs work to transition to a modern infrastructure that is more secure and efficient while reducing duplicative, costly IT systems, all roads seem to lead to the cloud. But, the journey isn’t always easy. Obviously, not all workloads are suitable for public cloud consumption. And, even for those that are, challenges abound in classifying, protecting, and migrating data. Most agencies are deploying multi-cloud environments – some combination of public, private, and hybrid clouds – to address their unique needs while still enabling them to benefit from improved information sharing, innovation, and “anytime, anywhere” access.

As agencies make the transition to a multi-cloud environment, here are three things they need to consider:

1) Is Our Infrastructure Modern and Ready?

As agencies transition to a multi-cloud approach, they must ensure the technology and underlying infrastructure in their current data centers is modern and ready for transformation. Agencies cannot layer cloud technology on top of an existing infrastructure that is not designed to work well in a cloud environment.

To get there, the underlying data center architecture must be cloud-native, scalable, resilient, and trusted, so agencies can deliver services to their users that are relevant and secure. In addition, it needs to seamlessly interoperate with and perform like public cloud offerings from Amazon, Microsoft, Google, and others, as well as be able to support new cloud aware applications and workloads.

Finally, agencies need to keep inevitable scale in mind. Infrastructure should be modular – so it can start small and scale at web speed. It should be built to support both legacy and newly developed applications, while fully integrating with the agency’s software-defined cloud management strategy.

2) How Can We Evaluate and Transform our Applications?

Agencies adopting a multi-cloud approach must match their mission critical workloads to the most optimal environment based on mission characteristics. Accomplishing this requires deep analysis and rationalization of current applications and the optimization or re-platforming of applications for their desired target environment.

By assessing the legacy environment, agencies can quickly determine which applications can immediately move to their ideal cloud environment so they can shift their focus to the larger roadmap of application migration efforts that will require new application development or a complete re-write. This rationalization process aligns with agencies’ data center consolidation efforts. They must right-size their on-premises cloud infrastructure as they plan their public or off-premises cloud strategy.

The second part of a successful application transformation requires the adoption of agile DevOps methodologies and platforms. Agencies should seek integrated, turnkey solutions that incorporate developer tools with engineered solutions built for multi-cloud, so they can develop and deploy agile, modern applications built to run in a multi-cloud environment, moving to an agile model of continuous code delivery.

3) Are Our Processes and People Up to the Task?

Finally, the most critical aspect agencies need to address is transforming and automating their current IT processes and service delivery.

As agencies move away from siloed IT services and toward a software-defined infrastructure that is more agile, they need more integration between teams. This requires identifying new roles and swim lanes, as well as developing new skill sets. Many multi-cloud service delivery efforts falter because of improper planning and implementation of new processes end-to-end – from the end-user to the IT stakeholders to the Cloud Management Platform.

Agencies can’t put new technologies and services in place to protect an old way of doing business; they must develop a new operating model that allows them to manage multiple infrastructure offerings as one single set of services. Success depends on an organization’s ability to plan with this future organizational state as the end goal, and to partner with IT delivery partners who can help them get there.

Multi-Cloud, Multiple Benefits

When evaluating suitable cloud solutions, today’s federal agencies recognize that a wide variety of capabilities are available across the various cloud service providers. As a result, many agencies are adopting a multi-cloud approach to gain the flexibility to match workloads and applications to targeted cloud environments that will optimize performance and capacity and drive down costs. This is the greatest benefit of multi-cloud. By modernizing infrastructure, adopting sound application transformation strategies, and developing new ITSM processes, agencies gain greater visibility into what’s running where and what they need to effectively support their programs within required timeframes.

Agencies that adopt these multi-cloud models – effectively modernizing infrastructure, transforming the people and processes that support multi-cloud IT models, and rationalizing applications to retire, rebuild, replace, or migrate them – become more relevant to the missions they support. What used to take months can now be delivered in days, hours, or even minutes, which truly delivers on the promise of transforming government.

By: Diane Santiago, Director, Federal Presales Engineering, Converged Platforms & Solutions Division, Dell EMC

Derailing Old-School Asset Maintenance

Eighteen years into the 21st century, one of the most important systems contributing to our economy and quality of life remains stuck in the past. Transportation infrastructure, which delivers us from point A to point B and back, has yet to catch up with the digital revolution.

It does not have to be that way. The advent of the Internet of Things (IoT) and cloud computing makes the time right to digitize crucial maintenance functions. A technology-based solution is safer and will save time and resources.

A Time article on how technology can better help maintain America’s infrastructure calls out the Washington, D.C. subway maintenance catastrophe as an illustration of the need for change. The technology exists to revolutionize transportation maintenance, and that means creating a high level of integration among data management, analysis, and deployment environments. It is not a leap to say this technology can help build a crash-free system that is always in service.

Eliminating Paper

Previously, transit agencies relied solely on paper-based tracking procedures, using forms and spreadsheets to monitor and manage critical assets. That is not sustainable, especially for Federally funded agencies that typically store seven years’ worth of data. Paper-based systems are also inefficient and costly. Trying to find a specific document quickly in the case of a safety inspection or funding review can be nearly impossible. This manual process is susceptible to inaccuracies and risk. Without a system in place to unify how data is recorded and analyzed, each inspector can have his or her own way of describing things. The resulting information is open to interpretation, enabling systematic irregularities and human error.

Saying Goodbye to Siloes

There has been exciting innovation in using wireless sensors attached to crucial parts of assets–including engines, brakes, and batteries. Those sensors then feed performance information directly into a centralized system. Thresholds and parameters are preset, while APIs collect and process the data into workable and actionable information.
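A minimal sketch of that threshold-based pattern is shown below; the sensor names, limits, and asset IDs are invented for illustration and are not tied to any specific transit system or product.

```python
from dataclasses import dataclass

@dataclass
class Reading:
    asset_id: str
    sensor: str      # e.g. "brake_temp_c", "battery_v"
    value: float

# Hypothetical preset thresholds -- (low, high) acceptable range per sensor type.
THRESHOLDS = {
    "brake_temp_c": (None, 250.0),
    "battery_v": (11.8, 14.8),
}

def needs_maintenance(reading: Reading) -> bool:
    """Flag a work order only when a reading crosses its configured threshold,
    instead of pulling healthy assets out of service on a fixed calendar."""
    low, high = THRESHOLDS.get(reading.sensor, (None, None))
    if low is not None and reading.value < low:
        return True
    if high is not None and reading.value > high:
        return True
    return False

if __name__ == "__main__":
    stream = [
        Reading("bus-214", "brake_temp_c", 310.0),   # over temperature -> flag
        Reading("bus-214", "battery_v", 12.6),       # healthy -> ignore
    ]
    for r in stream:
        if needs_maintenance(r):
            print(f"Open work order: {r.asset_id} {r.sensor}={r.value}")
```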

Aside from eliminating error-prone and time-consuming manual processes, deploying this system in the cloud streamlines and integrates information across the organization. This real-time data helps not only to predict, but to pinpoint maintenance needs. That means avoiding removing perfectly working assets from the system.

With intelligent, real-time data, maintenance issues are addressed only when there is an actual problem or a threshold has been crossed – reducing mechanic time and parts purchases. Lastly, this optimizes traveler communication: by monitoring the state of vehicles and scheduling maintenance accordingly, agencies can keep riders informed about transport schedules and locations.

That is what it comes down to: living in a connected society, where immediacy and information access are constant. Technology is here to bring our aging infrastructure systems into the digital world, and our antiquated maintenance system is a smart place to start.

 

Kevin Price, Technical Product Evangelist and Product Strategist, Infor EAM

 

 

DevOps in Government: Balancing Velocity and Security

The Federal government isn’t known for its progressive approach to IT infrastructure, and agencies aren’t usually early tech adopters. Yet, agencies are increasingly deploying cutting-edge DevOps methodologies to achieve agility and reduce operating costs.

The Department of Homeland Security, General Services Administration, Environmental Protection Agency and Veterans Affairs are among those breaking the mold. They’re modernizing IT infrastructures, and taking strong steps forward in the digital transformation journey.

But that doesn’t come without risk. Several government agencies and organizations–including the National Security Agency (NSA), Pentagon, Republican National Committee and others–have experienced firsthand what can go wrong when environments aren’t properly secured.

The NSA, for example, made headlines late last year when it left top secret data publicly accessible to the world in an Amazon Web Services (AWS) storage bucket. A simple misconfiguration attributed to human error was to blame.

The truth of the matter is DevOps tools often have interfaces designed for human users, and misconfigurations are all too easy and common. Some of the most notable breaches can be traced back to misconfigurations of the perimeter, making it all the more important that security controls are implemented across identities and environments.

Introducing Risk Through An Expanded Attack Surface

As agencies adopt new cloud and DevOps environments, they expand their attack surfaces, creating heightened levels of risk. To mitigate risk against internal and external threats, agencies need to continuously monitor privileged account sessions across every aspect of their network–including DevOps.

The DevOps pipeline comprises a broad set of development, integration, testing and deployment tools, people and resources, so it only makes sense that the attack surface grows alongside IT network expansion. This expanded attack surface is primarily propagated by the increase in privileged account credentials and secrets that are created and shared across interconnected access points. Agencies need to secure these non-human identities just like they would a human identity. Robotic actors can be compromised, and they need access controls just like their human counterparts.

That’s not always such an easy feat, however. The sheer scale and diversity of the DevOps ecosystem can make security challenging for three main reasons:

  1. Each development and test tool, configuration management platform and service orchestration solution has its own privileged credentials, which are usually separately maintained and administered using different systems, creating islands of security.
  2. Secrets (passwords, SSH keys, API keys, etc.) used to authenticate exchanges and encrypt transactions are scattered across machines and applications, making them nearly impossible to track and manage.
  3. Developers often hard-code secrets into executables, leaving the Federal government vulnerable to attacks and to exposure of confidential data by attackers with stolen secrets (see the sketch after this list).
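As a minimal sketch of the alternative to item 3 above, the snippet below reads a credential from the runtime environment instead of the source code; the variable name is illustrative, and a production setup would rely on a dedicated secrets manager (for example Vault, CyberArk Conjur, or AWS Secrets Manager) to inject and rotate the value.

```python
import os

# Anti-pattern described in item 3: a credential baked into the code, visible
# to anyone with read access to the repository or the built artifact.
# DB_PASSWORD = "hunter2"   # <-- don't do this

def get_db_password() -> str:
    """Minimal alternative: pull the secret from the runtime environment, which a
    secrets-management tool can inject and rotate without a code change.
    The variable name here is illustrative only."""
    password = os.environ.get("DB_PASSWORD")
    if password is None:
        raise RuntimeError("DB_PASSWORD was not provided by the secrets manager")
    return password

if __name__ == "__main__":
    os.environ.setdefault("DB_PASSWORD", "example-only")  # stand-in for injection
    print("credential length:", len(get_db_password()))
```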

Although security can be a major pain point when it comes to DevOps implementation, not all is lost. Government agencies have the potential to achieve both velocity and security. The answer lies in secrets management and collaboration.

Lifting the Curtain on Secrets Management

Secrets are integral to the DevOps workflow, but their proliferation across IT environments can have unintended, potentially catastrophic consequences if exploited by attackers.

A secrets management solution can help prevent that from happening. By implementing a tool that can seamlessly connect with DevOps tools and other enterprise security solutions, Federal agencies can get a better view of unmanaged, unprotected secrets across their networks, while still meeting important compliance regulations.

By prioritizing secrets management, Federal agencies can secure and manage secrets used across human and non-human identities and still achieve superior DevOps agility and velocity.

Eliminating Friction and Prioritizing Collaboration

Agencies and organizations alike often fail to make security easy for DevOps practitioners. Not only does that cause friction, it creates opportunity for failure.

Developers aren’t–nor should they be expected to be–security practitioners. They’re responsible for features and functionality–not figuring out how to manage credential collaboration and security for those key assets.

That said, it’s essential that DevOps and security teams be tightly integrated from the outset. This collaborative approach will help build a scalable security platform that is constantly improved as new iterations of tools are developed, tested, and released.

Implementing and securing DevOps processes can seem daunting, but it’s no reason to adhere to business as usual and avoid change. When it comes to DevOps, the benefits far outweigh the risk if risk is managed properly.

That’s why it’s so important that agencies prioritize secrets management and collaboration to protect every aspect of their network. Only then will they be able to achieve security and velocity.

Elizabeth Lawler is vice president of DevOps security at CyberArk. She co-founded and served as CEO of Conjur, a DevOps security company acquired by CyberArk in May 2017. Elizabeth has more than 20 years of experience working in highly regulated and sensitive data environments. Prior to founding Conjur, she was chief data officer of Generation Health and held a leadership position in research at the Department of Veterans Affairs.

It’s Not About the Machines – How IoT, AI, and Massive Automation Maximize Human Potential

There are currently more than 8 billion connected devices on the planet, and by 2031 the number of devices will grow to more than 200 billion.* While the federal government is in the early stages of adopting the Internet of Things (IoT) and Artificial Intelligence (AI), it is increasingly evident these technologies will create new opportunities to maximize human potential and modernize federal missions.

Agencies are already making demonstrable progress with emerging technology. Consider how and where IoT is in use. Federal buildings are equipped to maximize energy efficiency and employee productivity using a variety of sensors and monitors. Fleet telematics monitor the location and performance of vehicles in the field and automatically schedule service as needed. Smart devices automate agricultural data collection and track how public transportation systems handle peak travel times. And, the Department of Defense uses IoT technologies to track military supply levels, battlefield conditions, and even soldiers’ vital signs, activities, and sleep quality.

When IoT is paired with automation and AI, the potential is even more transformative – a government that can fully leverage the vast amount of data at its fingertips, in real time. Dell’s research partners at the Institute for the Future (IFTF) recently noted that we’re entering the next era of human-machine partnership. AI, IoT, and automation will empower the workforce by efficiently choosing the most important information and enabling teams to quickly make better decisions that reduce costs and improve service to the citizen.

AI will also open new doors for humans, as we will need IT professionals proficient in AI training and parameter-setting, and many new roles that we have not yet imagined.

Ray O’Farrell, general manager of Dell’s new IoT division, says, “Harnessing the full potential of IoT requires adding intelligence to every stage. Smarter data gathering at the edge, smarter compute at the core, and deeper learning at the cloud. All this intelligence will drive a new level of innovation.”

Federal agencies need this innovation as they work to overcome the limits of legacy technology and meet changing citizen and employee expectations around responsive services, access to information, and many other areas. Over the next several years, the human-machine interface will evolve to the point where technology becomes a seamless extension of its users, rather than merely a functional tool. The real potential for IoT and AI is not to replace human beings, but to free human beings to do the strategic thinking and planning that has taken a back seat to managing the nuts and bolts of technology and federal missions.

By: Cameron Chehreh, Chief Operating Officer, Chief Technology Officer & VP, Dell EMC Federal

 

*Michael Dell, Dell World 2017
