MeriTalk – Improving the Outcomes of Government IT
https://www.meritalk.com

Why Cyber Security and Cloud Computing Personnel Should Be BFFs
https://www.meritalk.com/why-cyber-security-and-cloud-computing-personnel-should-be-bffs/ | Tue, 12 Feb 2019

Keeping up with hackers is no small task, particularly since some attacks can be sustained in near-perpetuity. No doubt, securing cyberspace is a daunting responsibility. But focusing on security and using emerging technologies can help us meet the challenges. In particular, when the cyber security personnel of cloud providers and cloud customers work closely together, they become a force multiplier, proactively defending the enterprise from attacks and reacting more quickly when breaches occur.

Thanks to advancements in artificial intelligence (AI) and supercomputing, federal networks could soon be entirely self-fortifying and self-healing. The goal is to use AI-driven algorithms to autonomously detect hackers, defend the network from cyber-attacks, and automatically respond at lightning speed to an assault. Until that day, though, we rely on a handful of tools to defend our networks from phishing, vishing, or Distributed Denial of Service (DDoS) attacks.

The burden of securing a network and its devices no longer falls exclusively on systems administrators or security professionals; it requires the active participation of every single network user. Continuous user training – especially role-based training, in which individuals receive customized security courses based on their specific tasks – is a good supplementary defense. A senior or C-level executive, for example, may need to be trained to identify suspicious emails containing fund-transfer requests, while IT professionals may receive more technical training about different types of attacks and possible responses.

Role-based training isn’t enough in itself, though. We’re exploring how to incorporate emerging technologies into our overall cyber security strategy. Default encryption, in which data is encrypted in transit and at rest unless someone chooses otherwise, is now possible (if not commonplace) in the cloud. We are now calling on industry innovators to go a step further, so that data in memory can be encrypted and protected as well. In fact, by partnering our own cyber security teams with those of the cloud providers, we can significantly advance our cyber security defense. So, how do we combat all the scary bot attacks we hear about? Why not use the intelligence and data gathered from Internet of Things (IoT) sensors, plus the power of AI and the cloud, to predict, act, and protect our assets? Why not, indeed.
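
To make “default encryption” concrete, here is a minimal sketch using the AWS SDK for Python (boto3); the bucket name is hypothetical, and boto3’s HTTPS transport already covers encryption in transit:

```python
import boto3

s3 = boto3.client("s3")

# Turn on default server-side encryption for a bucket ("example-telemetry"
# is a hypothetical name). Every object written afterward is encrypted at
# rest without any action by the uploader.
s3.put_bucket_encryption(
    Bucket="example-telemetry",
    ServerSideEncryptionConfiguration={
        "Rules": [
            {"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "aws:kms"}}
        ]
    },
)
```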

Blockchain, a tamper-evident distributed ledger, could be useful for ensuring that data has not been altered after the fact. While we’re taking a wait-and-see approach to blockchain, the potential is rich and the technology is developing at a tremendous pace. Several new database announcements from the cloud vendors may provide many of the same benefits and are worth investigating.
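
The core idea behind that tamper evidence is simple to sketch: each record is hashed together with the hash of the record before it, so changing any entry invalidates everything that follows. The toy chain below (plain Python, with none of the distribution or consensus of a real ledger) illustrates just that property:

```python
import hashlib
import json

def block_hash(record: dict, prev_hash: str) -> str:
    """Hash a record together with the previous block's hash."""
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

# Build a small chain of records; each hash commits to everything before it.
chain, prev = [], "0" * 64
for record in [{"reading": 1.2}, {"reading": 1.3}, {"reading": 1.1}]:
    prev = block_hash(record, prev)
    chain.append({"record": record, "hash": prev})

# Verify by recomputing every hash. Editing any earlier record now breaks
# every hash that follows it, which is what makes tampering evident.
prev = "0" * 64
for block in chain:
    assert block_hash(block["record"], prev) == block["hash"], "tampered!"
    prev = block["hash"]
print("chain verified")
```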

There are countless budding technologies that may become integral to our cyber security infrastructure in the future – everything from biometrics to quantum computing, the latter of which could have huge implications for both encryption and decryption. But every tool we use today requires the agility and ability to move resources and experiment at a moment’s notice. Securing the network isn’t just a matter of protecting corporate secrets – for federal organizations, it’s a matter of defending national interests. The cloud provides the computing power and scalability to secure our most valuable assets. We now have to step up to the plate and build cyber security into everything we do – because when the easiest path for the end user is also the most secure one, we’re on the right path.

Tom Soderstrom is the IT Chief Technology and Innovation Officer at the Jet Propulsion Laboratory (JPL). He leads a collaborative, practical, and hands-on approach with JPL and industry to investigate and rapidly infuse emerging IT technology trends that are relevant to JPL, NASA, and enterprises.

Cheaper, Faster, Smarter: Welcome to the Age of Software Defined Everything
https://www.meritalk.com/cheaper-faster-smarter-welcome-to-the-age-of-software-defined-everything/ | Tue, 11 Dec 2018

The benefits of Software Defined Everything (SDE) – in which physical infrastructure is virtualized and delivered as services on demand – will change the way we design, build, and test new applications. In a virtual environment, operational costs drop precipitously as the pace of innovation accelerates. Containerization – a way to run separate applications virtually without dedicating specific machines to each application – allows organizations to build, fail, test, repair, improve, and deploy new apps at a breakneck pace. Application Programming Interfaces (APIs) allow us to build on and reuse other people’s software packages quickly and inexpensively.

What’s the growth path for programmers in the US? I’ve seen many numbers and predictions, and I think most are too conservative. Why? Because the nature of “programming” is expanding. JPL, NASA, Facebook, Uber, and the like all employ traditional software engineers, and we will need even more of them. But what about all those who program Alexa or Google Home, program their 3D printers or home automation systems, or compose music on their computers? They are using software to define their environment, and that’s the essence of SDE, where everything becomes programmable or configurable.

If programming becomes democratized, what will be the most popular languages? Today, Python seems to be growing the fastest. Tomorrow, I think APIs will prove the most valuable, because they can access pre-written software libraries and change their own environment in minutes. This assumes that people can get to the underlying data and combine it with other data; the cloud will come to the rescue, as most data will already be stored there. The companies that are able to take advantage of this agility – and use data from inside and outside their organizations – will quickly establish a competitive advantage.
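
In that spirit, here is a minimal Python sketch of the pattern: pull data from two REST APIs and join them in a few lines. The endpoints and JSON shapes are hypothetical stand-ins:

```python
import json
import urllib.request

def fetch_json(url: str) -> dict:
    """Pull JSON from a REST API - the 'reuse someone's work' pattern."""
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

# Hypothetical endpoints and response shapes, for illustration only. The
# point is that joining two data sources takes minutes, not months.
assets = fetch_json("https://api.example.org/assets")
weather = fetch_json("https://api.example.org/weather")

at_risk = [a["name"] for a in assets["items"]
           if weather["forecast"].get(a["site"]) == "storm"]
print(f"{len(at_risk)} assets need attention before the storm")
```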

If an environment can be accessed through software, it can be automated through artificial intelligence (AI). Combine data available in the cloud, accessible APIs, the Internet of Things, automation and AI skills, open source code, and distributed serverless computing to execute the automation inexpensively, and you speed progress toward the self-configuring, self-healing environments that will distinguish the leaders.

Networking is where SDE may be most revolutionary for us, as it promises to dramatically decrease the cost – and increase the agility – of dealing with our networks. The cloud enables these changes, in part, because capacity and bandwidth can be sliced on the fly. It’s a departure from what we see today, where network engineers set up a physical infrastructure that can be narrowly used by specific people for specific purposes.

SDE touches every aspect of an organization’s technology strategy, from security to storage. It will allow systems administrators to view and manage an entire network from a single screen. Increasingly, it also allows networks to run themselves, particularly where there is simply not enough manpower for the job. Netflix, for example, found that, in at least one region, it was making a production change every 1.1 to 1.2 seconds. The solution? The company built a self-healing network that automatically monitors production environments and makes real-time operational decisions based on problems identified through AI.
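
Netflix’s system is far more sophisticated than anything that fits in a blog post, but the monitor-decide-act loop at the heart of self-healing can be sketched in a few lines of Python; the health endpoint and service name below are hypothetical:

```python
import subprocess
import time
import urllib.request

HEALTH_URL = "http://localhost:8080/health"  # hypothetical health endpoint

def healthy() -> bool:
    """One monitoring probe: does the service answer its health check?"""
    try:
        with urllib.request.urlopen(HEALTH_URL, timeout=2) as resp:
            return resp.status == 200
    except OSError:  # connection refused, timeout, HTTP error, etc.
        return False

# The monitor-decide-act loop: detect the failure and remediate automatically
# instead of paging a human for every incident.
while True:
    if not healthy():
        print("service unhealthy - restarting")
        subprocess.run(["systemctl", "restart", "example-app"], check=False)
    time.sleep(30)
```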

The move toward SDE also means that proprietary technology – such as APIs – will become a differentiating factor for businesses. In fact, programming will become so important that we may see it taught in schools alongside reading, writing, and arithmetic. Those coding skills could be used to customize anything, from home appliances to self-driving cars.

It’s reasonable to assume that in the not-too-distant future, federal agencies will run software-defined networks that are auto-scaling, self-fortifying, and self-healing. As precious resources are freed up, government organizations can better solve real-life problems, closer to agencies’ missions, and further away from the labor-intensive effort of maintaining a physical infrastructure. And, of course, it doesn’t hurt that it will save the government millions of dollars in the process.

Tom Soderstrom is the IT Chief Technology and Innovation Officer at the Jet Propulsion Laboratory (JPL). He leads a collaborative, practical, and hands-on approach with JPL and industry to investigate and rapidly infuse emerging IT technology trends that are relevant to JPL, NASA, and enterprises.

How Can We Benefit from Changing Work Habits?
https://www.meritalk.com/how-can-we-benefit-from-changing-work-habits/ | Tue, 27 Nov 2018

It’s hard to know for certain what the U.S. economy will look like in 10 years’ time, much less what the work day will look like. One thing we do know: As younger generations enter the workforce, they will bring new habits, technology, conventions, and expectations that will likely transform business and government.

With the rise of the “gig economy” – that is, independent contractors who work on a task-by-task basis – younger generations already expect a certain amount of freedom and autonomy in their careers. And that’s probably a reasonable expectation, given that they walk around with supercomputers in their pockets, just a click or voice command away from talking with anyone – or controlling equipment – at any time.

Just as millennials eschew traditionally wired conventions such as cable TV and fixed phone lines while embracing the sharing economy – ridesharing, open work spaces – the next generation of workers may not value the same things people want today. They may not be motivated by a base salary, corner office, or lofty title.

Businesses may also want to provide customized work experiences for their employees in the not-too distant future. Instead of an email notification for an upcoming meeting, an intelligent digital assistant may remind an employee of the meeting and automatically book transportation to the meeting based on the individual’s schedule, traffic conditions, and so on. While this already exists, the scope will be greatly expanded.

Gamification strategies could play a key part in helping businesses better understand what drives their employees. When we talk about gamification, we’re not talking about turning everyday tasks into cartoonish video games, but rather about figuring out how to keep employees engaged. This includes setting up structures that allow them to compete against themselves or others (individually or in teams) in completing certain tasks or training exercises. These structures will likely use open spaces and the Internet of Things (IoT) – wearables, augmented reality, and in-the-room sensors – to collect data. Outcomes and effectiveness will be measured and communicated in real time using analytics that draw on massive amounts of data from both inside and outside the organization, all stored in the cloud. JPL has already tested this with scientists and operators navigating the Mars rovers, and it has proved to be very effective.

At JPL, we’re trying to meet the changing needs of the business by benefitting from these trends. For example, to quickly and inexpensively help solve the problem of finding parking at JPL, we held a month-long hackathon in which teams of interns created prototype mobile phone solutions. There were two winning teams, and their ideas were incorporated into a mobile solution now used every day by JPLers. As you approach JPL, your mobile phone will speak and tell you where parking is available. You can also look at historical data to decide what time to leave home, or when to go to lunch, so you can find parking when you arrive. The data is also used to predict parking during public events.
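
A simplified sketch of the historical piece: average the observed open-spot counts by hour of day to suggest when to arrive. The numbers below are invented; the real app draws on live sensor feeds:

```python
from collections import defaultdict
from statistics import mean

# Historical observations of (hour_of_day, open_spots). Real data would come
# from the lab's parking sensors; these numbers are invented.
observations = [(7, 180), (7, 150), (8, 60), (8, 40), (9, 5), (9, 12), (12, 90)]

by_hour = defaultdict(list)
for hour, spots in observations:
    by_hour[hour].append(spots)

# Average availability per hour suggests when to leave home (or go to lunch).
for hour, spots in sorted(by_hour.items()):
    print(f"{hour:02d}:00 -> about {mean(spots):.0f} open spots on average")
```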

Other examples include interacting with intelligent digital assistants for many things: finding conference rooms, hearing what’s happening on Mars, or learning when the International Space Station will be overhead.

As we approach what will likely be a rich and exciting decade, the most important thing organizations can do right now is lay the groundwork for change in two areas.

Technologically speaking, that means embracing cloud computing, IoT, and the wireless network improvement waves that are rapidly approaching. In particular, serverless computing is cost-effective, and it allows organizations to innovate and experiment with things like artificial intelligence and augmented reality. Edge computing lets businesses employ these capabilities at huge scale and speed, which will further solidify the real-time gamification and wireless communication future that the next generation will expect.

Humanologically speaking (yes, I know that’s not a word, but perhaps it should be), we can set up innovation labs and experiment in our own environments to quickly decide what to abandon and where to double down. In our experience so far, you will find willing participants in the new workforce, and experiments in a safe, protected environment will serve as training opportunities and quickly evolve to produce lasting positive outcomes. And, in case you wonder: yes, it’s fun.

Tom Soderstrom is the IT Chief Technology and Innovation Officer at the Jet Propulsion Laboratory (JPL). He leads a collaborative, practical, and hands-on approach with JPL and industry to investigate and rapidly infuse emerging IT technology trends that are relevant to JPL, NASA, and enterprises.

How NASA Is Using Ubiquitous Computing to Bring Mars to Earth
https://www.meritalk.com/how-nasa-is-using-ubiquitous-computing-to-bring-mars-to-earth/ | Tue, 30 Oct 2018

When the term “ubiquitous computing” was first coined in the late 1980s, it was envisioned as a sea of electronic devices “connected by wires, radio waves, and infrared” that are so widely used, nobody notices their presence. By that definition, the era of ubiquitous computing arrived some time ago. Everything from movie theater projectors to plumbing systems has gone digital, paving the way to meaningful, data-driven change in our daily lives.

A lot of people remain skeptical of the Internet of Things (IoT). They may not realize that they already depend upon it. While IoT often conjures images of futuristic devices – senselessly superpowered refrigerators or embarrassingly smart toilets – there are good reasons why Americans are buying thousands of programmable appliances every month. Smart devices provide an improved service to consumers, and the data we can collect from them are invaluable, with the potential to revolutionize everything from traffic management to agriculture.

It also means that the idea of a computer – as some sort of electronic box typically used at a desk – is woefully outdated. In fact, the new supercomputer is the one we have in our pockets: the smartphone, which is rapidly becoming our ever-present intelligent digital assistant. Ubiquitous computing is changing the way we interact with everyday objects. We don’t flip on light switches; we instruct a digital assistant to turn our lights on for us, or, in some cases, we just push a button on our phones. If a smoke alarm goes off, we don’t need to get out the ladder and manually turn it off – we can see the source of the smoke on our phones and silence the alarm with a mere swipe.

The Curiosity rover, the most technologically advanced rover ever built, has 17 cameras and a robotic arm with a suite of specialized lab tools and instruments. It required the work of 7,000 people, nationwide, to design and construct it over the course of five years.

At JPL and NASA, one of countless ways ubiquitous computing has woven its way into our work is through augmented reality (AR). Today, if anyone wants an up-close look at Curiosity, they need only use their phones or a pair of smart goggles. JPL built an augmented reality app that allows you to bring Curiosity anywhere – into your own home, conference room, or even backyard. The app lets you walk around the rover and examine it from any angle, as if it were actually there. In addition, scientists can don AR glasses and meet on the surface of Mars to discuss rocks or features – all from their own homes or conference rooms scattered across Earth.

AR may feel like magic to the end user, but it’s not. It’s the culmination of decades of technological advancements. It requires an assortment of sensors (light and motion) and cameras, as well as substantial data processing power – power that only became affordable and available via mobile devices in recent years. In fact, we are now seeing the initial swells of a major ubiquitous computing wave that will hit our shores within the next few years. The entire wireless networking industry is being revolutionized to meet the needs of exponentially more devices communicating with each other – and with us – all the time. That’s when we all become IoT magicians in our daily lives, and when that second brain (the smartphone) fires on all neurons. (More about that in a future blog.)

Now we can use AR for more than just finding Pokémon in the wild – we use it to review and build spacecraft. We can get a detailed look at a vehicle’s hardware without actually taking it apart, or see whether a hand might fit in a tight space to manually turn a screw. Among the many advantages of augmented reality: it’s cost-efficient for multiple people to work together over a virtual network, and it could easily be used for hands-free safety training, testing, or maintenance.

While AR is dependent on a slew of technologies, perhaps the most critical piece is the cloud. A lot of AR applications would be cost prohibitive without the supercomputing power available over the cloud. Based on our experience at JPL, we estimate that serverless computing can be up to 100 times less expensive than N-tier server-based computing. Not surprisingly, we’re now starting to use serverless computing as often as we can. What it really means is that we don’t have to worry about how a problem is solved, we just have to worry about what problems we’re solving. And that’s a powerful position to be in.
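
To give a feel for the serverless model behind that cost comparison, here is a minimal AWS Lambda-style handler in Python; the event shape and processing step are hypothetical:

```python
# A minimal AWS Lambda-style handler: no servers to provision or patch, and
# you pay only while the function runs. The event shape is hypothetical.
def handler(event, context):
    frames = event.get("image_frames", [])
    # Stand-in for per-frame processing that once needed a standing cluster.
    processed = [f"processed:{frame}" for frame in frames]
    return {"statusCode": 200, "count": len(processed)}

# Locally you could exercise it like this; in the cloud, an upload or API
# call would trigger it instead.
print(handler({"image_frames": ["mars_001.png", "mars_002.png"]}, None))
```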

Tom Soderstrom is the IT Chief Technology and Innovation Officer at the Jet Propulsion Laboratory (JPL). He leads a collaborative, practical, and hands-on approach with JPL and industry to investigate and rapidly infuse emerging IT technology trends that are relevant to JPL, NASA, and enterprises.

The Rise of Intelligent Digital Assistants
https://www.meritalk.com/the-rise-of-intelligent-digital-assistants/ | Tue, 09 Oct 2018

Before we can have a rational discussion about artificial intelligence (AI), we should probably cut through the hysteria surrounding it. Contrary to popular opinion, we’re not talking about machines that are going to rise up, steal our jobs, and render humans obsolete.

We already rely on AI in countless daily tasks. The AI embedded in your email, for example, identifies and blocks spam; it reminds you to include an attachment if you’ve forgotten it; it learns which sorts of emails are most important to you and it pesters you to respond to an email that’s been neglected in your inbox for a few days. Chatbots, robot vacuums, movie recommendation engines, and self-driving cars already have AI built into them. Right now, its impact may be subtle. But, in the future, AI could save lives by, for example, detecting cancer much earlier so treatments can be that much more effective.

Why is AI exploding right now? A unique mix of conditions makes this the right time to surf the AI technology wave: the availability of supercomputing power, venture capital, oceans of data, open source software, and programming skills. In this sort of environment, you can train any number of algorithms and prep and deploy them on powerful computers in a very short time. The implications for most major industries – everything from medicine to transportation – are tremendous.

At NASA JPL, we’ve integrated AI into our mission in a variety of ways. We use it to help find interesting features on Mars, predict maintenance needs for our antennas, and anticipate problems with the spacecraft. We’ve also come to depend upon intelligent digital assistants in our day-to-day operations. After a brainstorming session about how to experiment with applied intelligent assistance, we decided to throw AI at a mundane – but time-consuming – daily challenge: finding an available conference room. Later that night, we built a chatbot.

Chatbots are fast and easy to build, and they deliver value almost immediately. We build ours to be easy to access, with natural user interfaces, then add the deeper (and more complex) AI on the back end, so the chatbots get smarter and smarter over time. You can text them, type to them, or speak to them through Amazon Alexa or Lex, and we collect user feedback to constantly improve them. Every “thumbs up” or “thumbs down” helps improve the next iteration, and we can mine the thumbs-downs to see which areas need the most work. We now have upwards of 10 emerging chatbots used for functions like acquisitions, AV controls, and even human resources.

By thinking of this as a “system of intelligence,” we can extend the life of – and get more value from – legacy systems by teaching them how to respond to deeper questions. And while applied artificial intelligence can conquer any number of menial tasks, it’s bound to have a significant effect on some of our bigger challenges, too, in areas such as space exploration, new 3D-printed materials, medicine, manufacturing, security, and education.
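
A toy version of that feedback loop fits in a few lines of Python. The intents and canned answers below are invented for illustration; production bots sit behind services such as Amazon Alexa or Lex:

```python
# A toy intent-matching chatbot with the thumbs-up/down feedback loop the
# post describes. Intents and answers are invented.
INTENTS = {
    "conference room": "Room 301 is free for the next hour.",
    "parking": "Lot C has open spots right now.",
}

feedback_log = []  # mined later to see which answers need the most work

def reply(utterance: str) -> str:
    """Return the first canned answer whose keyword appears in the question."""
    for keyword, answer in INTENTS.items():
        if keyword in utterance.lower():
            return answer
    return "I don't know that one yet."

def record_feedback(utterance: str, thumbs_up: bool) -> None:
    feedback_log.append({"utterance": utterance, "thumbs_up": thumbs_up})

question = "Can you find me a conference room?"
print(reply(question))
record_feedback(question, thumbs_up=True)
```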

AI has especially rich potential in the federal government, where one small operational improvement can have an exponential effect. If you work in a large agency and you’re unsure of how to approach AI, you can start playing in the cloud with any number of tools and services (such as TensorFlow and SageMaker). Chatbots are a natural starting point – they deliver value right away, and the longer you use them, the smarter and more effective they become. In the cloud, experimentation is inexpensive and relatively effortless. The goal, after all, is to have AI work for you, not the other way around.

Tom Soderstrom is the IT Chief Technology and Innovation Officer at the Jet Propulsion Laboratory (JPL). He leads a collaborative, practical, and hands-on approach with JPL and industry to investigate and rapidly infuse emerging IT technology trends that are relevant to JPL, NASA, and enterprises.

The Next Tech Tsunami Is Here: How to Ride the Wave
https://www.meritalk.com/the-next-tech-tsunami-is-here-how-to-ride-the-wave/ | Tue, 25 Sep 2018

Our lives are very different today than they were even 15 years ago. We walk around with computers in our pockets and do things like share our locations with loved ones via satellite technology. Still, the changes we’ve experienced so far are nothing compared to the tech tidal wave that’s approaching.

A convergence of computing advancements ranging from supercomputers to machine learning will transform our daily lives. The way we learn, communicate, work, play, and govern in the future will be very different.

The question most IT decision makers are asking themselves right now is what they should do to prepare for the technology waves that are barreling at them. The short answer: Jump in. The only way to stay in the game is to play in the game. A certain amount of failure is inevitable, which is why it’s critical to experiment now and adapt quickly. Given the rate at which technology is evolving, anyone who takes a wait-and-see approach in hopes of finding a perfect, neatly wrapped, risk-free solution will be left out entirely. Or, alternatively, they will be left with a hefty bill for the dubious pleasure of playing catch-up.

The good news: Anyone with a cloud strategy is part-way there. The future of technology is the cloud. Every significant emerging advancement is predicated on it. Without it, the tech revolution would cost too much or take too long. The only practical and affordable way to store, collect, and analyze the massive volumes of data we’ll be bombarded with – data generated by everything from smart watches to self-driving cars – will be through a cloud-based computing system. The tools we will use to make sense of our modern lives will be deployed in the cloud, because it’s the simplest, fastest, and most affordable way to use them.

In this series, we’re going to highlight the six technology waves that we expect will make the biggest impacts in the coming years. They are: Accelerated Computing, Applied Artificial Intelligence, Cybersecurity, Software-Defined Everything, Ubiquitous Computing, and New Habits. Tech waves are plentiful, but not every wave is meant to be surfed by every organization. This series aims to help you distinguish which ones are right for you.

Tom Soderstrom is the IT Chief Technology and Innovation Officer at the Jet Propulsion Laboratory (JPL). He leads a collaborative, practical, and hands-on approach with JPL and industry to investigate and rapidly infuse emerging IT technology trends that are relevant to JPL, NASA, and enterprises.

CDM is Evolving. What’s Next?
https://www.meritalk.com/cdm-is-evolving-whats-next/ | Fri, 31 Aug 2018

OMB’s Risk Report raised the cyber flag again – 71 of 96 Federal agencies are missing “fundamental cybersecurity policies” or have “significant gaps” in their cybersecurity programs. There is no silver bullet, but the Continuous Diagnostics and Mitigation (CDM) program provides an opportunity to continually improve cybersecurity risk postures across the federal government.

During CDM’s first phases, agencies identified the assets and users on the network. Coming up next, the third and fourth phases will focus on proactively identifying cybersecurity risks. The goal is to close the gaps and ultimately enable agencies to prevent attacks.

As agencies work to secure on-premises, cloud, and hybrid environments, CDM is evolving – drawing on lessons learned around procurement, risk scoring, and the importance of visibility into cybersecurity maturity.

DEFEND and AWARE Move Toward Perfecting the Process

DEFEND – Dynamic Evolving Federal Enterprise Network Defense – aims to improve acquisition practices across all CDM phases. DEFEND will expand task orders, increase contract ceilings, and require integrators to consider data quality from the start of each contract to minimize inconsistencies among vendors. Importantly, the new acquisition strategy uses a cost-plus method, encouraging vendors to achieve all requirements.

DHS is also developing the Agency-Wide Adaptive Risk Enumeration (AWARE) scoring algorithm. AWARE will help agencies assess key cyber risk issues, including patching and network configuration, to determine the most critical vulnerabilities. The goal is to enable agencies to address the highest-priority threats first, track progress, and manage mitigation. In the future, AWARE will serve as a consistent, objective risk measurement tool for monitoring and comparing cyber exposure, giving leaders strategic insight across agencies and departments.
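
DHS has not published AWARE’s exact formula, so the Python sketch below is purely illustrative of how such a score behaves: weight each open finding by severity and by how long it has gone unremediated, then rank systems by the total:

```python
# Illustrative only: not the actual AWARE algorithm. Weight each open
# finding by severity and by how long it has gone unremediated.
WEIGHTS = {"critical": 10, "high": 5, "medium": 2, "low": 1}

def aware_like_score(findings) -> float:
    """findings: list of (severity, days_overdue) tuples for one system."""
    return sum(WEIGHTS[sev] * (1 + days / 30) for sev, days in findings)

systems = {
    "mail-gateway": [("critical", 45), ("medium", 10)],
    "web-01": [("high", 5), ("low", 90)],
}
# Rank systems so the worst exposure gets patched first.
for name, findings in sorted(systems.items(),
                             key=lambda kv: aware_like_score(kv[1]),
                             reverse=True):
    print(f"{name}: {aware_like_score(findings):.1f}")
```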

Raising the Grades

The CDM program is designed to shine a light on cyber best practices and help agencies make better decisions by identifying reliable solutions, taking the guesswork out of acquisitions. Leveraging these tools, agencies can meet FISMA cyber requirements – key to improving grades on the newest category of the FITARA scorecard. On the most recent scorecard, 18 of 24 agencies received a D or F in this category – so there is more work ahead.

One Part of a Larger Conversation

Highlighting the importance of these efforts, over the summer Rep. John Ratcliffe (R-Texas) introduced new CDM legislation “to advance and modernize” the program. If passed, the bill would ensure agency tools remain current as technology advances.

At Dell EMC, we believe CDM efforts must be a top priority for every agency. We view CDM as a permanent capability, not just a program that ends when the phases are complete. CDM is not a compliance checklist, but part of the larger cybersecurity conversation that includes technology, strategies, and best practices. Taking advantage of CDM is a vital step toward shifting the stance from reactive to proactive – preventing cyber breaches and data loss.

As a leader across the full cybersecurity ecosystem, Dell EMC supports federal customers within and beyond the CDM phases. We deliver comprehensive data protection solutions, and we are focused on all aspects of the cyber landscape, including the full supply chain and the importance of building NIST- and ISO-compliant hardware.

We know the key word is “continuous.” And continuous cyber success drives continuous mission success. As we move forward together, there is a real opportunity to strengthen our cyber defenses and progress within, and beyond, the CDM program.

Read more about our data protection solutions.

By: Jean Edwards, Managing Director, Business Development, Civilian Agencies, Federal Strategic Programs, Dell EMC

A Golden Age of Government Innovation, Courtesy of The Cloud
https://www.meritalk.com/a-golden-age-of-government-innovation-courtesy-of-the-cloud/ | Wed, 18 Jul 2018

Federal agencies are not often known as cradles of innovation, but the adoption of the cloud has helped usher in a new era of government – one that improves transparency, cost efficiency, and public engagement.

Technology changes constantly, but many bureaucracies do not. In some cases, government organizations that have adopted the cloud still mainly use it as a glorified filing cabinet. Others, however, are finding new and innovative uses for the virtual infrastructure that have made meaningful changes in people’s lives.

Case in point: In the Virginia Beach metropolitan area, the average elevation hovers around 12 feet above sea level. The region, home to the world’s largest naval base, Naval Station Norfolk, is expected to see sea levels rise another foot by 2050, according to one report, and the estimated cost of flooding has already surpassed $4 billion. Given that the sea level has risen an astonishing 14 inches since 1950, it’s no surprise that flooding is a persistent problem.

These days, it doesn’t take much rain to cause nuisance flooding. While researchers and engineers are exploring long-term solutions, the City of Virginia Beach helped roll out StormSense, a cloud-based platform that uses a network of sensors to help forecast floods up to 36 hours in advance of major weather events.

The effort was no minor undertaking. A related public initiative, dubbed “Catch the King,” encouraged “citizen scientists” to use their GPS-enabled devices to help map the King Tide’s maximum inundation extents. The data allows researchers to create more accurate forecast models for future floods, determine new emergency routes, reduce property damage, and identify future flood-mitigation projects.

Volunteers–all 500 of them–were directed to different flooding zones along the coast during the King Tide on Nov. 5, 2017. (A “King Tide” is a particularly high tide that tends to occur twice a year, based on the gravitational pull between the sun and moon.) As the King Tide came in, volunteers walked along designated points in 12 Virginia cities, saving data in a mobile phone app every few steps. Ultimately, there were more than 53,000 time-stamped GPS flooding extent measurements taken and 1,126 geotagged photographs of the King Tide flooding.

Not only was the initiative a scientific success, it was incredibly cost effective. Without the help of the hundreds of volunteers, it might have been cost prohibitive to tackle.

“It’s really, really amazing to see all these people out there mapping all over the region,” Skip Stiles, executive director of Wetlands Watch, a Norfolk, Virginia-based nonprofit, told The Virginian-Pilot. “It would take numerous tide gauges costing thousands of dollars each to gather the data collected for free by so many volunteers.”

The initiative, which was backed by seven cities, a county, and a handful of academic partners, began when Virginia Beach’s Public Works department decided to fund a USGS installation of 10 new water-level sensors and weather stations to measure water level, air pressure, wind speed and rainfall. Later, as part of the StormSense initiative, an additional 24 water-level sensors were installed, bringing the total to 40, up from the original 6 sensors that were in place in 2016.

The benefits of the data collection have not been entirely realized yet, but the region is already better prepared for future floods because StormSense operates on a cloud platform, which allows emergency responders to access flooding data from any mobile device. The City of Virginia Beach has also been working to develop an Amazon Alexa-based Application Programming Interface (API) so residents can ask an Amazon Alexa-enabled device for timely and accurate flooding forecasts or emergency routes.
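
The Alexa integration’s details haven’t been published, but the general shape of such a skill is a small function that answers a voice intent with a spoken response. The sketch below follows the Alexa Skills Kit JSON envelope; the intent name and forecast lookup are hypothetical, not the city’s actual implementation:

```python
# Sketch of an Alexa skill handler for flood queries. The response follows
# the Alexa Skills Kit JSON envelope; intent name and lookup are invented.
def lookup_forecast() -> str:
    # Would query StormSense sensor data; canned answer for illustration.
    return "Minor street flooding is expected near the oceanfront at high tide."

def lambda_handler(event, context):
    intent = event.get("request", {}).get("intent", {}).get("name", "")
    if intent == "FloodForecastIntent":
        text = lookup_forecast()
    else:
        text = "You can ask me for the flood forecast."
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": text},
            "shouldEndSession": True,
        },
    }
```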

Not long ago, the notion that the government could use the internet to communicate directly with citizens, in real-time, with pertinent, useful or critical information, might have seemed like a pipe dream. These days, thanks to innovative new initiatives, it’s a reality.

HCI – The Key to Unlocking Data Center Modernization?
https://www.meritalk.com/hci-the-key-to-unlocking-data-center-modernization/ | Mon, 09 Jul 2018

The most recent FITARA scorecard – Scorecard 6.0 – is in, and the results aren’t pretty. Since the last scorecard in November 2017, 11 agencies’ grades have declined while only six showed improvement. The Department of Defense received its third straight “F,” while eight other agencies were perilously close to failing, with “D” grades.

But it’s not all bad news, as nine of the 24 rated agencies received either an “A” or a “B” on the Data Center Optimization Initiative (DCOI) portion of the scorecard, which examines agencies’ progress against savings goals, as well as performance in five areas: energy metering, power usage effectiveness, virtualization, server utilization/automation, and facility utilization. There were five “F’s”, but that’s a big step forward from the first scorecard, released in November 2015, when 15 agencies received failing grades.

How can agencies continue to progress? It’s a complex challenge, as they must find a way to bridge the gap between today’s aging, multi-component infrastructures and tomorrow’s modern future. Hyper-converged infrastructure (HCI) might provide a key.

Hyper-Speed to Hyper-Converged

HCI integrates compute, storage, networking, and virtualization technologies – helping agencies move away from siloed systems toward more consolidated, modern, and secure data center infrastructures. Not only does this approach allow agencies to achieve a smaller data center footprint, it also consolidates IT management to a single pane of glass, eliminating the need to manage multiple components within the data center. This is a key benefit in a market with an aging workforce and stiff competition with the private sector for the best IT talent.

HCI also provides true flexibility and scalability – agencies can expand only when needed, rather than making heavy up-front investments in hardware that sits unused at the beginning of a project – enabling them to move from a reactive to a proactive stance by creating an environment with more repeatable and predictable outcomes.

Additionally, HCI adoption could accelerate other modernization efforts – such as automation, advanced analytics, and more – making it possible for federal agencies to truly transform their data centers, resulting in improved security, greater efficiency, and more opportunity for innovation.

For the first time, FITARA Scorecard 6.0 added a new category to evaluate agencies’ progress with the Modernizing Government Technology (MGT) Act, which was signed into law in December. MGT’s centralized revolving capital fund ($100 million allocated this fiscal year) allows agencies to compete for funding for their transformation initiatives. Even more significant, MGT enables agencies to establish their own working capital funds so they can reinvest savings generated in one year across three-year IT modernization initiatives. Previously, money saved in one year was money lost in the next year’s budget.

Agencies that have an MGT-specific working capital fund, with the CIO in charge of decision-making, would receive an “A.” No one is there yet, but three agencies earned a “B” for demonstrating that their efforts to implement a working capital fund in 2018 or 2019 are sincere and in progress. As agencies set up these funds, they would be wise to funnel some of the money into HCI initiatives, then return the savings those programs generate to the fund for future transformation initiatives. Our recent research with IDC has shown that organizations can achieve a 6x ROI over five years with HCI solutions.

Full Speed Ahead – A Team Effort

As agencies begin their journey toward HCI adoption, they may face some challenges or concerns around the network requirements. Some federal environments may have very strict, outdated networking guidelines, which can be too restrictive for quick, easy adoption.

To overcome this challenge, agencies must engage the networking teams from the very beginning of the process, and work together to identify any networking roadblocks before implementation. By identifying and addressing these issues at the onset, agencies lay the groundwork for successful adoption.

Early successful HCI implementations have centered on virtual desktop deployments, but agencies are just beginning to scratch the surface of the opportunity as they start to move mission-critical systems to HCI environments. Overall, HCI will vastly improve government IT efficiency. Time that was previously spent configuring individual compute, storage, and network elements – and making sure all the pieces worked together – can now be spent on strategic programs that directly drive innovation in the data center and beyond.

By: Jeremy Merrill, Global VxRail & VxRack SDDC Product Technologist, Dell EMC | Server and Infrastructure Systems

Modernizing Service Delivery with Multi-Cloud
https://www.meritalk.com/modernizing-service-delivery-with-multi-cloud/ | Mon, 25 Jun 2018

FITARA, MGT, the latest IT Executive Order: all of these mandates underscore the same theme – our current approach to government IT isn’t working. Agencies are spending 80% of their IT budgets maintaining legacy systems. As data volumes skyrocket, cyber security threats proliferate, and employees and constituents demand near real-time access to information, this model is simply not sustainable.

As agency CIOs work to transition to a modern infrastructure that is more secure and efficient while reducing duplicative, costly IT systems, all roads seem to lead to the cloud. But, the journey isn’t always easy. Obviously, not all workloads are suitable for public cloud consumption. And, even for those that are, challenges abound in classifying, protecting, and migrating data. Most agencies are deploying multi-cloud environments – some combination of public, private, and hybrid clouds – to address their unique needs while still enabling them to benefit from improved information sharing, innovation, and “anytime, anywhere” access.

As agencies make the transition to a multi-cloud environment, here are three things they need to consider:

1) Is Our Infrastructure Modern and Ready?

As agencies transition to a multi-cloud approach, they must ensure the technology and underlying infrastructure in their current data centers is modern and ready for transformation. Agencies cannot layer cloud technology on top of an existing infrastructure that is not designed to work well in a cloud environment.

To get there, the underlying data center architecture must be cloud-native, scalable, resilient, and trusted, so agencies can deliver services to their users that are relevant and secure. In addition, it needs to seamlessly interoperate with, and perform like, public cloud offerings from Amazon, Microsoft, Google, and others, as well as support new cloud-aware applications and workloads.

Finally, agencies need to keep inevitable scale in mind. Infrastructure should be modular – so it can start small and scale at web speed. It should be built to support both legacy and newly developed applications, while fully integrating with the agency’s software-defined cloud management strategy.

2) How Can We Evaluate and Transform our Applications?

Agencies adopting a multi-cloud approach must match their mission critical workloads to the most optimal environment based on mission characteristics. Accomplishing this requires deep analysis and rationalization of current applications and the optimization or re-platforming of applications for their desired target environment.

By assessing the legacy environment, agencies can quickly determine which applications can immediately move to their ideal cloud environment so they can shift their focus to the larger roadmap of application migration efforts that will require new application development or a complete re-write. This rationalization process aligns with agencies’ data center consolidation efforts. They must right-size their on-premises cloud infrastructure as they plan their public or off-premises cloud strategy.
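
As a feel for what that rationalization pass produces, here is a rule-of-thumb sketch in Python. Real assessments weigh many more factors; the fields, thresholds, and portfolio below are invented:

```python
# A rule-of-thumb sketch of application rationalization; illustrative only.
def disposition(app: dict) -> str:
    if app["users"] == 0:
        return "retire"
    if app["cloud_ready"]:
        return "rehost now"            # lift-and-shift candidates move first
    if app["business_value"] >= 8:
        return "re-platform / rewrite" # worth the modernization investment
    return "keep on-premises for now"

portfolio = [
    {"name": "payroll", "users": 900, "cloud_ready": True, "business_value": 9},
    {"name": "legacy-gis", "users": 40, "cloud_ready": False, "business_value": 8},
    {"name": "old-wiki", "users": 0, "cloud_ready": False, "business_value": 1},
]
for app in portfolio:
    print(f"{app['name']}: {disposition(app)}")
```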

The second part of a successful application transformation requires the adoption of agile DevOps methodologies and platforms. Agencies should seek integrated, turnkey solutions that incorporate developer tools with engineered solutions built for multi-cloud, so they can develop and deploy agile, modern applications built to run in a multi-cloud environment, moving to an agile model of continuous code delivery.

3) Are Our Processes and People Up to the Task?

Finally, the most critical aspect agencies need to address is transforming and automating their current IT processes and service delivery.

As agencies move away from siloed IT services and toward a software-defined infrastructure that is more agile, they need more integration between teams. This requires identifying new roles and swim lanes, as well as developing new skill sets. Many multi-cloud service delivery efforts falter because of improper planning and implementation of new processes end-to-end – from the end-user to the IT stakeholders to the Cloud Management Platform.

Agencies can’t put new technologies and services in place to protect an old way of doing business; they must develop a new operating model that allows them to manage multiple infrastructure offerings as one single set of services. Success depends on an organization’s ability to plan with this future organizational state as the end goal, and to partner with IT delivery partners who can help them get there.

Multi-Cloud, Multiple Benefits

Today’s federal agencies, when evaluating cloud solutions, recognize that capabilities vary widely across cloud service providers. As a result, many agencies are adopting a multi-cloud approach to gain the flexibility to match workloads and applications to targeted cloud environments that will optimize performance and capacity and drive down costs. This is the greatest benefit of multi-cloud. By modernizing infrastructure, adopting sound application transformation strategies, and developing new ITSM processes, agencies gain greater visibility into what’s running where and what they need to effectively support their programs within required timeframes.

Agencies that adopt these multi-cloud models – effectively modernizing infrastructure, transforming the people and processes that support multi-cloud IT, and rationalizing applications to retire, rebuild, replace, or migrate them – become more relevant to the missions they support. What used to take months can now be delivered in days, hours, or even minutes, which truly delivers on the promise of transforming government.

By: Diane Santiago, Director, Federal Presales Engineering, Converged Platforms & Solutions Division, Dell EMC
