Why Agencies Should Make Zero Trust Their Mission
https://www.meritalk.com/why-agencies-should-make-zero-trust-their-mission/ – Wed, 17 Jul 2019
By: Lisa Lorenzin, Director of Emerging Technology Solutions for the Americas, Zscaler

Federal CIOs will be working harder than ever to deploy cloud applications and infrastructure over the next year as they push to meet 2020 Data Center Optimization Initiative (DCOI) deadlines, continue rolling out shared services, and adapt to evolving mission requirements.

The cloud push brings new opportunities for flexibility and efficiency. But alongside this progress, federal cyber leaders need new cyber defenses to protect increasingly complex environments that now span multiple cloud providers in addition to existing data centers.

It’s not news that security concerns have stymied cloud progress. Agencies are also saddled with technical debt that makes innovation difficult and slows cloud adoption. As a result, in 2019, 80 percent of the federal IT budget is spent supporting legacy systems rather than driving innovation.

To accelerate cloud adoption, overcome technical debt, and support 21st-century missions and citizen services, agencies need flexible security solutions that provide a consistent user experience across both cloud and data center environments. Increasingly, federal agencies are considering a zero trust approach to help address these requirements.

Based on the idea that an organization should not inherently trust any user or network, zero trust helps agencies balance security and productivity. Under this model, any attempt to access a system or application is verified before the user is granted any level of access. Authorized users receive secure, fast access to private applications in the data center or cloud, regardless of whether the user is on-site or remote, an agency worker, or a third party.
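To make the verify-everything idea concrete, here is a minimal sketch of a zero trust access decision in Python. The policy table, roles, and helper functions are invented for illustration; they are not drawn from any particular product.

```python
from dataclasses import dataclass

@dataclass
class AccessRequest:
    user_id: str
    application: str
    mfa_verified: bool        # multi-factor authentication completed
    device_compliant: bool    # endpoint posture check passed

# Hypothetical per-application policy: which roles may reach which apps.
APP_POLICY = {
    "hr-portal": {"employee", "hr-staff"},
    "mission-data": {"analyst", "mission-ops"},
}

def user_roles(user_id):
    """Stand-in for a lookup against the agency's identity provider."""
    directory = {"alice": {"analyst"}, "bob": {"employee"}}
    return directory.get(user_id, set())

def authorize(req):
    """Every request is verified; nothing is trusted by default."""
    if not (req.mfa_verified and req.device_compliant):
        return False                                  # identity and device posture first
    allowed = APP_POLICY.get(req.application, set())  # then least-privilege app policy
    return bool(user_roles(req.user_id) & allowed)

# Network location never grants access on its own.
print(authorize(AccessRequest("alice", "mission-data", True, True)))   # True
print(authorize(AccessRequest("bob", "hr-portal", True, False)))       # False: device fails posture
```

The point of the sketch is the order of operations: identity and device checks happen on every request, and only then does a narrow, per-application policy decide what the user can reach.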

Zero trust is well suited to federal agencies, given the need to protect data on a massive scale in an increasingly hybrid environment. The list of devices connected to an agency’s network continues to grow, and agencies increasingly manage assets beyond their traditional network perimeter – effectively creating a larger attack surface. Considering the variety and sensitivity of government data, and the criticality of federal missions, agencies clearly need a correspondingly strong level of protection.

Connect the Right User to the Right Application

Zero trust prevents unauthorized users from accessing data and systems – but that’s only the beginning. The real goal is to connect the right users to what they need to complete their mission as quickly and seamlessly as possible. Agencies that implement zero trust solutions can realize four primary benefits: security, user experience, cost, and simplicity.

From a security standpoint, agencies need a solution that provides granular, context-based access to sensitive resources. With a zero trust solution, security can follow both the application and the user consistently across the organization.

Although applications may be hosted in multiple environments and users connect from diverse locations, the user experience can remain consistent and transparent. Users should not have to manage added complexity when they are off-network versus on-network, or when an application is hosted in the cloud versus a physical data center.

From a cost perspective, agencies need a solution that enables them to invest at an initial level to solve an initial use case, and then expand organically as the number of use cases grows. Unlike many traditional security models that rely on network-based controls, zero trust should not require a fixed investment – making it ideal for scalable, flexible cloud environments.

Finally, agencies need simplicity. Implementing a zero trust solution should make it easy for users and administrators to consistently access the information they need. Who is using which applications and how often? What is the user experience when accessing a specific application or when accessing from a particular location?

TIC 3.0 Changes the Game

The traditional security process for remote access in federal environments, as we know, is not optimal.  The agency establishes a security perimeter and deploys a virtual private network (VPN) to connect endpoints to the network when the user is outside that perimeter. Then the user connects to the agency data center through a stack of various infrastructure devices (DMZ firewalls, load balancers, etc.) supporting the VPN appliance. If users are accessing private applications hosted on public cloud providers, their traffic is routed back out through a Trusted Internet Connection (TIC), traversing another stack of security appliances before it finally arrives at its destination.

Federal CIO Suzette Kent released the updated TIC 3.0 policy in draft form this past year. These new guidelines are more flexible than previous TIC requirements – they open the door for agencies to use modern security solutions and models like zero trust to protect data and applications in cloud environments. This is a game changer. A FedRAMP-certified zero trust solution can provide modern security, usability, and flexibility – and meet the new TIC 3.0 guidelines.

Where from Here?

TIC 3.0 is expected to accelerate cloud adoption as it enables agencies to take advantage of modern security models like zero trust. There are several steps that can help ease the learning curve for federal teams.

First, consider your current infrastructure. Many agencies have elements of zero trust in place, such as endpoint management, Continuous Diagnostics and Mitigation (CDM), application and data categorization, micro-segmentation, and cloud monitoring.

Next, consider the application landscape. Zero trust is inherently complex to implement, but zero trust network access (ZTNA) solutions like Zscaler Private Access™ (ZPA™), a FedRAMP-authorized cloud-based service, can provide a scalable zero trust environment without placing a significant burden on the IT team. ZPA connects users to applications without placing them on the network or relying on an inbound listener, instead leveraging a global cloud platform to broker inside-out connections that carry authorized user traffic over TLS-encrypted micro-tunnels. These tunnels provide seamless connectivity to any application regardless of where it is running, creating a secure segment of one and ensuring apps remain invisible to the internet. The approach reduces the attack surface and eliminates the risk of lateral movement within the security perimeter.
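The inside-out pattern is easier to picture with a sketch. The snippet below is purely conceptual – it is not ZPA’s implementation, and the broker hostname and registration message are made up – but it shows the key idea: the application-side connector dials outbound over TLS to a broker, so the application never listens for inbound connections from the internet.

```python
import json
import socket
import ssl

BROKER_HOST = "broker.example.gov"   # hypothetical broker address
BROKER_PORT = 443

def register_app_connector(app_name):
    """Dial outbound to the broker over TLS and announce which private
    application this connector can reach. There is no inbound listener,
    so the application itself is never exposed to the internet."""
    context = ssl.create_default_context()               # verifies the broker certificate
    raw = socket.create_connection((BROKER_HOST, BROKER_PORT))
    tls = context.wrap_socket(raw, server_hostname=BROKER_HOST)
    tls.sendall(json.dumps({"type": "register", "app": app_name}).encode() + b"\n")
    return tls   # the broker later stitches authorized user sessions onto this tunnel

# Usage (requires a reachable broker):
# tunnel = register_app_connector("hr-portal")
```

Because the only connection is outbound, there is no open port for an attacker to scan, which is what keeps the application “invisible” in this model.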

Finally, take advantage of federal community resources. ACT-IAC just published a Zero Trust White Paper developed by a government/industry task force. The document shares key concepts around zero trust, recommended steps, and specific lessons learned from working within federal environments. ACT-IAC also hosted a panel discussion among industry and agency technologists that explored these concepts at its recent IT Modernization Forum.

As the National Transportation Safety Board recently demonstrated, leveraging a zero trust approach means agency teams gain the ability to access and share mission-critical information quickly – anywhere, anytime, from any device. As agencies build cloud confidence, they can finally start to shift spending. That means less legacy, more innovation, and ultimately secure, modern government services built to deliver an experience agency teams will appreciate.

Unlocking the Security Benefits of Multicloud and Network Modernization
https://www.meritalk.com/unlocking-the-security-benefits-of-multicloud-and-network-modernization/ – Tue, 02 Jul 2019
By: Greg Fletcher, Business Development Director, Federal Civilian Agencies, Juniper Networks

The government’s modernization effort has evolved over time with the help of policy developments, increased funding, and a cultural shift toward embracing technology. Federal leaders dedicated years to planning this digital transformation, and now agencies are beginning to leverage innovative technologies to reach their diverse mission goals.

Cloud adoption continues to play a critical role in this modernization effort for agencies ranging from the U.S. Department of Homeland Security to the U.S. Department of Defense. When looking to move their critical data, many agencies are turning to a hybrid multicloud environment, which enables data sets to live both on-premises and in the cloud. Accomplishing a successful cloud adoption is no small feat – in fact, many agencies were first tasked with retrofitting the path by which this data moves from one environment to another: the network. There are many security benefits to modernizing federal networks and adopting a hybrid multicloud environment, but three key outcomes stand out:

Greater Visibility

With the enactment of the Modernizing Government Technology Act and the Cloud Smart Strategy, the federal government’s migration to the cloud is imminent. And yet, many agencies are still concerned that their data could be compromised when migrating sensitive information to public cloud environments. Legacy networks lack the sophistication that federal agencies need to monitor for suspicious activity and uncover nefarious threats. After all, federal agencies can’t mitigate security threats if they don’t know they exist.

Using a common operating system and a single, open orchestration platform, multicloud solutions help agencies manage the complexity of operating in different environments and can provide a methods-driven approach that lets them map their own path. Agencies can also operate with consistent policy and control across all places in the network, with support to launch workloads on any cloud and on any server across a multivendor environment. By adopting unified and integrated networks across public cloud and on-premises IT infrastructure, federal agencies can achieve greater visibility and, therefore, seamlessly determine whether there are holes in their security posture or unauthorized devices accessing the network.

Faster Response Times

It takes as little as a few seconds for a cyberattack to occur – but the aftermath can cost millions and take years to overcome. Federal agencies hold the keys to citizens’ most critical data, whether it’s Social Security information or health insurance records. For this reason, it’s imperative that this data remains secure and that agencies can mitigate potential threats quickly.

However, for agencies that haven’t modernized, agility can be a pain point, simply because older networks are prone to latency and congestion when too many devices and bandwidth-intensive applications run at the same time. As federal agencies begin to migrate some of their data to the public cloud, they can facilitate the migration and optimize the future-state multicloud environment by deploying software-defined wide-area networks (SD-WAN) and advanced threat protection (ATP) software that not only move bandwidth-intensive data between public cloud environments and on-premises IT infrastructure quickly and securely, but also respond to suspicious activity immediately.

A Foundation for Emerging Tech Adoption

Technology plays a central role in the administration’s mission to achieve a 21st century government. Most recently, the Alliance for Digital Innovation released a report that found the federal government could have reduced IT spending by $345 billion over the last 25 years had it invested in more commercially available solutions instead of architecting systems itself.

The high cost of custom-built and proprietary IT leaves federal agencies with limited resources to ensure the security of their technology platforms, networks, and applications. By modernizing their networks with state-of-the-art, commercially available products, government agencies can reduce operations and maintenance costs. In addition, a modern best-of-breed network can support secure, cloud-based applications and other forms of cutting-edge technology – such as drones, artificial intelligence, augmented reality, and virtual reality – all of which can help government meet its modern-day mission goals.

The administration released the “Executive Order on Strengthening Cybersecurity of Federal Networks and Critical Infrastructure” two years ago, and there is still much work to be done when it comes to federal agencies modernizing their networks. By overhauling their networks, agencies address a key element of a successful migration to a hybrid multicloud environment and can realize the agility, security, and cost benefits it offers.

 

TIC 3.0 Will Remove a Significant Cloud Barrier
https://www.meritalk.com/tic-3-0-will-remove-a-significant-cloud-barrier/ – Fri, 28 Jun 2019
By: Stephen Kovac, Vice President of Global Government and Compliance at Zscaler

The Office of Management and Budget, in coordination with the Department of Homeland Security, recently proposed an update to the Trusted Internet Connections (TIC) policy: TIC 3.0. Still in draft form, TIC 3.0 proposes increased cloud security flexibility for federal agencies and the opportunity to use modern security capabilities to meet the spirit and intent of the original TIC policy.

During MeriTalk’s Cloud Computing Brainstorm Conference, I had the opportunity to present a session with Sean Connelly, Senior Cybersecurity Architect at CISA, DHS – or, as I like to call him, “Mr. TIC.” We discussed how the revised TIC 3.0 policy will remove cloud barriers and accelerate federal cloud transformation. Connelly, who has been with DHS for the past six years, helped lead the TIC initiative, including the recent updates in TIC 3.0.

Challenges for TIC in today’s environment

Connelly first explained that the policy originated in 2007 as a way for OMB to determine how many external connections were being used by federal networks. The number of connections was “eye-opening” – and OMB found the security surrounding these connections wasn’t consistent, even within the same agency. The original policy required external connections to run through the TIC, with a standard set of firewalls to give agencies baseline security. But today, as mobile devices multiply and cloud adoption expands, the perimeter is dissolving. This evolving landscape makes it difficult for agencies to determine which connections are internal or external to their network.

Where do we go from here?

When I asked Connelly how TIC 3.0 will modernize internet security, he echoed Federal CIO Suzette Kent by saying “flexibility and choice.” Instead of two choices – internal or external – TIC 3.0 offers three trust zones: low, medium, and high. He said, “It changes the game entirely.” Agencies now have a responsibility to determine the appropriate trust zone for their networks.

Connelly added, “If you look at today’s environment, you’ve gone from fixed assets and desktops – and now you have mobile assets, mobile devices, and pretty soon the platform is not even going to matter… so we have to make sure the policy and reference architecture can support all three models going forward.”

Catalog of use cases

One important aspect of the draft TIC 3.0 policy is the addition of use cases that encourage moving TIC functions away from perimeter-based, single-tenant appliances to a multi-tenant, cloud service model. As agencies develop TIC 3.0 solutions, it is vital they share them across government, providing other IT leaders the opportunity to compare their security requirements, review the viable and tested options, and avoid reinventing the wheel.

Connelly shared that the use cases will come out on a consistent basis, resulting in a “catalog approach to use cases.” Agencies can propose pilot programs through the Federal CISO Council; DHS and OMB will then work with the agencies on their pilots. The pilot programs will provide agencies with use case examples and lessons learned.

When can we expect the final policy?

The final TIC 3.0 policy will be issued later this year. Connelly confirmed the final policy will look “very similar” to the draft policy.

Increased cloud adoption across the federal space will lay the foundation for emerging technology, shared services, and the ability to meet the expectations of a federal workforce that wants simple, seamless access to applications and data.

TIC 3.0 is an important step forward to expand cloud security options and remove a significant cloud barrier. With these flexible new guidelines, we should see accelerated cloud adoption in government. I’m excited to see the innovation ahead.

Is Your AI Program Stalled? Consider “as-a-Service”
https://www.meritalk.com/is-your-ai-program-stalled-consider-as-a-service/ – Tue, 09 Apr 2019

Two months ago, the President signed an executive order to accelerate the research and development of artificial intelligence tools in government. It laid out six strategic goals for federal agencies:

  1. Promote “sustained investment” in AI R&D with industry, academic and international partners.
  2. Improve access to “high-quality and fully traceable federal data.”
  3. Reduce the barriers to greater AI adoption.
  4. Ensure cybersecurity standards to “minimize vulnerability to attacks from malicious actors.”
  5. Train “the next generation” of American AI researchers.
  6. Develop a national action plan to “protect the advantage” of the United States in AI.

While the order does not provide additional funding, agency heads were directed to set aside funding in their 2020 budgets. Without additional funds, these AI programs will compete for budget allocation with operations and maintenance requirements, data center modernization, and other infrastructure needs. One option to ease the financial pressure while advancing an AI strategy is to look to an as-a-Service model. The AI-as-a-Service market is expected to grow at a compound annual growth rate (CAGR) of 48.2 percent, from $1.5 billion in 2018 to $10.8 billion in 2023 (Markets and Markets press release).
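As a quick sanity check on that projection, plugging the cited endpoints into the standard CAGR formula gives roughly the same growth rate (the small difference from 48.2 percent is rounding in the endpoint figures):

$$
\mathrm{CAGR} = \left(\frac{10.8}{1.5}\right)^{1/5} - 1 \approx 0.48,
\qquad
1.5 \times (1.482)^{5} \approx 10.7\ \text{billion}.
$$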

AI as-a-Service can mean many different things. On one end of the spectrum, cloud-based APIs help organizations speed up their AI programs, analyzing data and adding application features based on their requirements. On the other end, AIaaS can mean AI-ready infrastructure delivered in a consumption-based model, where hardware and software are consumed through an OpEx funding model. It could be a fully managed service, or agencies could choose to manage it themselves. They may also opt for professional services, with data scientists and analysts available on a contract basis, or they can leverage internal expertise and resources. The balance of how an agency uses as-a-Service is unique to its requirements, but it should be a consideration as the agency develops a strategy to ensure a successful AI implementation.

Information Age identifies seven steps to a successful AI implementation:

  1. Clearly define a use case.
  2. Verify the availability of data.
  3. Carry out basic data exploration.
  4. Define a model-building methodology.
  5. Define a model-validation methodology.
  6. Automate and roll out to production.
  7. Continue to update the model.

While we typically associate AI with technology, a successful deployment is equal parts planning, process, and infrastructure. As agencies move forward with their AI plans, they should consider which phases of this process can be delivered as-a-Service to reduce the burden on budget, expertise, resources, and technology. This can not only alleviate financial pressure, but also accelerate time to value and provide an agile acquisition model that allows agencies to pivot when necessary. Coupled with a marketplace that enables fast acquisition of technology and an easy way to manage the as-a-Service environment end-to-end, agencies will be well positioned to deliver against AI mandates while maintaining their existing IT infrastructure.

To learn more about how as-a-Service options can jumpstart agency AI progress, download the “AI and the Infrastructure Gap” infographic.

Scott Aukema is Director of Solutions Marketing at ViON Corporation, with 15 years of experience supporting public sector, commercial, and enterprise segments.

The Next Computing Wave: Ultra Powerful, Ultra Accelerated, Ultra Connected
https://www.meritalk.com/the-next-computing-wave-ultra-powerful-ultra-accelerated-ultra-connected/ – Mon, 25 Mar 2019

Never, in human history, have we seen this much technological change in such a short period of time. As technology grows more powerful, every facet of society has raced to adapt. It is exciting (and perhaps a bit daunting) that approaching advancements will probably hit faster, and may be even more dramatic, than the changes that came before. More specifically, several emerging technologies – Bluetooth 5, 5G, WiFi 6, and quantum computing – are poised to profoundly change our lives. The first three are wireless technologies, so our wearable and carryable devices will generate and process more data much faster and help us navigate our paths more effectively than before.

While it’s still early in deployment, 5G will deliver unprecedented wireless access speeds. Case in point: It will take less than four seconds on a 5G network to download a two-hour movie, while it takes five or six minutes on a 4G network today. With that much speed and power at our fingertips, it’s difficult to fully anticipate how we will use it, but its impact will be striking, and it will undoubtedly play a critical role in emerging businesses and applications, like self-driving cars, augmented reality, or the Internet of Things (IoT).
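The arithmetic behind that comparison is easy to reconstruct if we assume, for illustration, a 3 GB movie file and effective speeds of about 75 Mbps on 4G and 7 Gbps on 5G (assumed figures for the sake of the example, not ones from this article):

$$
\frac{3\ \text{GB} \times 8\ \text{bits/byte}}{75\ \text{Mbps}} = \frac{24{,}000\ \text{Mb}}{75\ \text{Mbps}} = 320\ \text{s} \approx 5.3\ \text{min},
\qquad
\frac{24\ \text{Gb}}{7\ \text{Gbps}} \approx 3.4\ \text{s}.
$$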

Another breakthrough technology in the wireless arena is WiFi 6 (802.11ax). It’s coming soon, and it’s built for IoT. It will connect many, many more people to mobile devices, household appliances, and public utilities, such as the power grid and traffic lights. Transfer rates with WiFi 6 are expected to improve anywhere from four to 10 times current speeds, with a lower power draw – that is, while using less electricity. The benefits may not be as explosive as 5G’s, but the impact will be consequential.

In the hardware space, quantum computers could revolutionize everything from encryption to medicine. It’s hard to remember a time in recent history when quantum computing wasn’t considered a distant dream. After decades of research, though, we may finally see actual benefits of quantum computing. It won’t happen tomorrow – in fact, it could still be a few years off – but when it hits, it will fundamentally change how we use technology, possibly spawning new industries we couldn’t even conceive of in the 20th century. And we’re actually testing it now, several years before we thought we could make use of it.

One key thing to note about this new technological terrain: it wouldn’t be possible without the cloud. The network revolution mentioned above is, simply put, built to handle the supernova explosion of Internet of Things (IoT) devices. These devices (aka sensors) are going to create and store massive amounts of data in the cloud – all the time. The flexibility of the cloud allows service providers and developers, at home and in enterprises, to modify applications in near-real time. In fact, almost all AI-based applications and machine learning programs will be built in the cloud, including the wireless apps used in retail, manufacturing, transportation, and more. Two key accelerator techniques are serverless computing and edge computing. We will cover these topics in depth in a future communication.

At NASA JPL, we expect the new wave of computing technologies to have a materially positive effect on our work. Here are but a few examples: Use of GPUs has sped up processing by as much as 30 times. We use machine learning to create soil moisture data models, which are key for crop forecasts. Deep learning helps us detect spacecraft anomalies. AI has proven effective in use cases as diverse as defending the network from attacks and finding interesting rocks on Mars. IoT helps us measure particles in clean rooms, improve safety, reduce energy usage, control conference rooms, and much more. The cloud reduces the cost – but increases the speed – of experimentation, encourages innovation, and provides the flexibility that allows us to pivot when we fail. We are accelerating the speed of processing by using high-performance computing and specialized, ultra-efficient processors such as GPUs, TPUs, FPGAs, quantum computers, and more, all in the cloud. This increased pace of innovation is necessary because the future is barreling at us at interstellar speed, but with our work firmly planted in the cloud, we are eagerly looking forward to it.

Tom Soderstrom is the IT Chief Technology and Innovation Officer at the Jet Propulsion Laboratory (JPL). He leads a collaborative, practical, and hands-on approach with JPL and industry to investigate and rapidly infuse emerging IT technology trends that are relevant to JPL, NASA, and enterprises.

 

Why Cyber Security and Cloud Computing Personnel Should Be BFFs
https://www.meritalk.com/why-cyber-security-and-cloud-computing-personnel-should-be-bffs/ – Tue, 12 Feb 2019

Keeping up with hackers is no small task, particularly since some attacks can be sustained in near-perpetuity. No doubt, securing cyberspace is a daunting responsibility. But focusing on security and using emerging technologies can help us meet the challenges. In particular, when the cyber security personnel of cloud providers and cloud customers work closely together, they become a force multiplier, proactively defending the enterprise from attacks and reacting more quickly when breaches occur.

Thanks to advancements in artificial intelligence (AI) and supercomputing, federal networks could soon be entirely self-fortifying and self-healing. The goal is to use AI-driven algorithms to autonomously detect hackers, defend the network from cyber-attacks, and automatically respond at lightning speed to an assault. Until that day, though, we rely on a handful of tools to defend our networks from phishing, vishing, or Distributed Denial of Service (DDoS) attacks.
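One small building block of that vision already exists today: off-the-shelf anomaly detection. The toy example below flags unusual login behavior with a standard library model; the features and numbers are invented for illustration and are not part of any federal system.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Each row: [login hour (0-23), megabytes transferred, failed login attempts]
normal_activity = np.array([
    [9, 120, 0], [10, 80, 0], [14, 200, 1], [16, 150, 0], [11, 95, 0],
    [13, 110, 0], [15, 175, 1], [9, 60, 0], [10, 140, 0], [17, 90, 0],
])

detector = IsolationForest(contamination=0.1, random_state=0)
detector.fit(normal_activity)

# A 3 a.m. session moving 5 GB with repeated failed logins should stand out.
suspicious = np.array([[3, 5000, 7]])
print(detector.predict(suspicious))   # -1 means anomaly, 1 means normal
```

Production systems layer far richer signals and automated response on top, but the learn-normal-then-flag-unusual loop is the same.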

The burden of securing a network and its devices no longer falls exclusively on systems administrators or security professionals; it requires the active participation of every single network user. Continuous user training – especially role-based training, in which individuals receive customized security courses based on their specific tasks – is a good supplementary defense. A senior or C-level executive, for example, may need to be trained to identify suspicious emails that contain fund-transfer requests; IT professionals may receive more technical training about different types of attacks and possible responses.

Role-based training isn’t enough on its own, though. We’re exploring how to incorporate emerging technologies into our overall cyber security strategy. Default encryption, in which data is encrypted by default in transit and at rest, is now possible (if not commonplace) in the cloud. We are calling on industry innovators to advance the state of the art so that memory can be encrypted and protected as well. In fact, by partnering our own cyber security teams with those of the cloud providers, we can significantly advance our cyber security defense. So how do we combat all the scary bot attacks we hear about? Well, why not use the intelligence and data gathered from Internet of Things (IoT) sensors and the power of AI and the cloud to predict, act, and protect our assets? Why not, indeed.
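For readers who want to see what “encrypted at rest” looks like in code, here is a minimal sketch using a common open source library. The file name is illustrative, and in a real deployment the key would live in a managed key service rather than next to the data.

```python
from cryptography.fernet import Fernet

# In practice the key lives in a key-management service, never beside the data.
key = Fernet.generate_key()
cipher = Fernet(key)

plaintext = b"mission telemetry record 42"
ciphertext = cipher.encrypt(plaintext)           # this is what actually lands on disk

with open("record.enc", "wb") as f:
    f.write(ciphertext)

with open("record.enc", "rb") as f:
    recovered = cipher.decrypt(f.read())         # only holders of the key can read it

assert recovered == plaintext
```

The value of “default” encryption is that this step happens automatically for every object written, rather than depending on each developer remembering to do it.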

Blockchain, an incorruptible distributed-verification ledger, could be useful to ensure that the data has not been tampered with. While we’re taking a wait-and-see approach to blockchain, the potential is rich and the technology is developing at a tremendous pace. Several new database announcements from the cloud vendors may help provide many of these benefits and are worth investigating.

There are countless budding technologies that may become integral to our cyber security infrastructure in the future – everything from biometrics to quantum computing. The latter could have huge implications for both encryption and decryption. But every tool we use today requires that we have the agility and ability to move resources and experiment at a moment’s notice. Securing the network isn’t just a matter of protecting corporate secrets – for federal organizations, it’s a matter of defending national interests. The cloud provides the computing power and scalability to secure our most valuable assets. We now have to step up to the plate and build cyber security into everything we do – because when the easiest path for the end user is also the most secure, we’re on the right one.

Tom Soderstrom is the IT Chief Technology and Innovation Officer at the Jet Propulsion Laboratory (JPL). He leads a collaborative, practical, and hands-on approach with JPL and industry to investigate and rapidly infuse emerging IT technology trends that are relevant to JPL, NASA, and enterprises.

 

Cheaper, Faster, Smarter: Welcome to the Age of Software Defined Everything
https://www.meritalk.com/cheaper-faster-smarter-welcome-to-the-age-of-software-defined-everything/ – Tue, 11 Dec 2018

The benefits of Software Defined Everything (SDE) – in which a physical infrastructure is virtualized and delivered as services on demand – will change the way we design, build, and test new applications. In a virtual environment, operational costs drop precipitously as the pace of innovation accelerates. Containerization – a way to run separate applications virtually without dedicating specific machines to each application – allows organizations to build, fail, test, repair, improve, and deploy new apps at a breakneck pace. Application Programming Interfaces (APIs) allow us to build on and reuse other people’s software packages quickly and inexpensively.

What’s the growth path for programmers in the US? I’ve seen many numbers and predictions, and I think most are too conservative. Why? Because the nature of “programming” is expanding. JPL, NASA, Facebook, Uber, and others all employ traditional software engineers, and we will still need even more. But what about all those who program Alexa or Google Home, or program their 3D printers or home automation systems, or compose music on their computers? They are using software to define their environment, and that’s the essence of SDE, where everything becomes programmable or configurable.

If programming becomes democratized, what will the most popular languages be? Today, Python seems to be growing the fastest. Tomorrow, I think APIs will prove the most valuable because they can access pre-written software libraries and change their own environment in minutes. This assumes that people can get to the underlying data and combine it with other data. The cloud will come to the rescue, as most data will already be stored there. The companies that are able to take advantage of this agility – and use data from inside and outside their organizations – will quickly establish a competitive advantage.
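A few lines of Python show what that kind of API-driven programming looks like in practice. The example calls a public echo service (httpbin.org) as a stand-in for whatever data API an organization actually exposes.

```python
import requests

# httpbin.org is a public echo service; any JSON API your organization exposes works the same way.
response = requests.get(
    "https://httpbin.org/get",
    params={"station": "mars-yard", "limit": 10},
    timeout=10,
)
response.raise_for_status()
payload = response.json()

# The service echoes the query parameters back under "args" -- a stand-in for real data.
print(payload["args"])   # {'limit': '10', 'station': 'mars-yard'}

# Combining remote data with local logic is then just ordinary Python.
if int(payload["args"]["limit"]) > 5:
    print("large enough sample to chart")
```

The point is the low ceremony: no servers, no schemas to stand up, just a request, a JSON payload, and a few lines of local logic.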

If an environment can be accessed through software, it can be automated through artificial intelligence (AI). The combination of data available in the cloud, accessible APIs, the Internet of Things, automation and AI skills, available open source code, and distributed serverless computing to execute the automation inexpensively will speed progress toward a self-configuring, self-healing environment for the companies that become the leaders.

SDE, for example, is one of the most technologically revolutionary things we’re adopting as it promises to dramatically decrease the cost and increase the agility of dealing with our networks. The cloud enables these changes, in part, because capacity and bandwidth can be sliced on the fly. It’s a departure from what we see today, where network engineers set up a physical infrastructure that can be narrowly used by specific people for specific purposes.

SDE touches upon every aspect of an organization’s technology strategy, ranging from security to storage. It will allow systems administrators to view and manage an entire network from a single screen. Increasingly, it also allows networks to run themselves, particularly where there is simply not enough manpower for the job. Netflix, for example, found that in at least one region it was making a production change every 1.1 or 1.2 seconds. The solution? The company built a self-healing network that automatically monitors production environments and makes real-time operational decisions based on problems identified through AI.
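In spirit, a self-healing system is a monitoring loop wired to automated remediation. The sketch below is a toy illustration of that pattern – it is not Netflix’s or JPL’s system, and the telemetry and remediation functions are stand-ins.

```python
import random
import time

ERROR_THRESHOLD = 0.05   # 5 percent error rate triggers automated action

def read_error_rate(service):
    """Stand-in for a real telemetry query (Prometheus, CloudWatch, etc.)."""
    return random.uniform(0.0, 0.10)

def remediate(service):
    """Stand-in for an automated action: restart, roll back, shift traffic."""
    print(f"[action] restarting {service} and rolling back the last change")

def control_loop(services, cycles=3):
    for _ in range(cycles):
        for svc in services:
            rate = read_error_rate(svc)
            if rate > ERROR_THRESHOLD:       # the system decides, not a person
                remediate(svc)
            else:
                print(f"[ok] {svc} error rate {rate:.3f}")
        time.sleep(1)                        # in production this loop never stops

control_loop(["api-gateway", "stream-encoder"])
```

What makes the real systems interesting is what goes inside those two stand-in functions: richer telemetry on one side and AI-driven decisions on the other.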

The move toward SDE also means that proprietary technology – such as APIs – will become a differentiating factor for businesses. In fact, it will become so important that we may see programming taught in schools alongside reading, writing, and arithmetic. Those coding skills could be used to customize anything, from home appliances to self-driving cars.

It’s reasonable to assume that in the not-too-distant future, federal agencies will run software-defined networks that are auto-scaling, self-fortifying, and self-healing. As precious resources are freed up, government organizations can better solve real-life problems, closer to agencies’ missions, and further away from the labor-intensive effort of maintaining a physical infrastructure. And, of course, it doesn’t hurt that it will save the government millions of dollars in the process.

Tom Soderstrom is the IT Chief Technology and Innovation Officer at the Jet Propulsion Laboratory (JPL). He leads a collaborative, practical, and hands-on approach with JPL and industry to investigate and rapidly infuse emerging IT technology trends that are relevant to JPL, NASA, and enterprises.

How Can We Benefit from Changing Work Habits?
https://www.meritalk.com/how-can-we-benefit-from-changing-work-habits/ – Tue, 27 Nov 2018

It’s hard to know for certain what the U.S. economy will look like in 10 years’ time, much less what the work day will look like. One thing we do know: As younger generations enter the workforce, they will bring new habits, technology, conventions, and expectations that will likely transform business and government.

With the rise of the “gig economy” (that is, independent contractors who work on a task-by-task basis), younger generations already expect a certain amount of freedom and autonomy in their careers. And that’s probably a reasonable expectation, given that they walk around with supercomputers in their pockets, just a click or voice command away from talking with anyone – or controlling equipment – at any time.

Just as millennials eschew traditionally wired conventions such as cable TV and fixed phone lines, and embrace the sharing economy – ridesharing, open work spaces – the next generation of workers may not value the same things people want today. They may not be motivated by a base salary, corner office, or a lofty title.

Businesses may also want to provide customized work experiences for their employees in the not-too distant future. Instead of an email notification for an upcoming meeting, an intelligent digital assistant may remind an employee of the meeting and automatically book transportation to the meeting based on the individual’s schedule, traffic conditions, and so on. While this already exists, the scope will be greatly expanded.

Gamification strategies could play a key part in helping businesses better understand what drives their employees. When we talk about gamification, we’re not talking about turning everyday tasks into cartoonish video games, but rather figuring out how to keep employees engaged. This includes setting up structures that allow them to compete against themselves or others (individually or in teams) in completing certain tasks or training exercises. These structures will likely use open spaces and the Internet of Things (IoT), with wearables, augmented reality, and in-the-room sensors collecting data. Outcomes and effectiveness will be measured and communicated in real time using analytics that draw on massive amounts of data from both inside and outside the organization, all stored in the cloud. JPL has already tested this with scientists and operators navigating the Mars rovers, and it has proved to be very effective.

At JPL, we’re trying to meet the changing needs of the business by benefitting from these trends. For example, to quickly and inexpensively help solve the problem of finding parking at JPL, we held a month-long hackathon in which teams of interns created prototype mobile phone solutions. There were two winning teams, and their ideas were incorporated into a mobile solution now used every day by JPLers. As you approach JPL, your mobile phone will speak and tell you where parking is available. You can also look at historical data to know what time to leave home, or when to go to lunch, so you can find parking when you arrive. The data is also used to predict parking during public events.
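A stripped-down version of the historical-data idea fits in a few lines: average past occupancy by hour, then use the averages to suggest an arrival window. The sample data here is invented; the real app draws on live and historical sensor feeds.

```python
from collections import defaultdict
from statistics import mean

# (hour of day, free spaces) samples from previous weeks -- invented numbers.
history = [(7, 220), (7, 240), (8, 90), (8, 110), (9, 15), (9, 25), (12, 60), (12, 80)]

by_hour = defaultdict(list)
for hour, free in history:
    by_hour[hour].append(free)

forecast = {hour: mean(samples) for hour, samples in by_hour.items()}

for hour in sorted(forecast):
    print(f"{hour:02d}:00 -> about {forecast[hour]:.0f} spaces expected")
# The phone app can then speak the best arrival window, e.g. "arrive before 08:00".
```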

Other examples include interacting with intelligent digital assistants for many things, including finding conference rooms, hearing what’s happening on Mars, or learning when the International Space Station will be overhead.

As we approach what will likely be a rich and exciting decade, the most important thing organizations can do right now is lay the groundwork for change in two areas.

Technologically speaking, that means embracing cloud computing, IoT, and the wireless network improvement waves that are rapidly approaching. In particular, serverless computing is cost-effective, and it allows organizations to innovate and experiment with things like artificial intelligence and augmented reality. Edge computing allows businesses to employ these capabilities at huge scale and speed, which will further solidify the real-time gamification and wireless communication future that the next generation will expect.

Humanologically speaking (yes, I know that’s not a word, but perhaps it should be), we can set up innovation labs and experiment in our own environments to quickly decide what to abandon and where to double down. From our experience so far, you will find willing participants in the new workforce and experiments in a safe, protected environment will serve as training opportunities and quickly evolve to produce lasting positive outcomes. And, in case you wonder, yes, it’s fun.

Tom Soderstrom is the IT Chief Technology and Innovation Officer at the Jet Propulsion Laboratory (JPL). He leads a collaborative, practical, and hands-on approach with JPL and industry to investigate and rapidly infuse emerging IT technology trends that are relevant to JPL, NASA, and enterprises.

How NASA Is Using Ubiquitous Computing to Bring Mars to Earth
https://www.meritalk.com/how-nasa-is-using-ubiquitous-computing-to-bring-mars-to-earth/ – Tue, 30 Oct 2018

When the term “ubiquitous computing” was first coined in the late 1980s, it was envisioned as a sea of electronic devices “connected by wires, radio waves, and infrared” that are so widely used, nobody notices their presence. By that definition, the era of ubiquitous computing arrived some time ago. Everything from movie theater projectors to plumbing systems has gone digital, paving the way to meaningful, data-driven change in our daily lives.

A lot of people remain skeptical of the Internet of Things (IoT). They may not realize that they already depend upon it. While IoT often conjures images of futuristic devices – senselessly superpowered refrigerators or embarrassingly smart toilets – there are good reasons why Americans are buying thousands of programmable appliances every month. Smart devices provide an improved service to consumers, and the data we can collect from them are invaluable, with the potential to revolutionize everything from traffic management to agriculture.

It also means that the idea of a computer – as some sort of electronic box typically used at a desk – is woefully outdated. In fact, the new supercomputer is the one we carry in our pockets, the smartphone, which is rapidly becoming our ever-present intelligent digital assistant. Ubiquitous computing is changing the way we interact with everyday objects. We don’t flip on light switches; we instruct a digital assistant to turn our lights on for us – or, in some cases, we just push a button on our phones. If a smoke alarm goes off, we don’t need to get out the ladder and manually turn it off – we can see the source of the smoke on our phones and silence the alarm with a mere swipe.

The Curiosity rover, the most technologically advanced rover ever built, has 17 cameras and a robotic arm with a suite of specialized lab tools and instruments. It required the work of 7,000 people, nationwide, to design and construct it over the course of five years.

At JPL and NASA, one of countless ways ubiquitous computing has woven its way into our work is through augmented reality (AR). Today, if anyone wants an up-close look at Curiosity, they need only use their phones or a pair of smart goggles. JPL built an augmented reality app that allows you to bring Curiosity anywhere – into your own home, conference room, or even backyard. The app lets you walk around the rover and examine it from any angle, as if it were actually there. In addition, scientists can don AR glasses and meet on the surface of Mars to discuss rocks or features – all from their own homes or conference rooms scattered across Earth.

AR may feel like magic to the end user, but it’s not. It’s the culmination of decades of technological advancements. It requires an assortment of sensors (light and motion), cameras, and substantial data processing power – power that only became affordable and available via mobile devices in recent years. In fact, we are now seeing the initial swells of a major ubiquitous computing wave that will hit our shores within the next few years. The entire wireless networking industry is being revolutionized to meet the needs of exponentially more devices communicating with each other, and with us, all the time. That’s when we all become IoT magicians in our daily lives and when that second brain (the smartphone) fires on all neurons. (More about that in a future blog.)

Now we can use AR for more than just finding Pokemon in the wild – we use it to review and build spacecraft. We can get a detailed look at a vehicle’s hardware without actually taking it apart, or we can see if a hand might fit in a tight space to manually turn a screw. Among the many advantages of augmented reality: It’s cost-efficient for multiple people to work together over a virtual network and it could easily be used for hands-free safety training, testing, or maintenance.

While AR is dependent on a slew of technologies, perhaps the most critical piece is the cloud. A lot of AR applications would be cost-prohibitive without the supercomputing power available over the cloud. Based on our experience at JPL, we estimate that serverless computing can be up to 100 times less expensive than N-tier, server-based computing. Not surprisingly, we’re now starting to use serverless computing as often as we can. What it really means is that we don’t have to worry about how a problem is solved; we just have to worry about what problems we’re solving. And that’s a powerful position to be in.
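For readers who haven’t worked with it, a serverless function is just a handler that the cloud platform invokes on demand – there is no server to provision, patch, or pay for while idle. A minimal AWS Lambda-style handler in Python looks roughly like this (the event fields are illustrative):

```python
import json

def lambda_handler(event, context):
    """Invoked by the platform per request or event; no server to provision or patch."""
    image_id = event.get("image_id", "unknown")
    # ...the actual work goes here: tag an image, resize it, score it with a model...
    result = {"image_id": image_id, "status": "processed"}
    return {"statusCode": 200, "body": json.dumps(result)}

# Local smoke test -- in the cloud, the platform supplies event and context.
if __name__ == "__main__":
    print(lambda_handler({"image_id": "PIA12345"}, None))
```

Because you pay only per invocation, experiments that sit idle most of the time cost almost nothing – which is where much of that cost difference comes from.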

Tom Soderstrom is the IT Chief Technology and Innovation Officer at the Jet Propulsion Laboratory (JPL). He leads a collaborative, practical, and hands-on approach with JPL and industry to investigate and rapidly infuse emerging IT technology trends that are relevant to JPL, NASA, and enterprises.

The Rise of Intelligent Digital Assistants
https://www.meritalk.com/the-rise-of-intelligent-digital-assistants/ – Tue, 09 Oct 2018

Before we can have a rational discussion about artificial intelligence (AI), we should probably cut through the hysteria surrounding it. Contrary to popular opinion, we’re not talking about machines that are going to rise up, steal our jobs, and render humans obsolete.

We already rely on AI in countless daily tasks. The AI embedded in your email, for example, identifies and blocks spam; it reminds you to include an attachment if you’ve forgotten it; it learns which sorts of emails are most important to you and it pesters you to respond to an email that’s been neglected in your inbox for a few days. Chatbots, robot vacuums, movie recommendation engines, and self-driving cars already have AI built into them. Right now, its impact may be subtle. But, in the future, AI could save lives by, for example, detecting cancer much earlier so treatments can be that much more effective.
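The spam filter is a good miniature of how this kind of AI works: learn from labeled examples, then score new messages. Here is a toy version built with a standard machine learning library – the training emails are obviously invented, and a production filter uses far richer features.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

emails = [
    "Win a free cruise, claim your prize now",
    "Urgent: wire transfer needed today",
    "Agenda for Tuesday's project review",
    "Draft report attached for your comments",
]
labels = ["spam", "spam", "ham", "ham"]

# Bag-of-words features feeding a naive Bayes classifier.
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(emails, labels)

print(model.predict(["Claim your free prize today"]))      # expected: ['spam']
print(model.predict(["Attached is the revised agenda"]))   # expected: ['ham']
```

Everything else in this article – from rock-spotting on Mars to conference-room bots – is a variation on that same learn-from-examples loop, just with different data.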

Why is AI exploding right now? A unique mix of conditions makes this the right time to surf the AI technology wave. They include the availability of supercomputing power, venture capital, oceans of data, open source software, and programming skills. In this sort of environment, you can train any number of algorithms, prep them and deploy them on smart computers in a very short time. The implications for most major industries – everything from medicine to transportation – are tremendous.

At NASA JPL, we’ve integrated AI into our mission in a variety of ways. We use it to help find interesting features on Mars, predict maintenance needs for our antennas, and anticipate problems with the spacecraft. We’ve also come to depend upon intelligent digital assistants in our day-to-day operations. After a brainstorming session about how to experiment with applied intelligent assistance, we decided to throw AI at a mundane – but time-consuming – daily challenge: finding an available conference room. Later that night, we built a chatbot.

Chatbots are fast, easy to build, and they deliver value almost immediately. We build ours to be easy to access, with natural user interfaces. We then add the deeper (and more complex) AI on the back end, so the chatbots get smarter and smarter over time. You can text them, type to them, or speak to them through Amazon Alexa or Lex, and we collect user feedback to constantly improve them. Every “thumbs up” or “thumbs down” helps improve the next iteration, and we can mine the thumbs-downs to see which areas need the most work. We now have upwards of 10 emerging chatbots used for functions like acquisitions, AV controls, and even human resources. By thinking of this as a “system of intelligence,” we can extend the life of – and get more value from – legacy systems by teaching them how to respond to deeper questions. While applied artificial intelligence can conquer any number of menial tasks, it’s bound to have a significant effect on some of our bigger challenges, such as space exploration, creating new materials to be 3D printed, medicine, manufacturing, security, and education.
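Conceptually, a simple chatbot does two things: match an utterance to an intent, and log feedback for the next training pass. The sketch below illustrates that loop in plain Python; it is not our JPL bot, and the intents and example phrases are made up.

```python
import difflib

INTENTS = {
    "find_room":   ["find me a conference room", "is a meeting room free at 2pm"],
    "hr_question": ["how much vacation do i have", "when is open enrollment"],
}

feedback_log = []   # (utterance, intent, thumbs_up) tuples, mined to improve the bot

def match_intent(utterance):
    best_intent, best_score = "fallback", 0.0
    for intent, examples in INTENTS.items():
        for example in examples:
            score = difflib.SequenceMatcher(None, utterance.lower(), example).ratio()
            if score > best_score:
                best_intent, best_score = intent, score
    return best_intent if best_score > 0.5 else "fallback"

def record_feedback(utterance, intent, thumbs_up):
    feedback_log.append((utterance, intent, thumbs_up))   # thumbs-down items get reviewed first

utterance = "can you find me a conference room at 2pm"
intent = match_intent(utterance)
record_feedback(utterance, intent, thumbs_up=True)
print(intent)   # find_room
```

Real bots swap the string matcher for a trained language model, but the feedback log – the thumbs up and thumbs down – is what makes them smarter with every iteration.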

AI has especially rich potential in the federal government, where one small operational improvement can have an exponential effect. If you work in a large agency and you’re unsure of how to approach AI, you can start playing in the cloud with any number of cloud-based tools and services (such as TensorFlow and SageMaker). Chatbots are a natural starting point – they deliver value right away, and the longer you use them, the smarter and more effective they become. In the cloud, experimentation is inexpensive and relatively effortless. The goal, after all, is to have AI work for you, not the other way around.

Tom Soderstrom is the IT Chief Technology and Innovation Officer at the Jet Propulsion Laboratory (JPL). He leads a collaborative, practical, and hands-on approach with JPL and industry to investigate and rapidly infuse emerging IT technology trends that are relevant to JPL, NASA, and enterprises.

 
