Agencies Automate to Digest the Data Deluge

The Federal Data Strategy will bring important changes to how Federal agencies manage and use data, treating it as a strategic asset. To learn more about how the strategy is changing the Federal data landscape, MeriTalk connected with Mark D’Alessandro, Senior Director, Data Center Sales, Dell Technologies, and Kurt Steege, Chief Technology Officer, ThunderCat Technology.

MeriTalk: One of the milestone priorities for 2020 is to assess data and related infrastructure maturity. What areas do you think will emerge as priorities as a result of this audit?

Mark D’Alessandro, Senior Director, Data Center Sales, Dell Technologies

Mark: Federal agencies have more data than any company. Since leveraging that data extensively is relatively new for agencies, we need to start thinking about our priorities, challenges, and solutions differently.

One aspect that will be both a priority and a challenge is collecting data from sensor devices. Agencies can analyze some of that data at the edge, but quickly migrating the majority of data from the edge to data centers is a herculean task. Aided by 5G and other emerging technology, agencies will need to upgrade network infrastructure in order to handle the resulting traffic. With this in mind, automating the infrastructure is a must.

Kurt Steege, Chief Technology Officer, ThunderCat Technology

Kurt: Speaking of the amount of data and speed at which we will need to process it, I think back to last fall at a high-performance computing conference in Denver. There were plans for a cosmology system that could generate 157 terabytes of data per second, creating 18 petabyte data cubes for further processing. There’s so much information, and it’s causing some angst.

The Federal Data Strategy asks agencies to perform an initial assessment of data and their capability to use that data by September 2020. As Mark said, to advance their capabilities, agencies will need to incorporate automation and learning behaviors throughout their infrastructure, becoming more adaptive. Operationally within the data center, infrastructure will need the ability to surge in both capacity and power. These advancements will allow AI/deep learning to perform automated analysis and, ultimately, act on the information.

MeriTalk: What strategies can agencies adopt to address these maturity issues?

Mark: Before you can start digital transformation, you need IT transformation. Many of our digital challenges are rooted in our legacy infrastructure and mindset. There are three “must haves” to accomplish IT transformation:

  • Modernizing infrastructure: hyper-converged solutions, compared with traditional three-tier architectures, lend themselves to automated infrastructures;
  • Automation: with a hyper-converged solution in place, agencies can shift focus to provisioning, orchestrating, and automating the decisions that would have previously required an IT administrator; and
  • Workforce transformation: this is, by far, the hardest. Agencies will need to reskill and retrain, often pushing workers outside of their comfort zones. The key will be to express how these changes will bring value to their careers, creating new opportunities for data analysts, scientists, and programmers.

From there, agencies can begin to categorize data from a value perspective, both internally and externally, keeping in mind how the data might benefit someone outside of their agency.

Kurt: Once an agency has modernized its IT foundation and assessed its data, it can adopt a collaborative DevSecOps workflow to rapidly develop applications that use automation to leverage the data. This method encourages understanding of the underlying operational infrastructure and helps teams avoid creating applications in vacuums – both within and outside of an agency.

I think Mark and I have covered the proverbial three-legged stool – people, process, and tools. The tools alone, which are a primary focus for many, aren’t enough. Those tools need to be linked back to the data strategy – the process – and the people need to understand how they work.

MeriTalk: The idea of the democratization of data is vital to optimizing its value and impact across the Federal government. What progress have you seen on this front in the last five years? And what sticking points remain?

Mark: Some agencies are still cautious of sharing sensitive data, but we’ve made progress in the last five years, recognizing that data sharing is the foundation for data democratization.

One of the sticking points is that some agencies migrated the wrong workloads to public cloud during the Cloud First initiative. We’re getting better with Cloud Smart, but the initial migration created two issues:

  • Data locality matters for certain workloads. Migrating the wrong workloads to public cloud creates latency, and in the age of digital natives who want things immediately, any delay is a problem; and
  • The cost increase was more than anticipated. Then, application repatriation taxes and complexity made moving an application and its data back on-premises a challenge.

We’ve advanced significantly since Cloud First. We know that we can’t assume every agency or every IT person is facing the same challenges. So, first we want to understand what they know and what they really need.

Kurt: In my opinion, information safety and sharing is actually the key product of the government. It’s important to acknowledge that the same data can be used in a multitude of different ways. In this vein, it is important for groups and agencies to base their usage of data on personas. As agencies mature their data strategies, they should ask, “What types of people will use this information?” It could be an average citizen; it could be a scientist; it could be an operator. They’re all going to need access to the same type of information, but in very different ways.
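The persona idea can be sketched as a simple projection of one shared record onto the fields each audience needs. This is a minimal illustration, not any agency's actual schema – the record fields, persona names, and values below are all hypothetical:

```python
# One underlying record, served to different personas in different ways.
# All field names and values here are hypothetical illustrations.
RECORD = {
    "site": "Station 12",
    "summary": "Air quality: good",
    "raw_readings": [41.2, 39.8, 40.5],      # full-resolution sensor data
    "sensor_calibration": {"offset": 0.3},    # operational detail
}

# Each persona needs the same information, presented differently.
PERSONA_FIELDS = {
    "citizen":   ["site", "summary"],
    "scientist": ["site", "raw_readings", "sensor_calibration"],
    "operator":  ["site", "raw_readings"],
}

def view_for(persona, record):
    """Project a record onto only the fields a given persona needs."""
    return {field: record[field] for field in PERSONA_FIELDS[persona]}

print(view_for("citizen", RECORD))
# {'site': 'Station 12', 'summary': 'Air quality: good'}
```

The same pattern scales up: the authoritative dataset stays in one place, and persona-specific views control how it is exposed.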

MeriTalk: Back-ups have long been the bedrock for IT disaster preparation and recovery strategies, but often data is excruciatingly slow to restore. This reality does not align with the always-on requirements of an increasingly digital Federal government. What can agencies do to accelerate rapid restore capabilities?

Kurt: It is a big part of the overall data strategy: understand where the data is stored, how important it is, how it needs to be presented, and how it can be secured. The solution will differ depending on those four aspects. The key is the idea of enhanced metadata – understanding all of the bits and information about the data itself will help you determine what that strategy is.
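One way to picture metadata-driven restore planning is a small policy function keyed on those four aspects. This is only a sketch under assumed inputs – the tier names, thresholds, and metadata values are hypothetical, not a prescribed product behavior:

```python
# A hypothetical policy: enhanced metadata about a dataset drives the
# choice of restore approach. Tier names and rules are illustrative only.
def restore_tier(meta):
    """Pick a restore approach from a dataset's enhanced metadata."""
    if meta["importance"] == "mission-critical":
        return "replicated-hot-standby"        # near-zero restore time
    if meta["location"] == "public-cloud" and meta["security"] == "sensitive":
        return "encrypted-regional-replica"    # replicate within the region
    if meta["presentation"] == "always-on":
        return "flash-backed-rapid-restore"    # backup at flash speed
    return "standard-dedupe-archive"           # slower, cheaper default

dataset = {
    "location": "on-premises",
    "importance": "mission-critical",
    "presentation": "always-on",
    "security": "sensitive",
}
print(restore_tier(dataset))  # replicated-hot-standby
```

The point is not the specific rules but that the metadata, captured up front, makes the backup-and-restore decision automatic rather than ad hoc.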

Mark: In terms of specific capabilities to look for, I can list off a few:

  • Cloud-enabled solution to back up and restore to and from many clouds;
  • Real data protection;
  • De-dupe guarantees;
  • Native application integration to help with speed;
  • Network needs for where you are and where you will be in the future;
  • Scalability to accommodate data growth;
  • Backup at flash speed and capability to handle multiple backup streams;
  • Ability to replicate within a cloud region to minimize downtime; and
  • Application direct workflows for faster backup of mission critical applications.

MeriTalk: In June, the House Oversight and Reform Committee will release the 10th FITARA scorecard. How has FITARA helped to improve Federal IT – and data center – efficiency? How could it be improved?

Mark: The FITARA Scorecard has been really effective in removing the IT silos. In the old days, we had our server person, our storage person, and our networking person. It’s pretty clear that it’s just not a workable model with hyper-converged technologies.

I think FITARA could evolve to show the positive perspective, highlighting cost avoidance by departments that have modernized and are accelerating application time to mission.

Kurt: As far as potential improvements, I think the scorecard could evolve to include AI requirements or DevSecOps measurements, because both are necessary for agencies to achieve those Bs and As on the scorecard.

MeriTalk: What trends are you seeing in the Federal data center ecosystem? And, how are you helping Federal agencies to navigate these changing landscapes?

Mark: Overall, becoming more responsive to a digital native citizen is paramount. To do so, government will need to build AI and deep learning capabilities, enabling automation that allows citizens to easily interact at the edge.

We at Dell Technologies know that we can’t accomplish government digital transformation by ourselves, and we have a strong ecosystem of Federal systems integrators and partners, such as ThunderCat, so we can scale solutions as needs grow. Working together, we’re able to provide fully integrated platforms, such as VMware Cloud Foundation on VxRail, a hyper-converged infrastructure.

From there, we help agencies get easy wins by figuring out the right workloads for public cloud versus on-premises. We are here to help agencies understand the cost of digitizing records, implementing new infrastructure, migrating applications, and more, so they can have an accurate ROI estimate.

Kurt: ThunderCat has always taken an advisory approach to our engagements with Federal agencies. We have formed practices for cybersecurity, cloud, and DevSecOps to help agencies streamline cloud management, mitigate risk, and evaluate applications. All of this is a way of helping FSIs and agencies get the right outcome for immediate and future needs.

MeriTalk: To end on a fun note, what is your passion when you’re not focused on Federal IT?

Mark: I’m a big sports fan, and love attending live events with my sons. I’d never claim to be a wine connoisseur, but I do enjoy a glass.

Kurt: Other than trying to find a college for my daughter who is a senior right now, I’ve been a runner for over 40 years. My first race was in 1979, and you can still find me running The National Mall at 5 a.m.
