Today every Federal agency is working to implement zero trust. Each will begin in a unique place that is dictated by its current cybersecurity posture, cybersecurity investments, and agency missions. Still, many questions must be answered as agencies plot their zero-trust journey. Which pillar in the Zero Trust Maturity Model is most urgent? Which data is most vulnerable to attack? How do you find and classify the most sensitive information?
MeriTalk sat down with Justin Wilkins, director of sales engineering, US Public Sector, at Varonis to understand how agencies can implement a data-first approach to zero trust. Today over 7,000 customers, including many important Department of Defense (DoD) missions and large Federal agencies, rely on Varonis to protect their most critical data.
MeriTalk: We often hear that identity is the place to start with zero trust implementation. But Varonis advocates a different approach, securing data first. Why is that?
Wilkins: We believe that zero trust must be built on a foundation of data security. Traditional cybersecurity started at the perimeter. The explosion of remote work and new access points has made the perimeter a lot harder to define and the data much more difficult to secure. Data doesn’t stay on endpoints. It’s stored on modern cloud and on-premises resources that make collaboration easier and security harder.
The potential impact of a compromised account is significant. This means that any zero trust approach must start with the agency’s most valuable asset – its data. Varonis focuses on securing data by automatically reducing the blast radius – the damage attackers can do following a breach – and leveraging machine learning to quickly identify advanced persistent threats, compromised accounts, and insider threats.
Data is almost always the target of any attack, whether it’s a supply-chain incident, rogue nation state, or malware. Before implementing zero trust architecture, we must first focus on identifying and protecting the most sensitive data. The National Security Agency (NSA) calls this out in its zero trust guidance, where the preparation phase includes identifying all sensitive data on the network.
And, in fact, the cybersecurity executive order and the January Office of Management and Budget (OMB) memorandum prioritize data protection. They require that Federal agencies develop data categorization to automatically restrict and monitor the sharing of sensitive content within 120 days. Once we secure our data, the remaining pillars of zero trust (identities, devices, networks, and application workloads) can be built on that foundation.
MeriTalk: How often is data overexposed, either unintentionally or made available to all employees? And how do you see this overexposure affecting Federal agency security?
Wilkins: Focusing on the perimeter creates a massive blind spot and leads to data overexposure. Varonis’s annual global data risk report found that the average user at a large organization has access to over 20 million files and the average organization has 20 percent of its data open to everyone in the network. Large organizations host hundreds of terabytes of data and each terabyte averages 1.2 million folders, each with unique permissions. We can’t implement zero trust when so much sensitive data is wide open.
The adoption of more cloud services and collaboration platforms also significantly increases the blast radius. For many collaboration platforms, security is managed by end users rather than IT. Users can share data and collaborate with people inside and outside of the network, and from any device, without authorization from an administrator or IT security person. This improves collaboration but creates a significant security risk.
MeriTalk: How does Varonis solve this problem?
Wilkins: This is a very difficult problem to solve at scale. Global group access – and especially open access to sensitive data – is the biggest security vulnerability for data. Global group access lets an adversary reach data without having to get creative or elevate their privileges. They just need to authenticate into the network.
Eliminating global group access, especially on sensitive data, and putting security controls in place to ensure that data is only accessible to the right users and groups, will go a long way toward reducing the blast radius. We’re going to stop users from moving laterally and force threat actors to work a lot harder. Varonis does this in a unique way, by observing how users interact with data, correlating that back to permissions, and understanding where we can safely remove access and implement tighter security controls – without affecting the mission.
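The core idea described above, comparing what users are permitted to access against what they actually access, can be sketched in a few lines. This is a hypothetical illustration, not Varonis's actual model; the folder paths, user names, and data shapes are all invented for the example.

```python
# Hypothetical sketch: correlate granted permissions with observed activity
# to find grants (including global groups like "everyone") that no usage
# justifies keeping. These names and structures are illustrative only.

granted = {                     # folder -> users/groups with access
    "/finance/budget": {"alice", "bob", "everyone"},
    "/hr/reviews": {"carol", "everyone"},
}
observed = {                    # folder -> users actually seen accessing it
    "/finance/budget": {"alice"},
    "/hr/reviews": {"carol"},
}

def removal_candidates(granted, observed):
    """For each folder, return grants that observed activity never exercised
    and that are therefore candidates for safe removal."""
    return {
        folder: users - observed.get(folder, set())
        for folder, users in granted.items()
    }

print(removal_candidates(granted, observed))
```

In this toy example, `bob` and the `everyone` group never touched `/finance/budget`, so both surface as candidates for access removal, shrinking the blast radius without affecting anyone who actually uses the data.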
MeriTalk: How do you partner with Federal organizations to determine what information should be protected or what permissions should be allowed or not allowed?
Wilkins: Our objective is always to implement a least-privilege model on sensitive data. The first step is the discovery process to accurately classify all sensitive data. This is specifically called out in the NSA guidance and the new M-22-09 zero trust memo as a requirement. Organizations develop the criteria to identify sensitive data, and then we extract the file contents and analyze them for sensitivity – using predefined rules for categories like controlled unclassified information, personally identifiable information, and protected health information.
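Rule-based classification of the kind described here can be as simple as matching extracted file contents against a library of sensitivity patterns. The two patterns below are deliberately minimal examples; production rule sets for categories like CUI or PHI are far more extensive and validated against false positives.

```python
import re

# Illustrative sensitivity rules only -- a stand-in for the much larger
# predefined rule sets (CUI, PII, PHI) described in the interview.
PATTERNS = {
    "PII:SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "PII:Email": re.compile(r"\b[\w.+-]+@[\w-]+\.[A-Za-z]{2,}\b"),
}

def classify(text: str) -> set[str]:
    """Return the set of sensitivity labels whose patterns match the text."""
    return {label for label, pattern in PATTERNS.items() if pattern.search(text)}

print(classify("Contact jdoe@agency.gov, SSN 123-45-6789"))
```

Each file's labels can then feed the access map built in the next step, so remediation decisions are driven by what the data actually contains rather than where it happens to live.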
Then we build a map of that sensitive data: who has access to it, how it’s used, and who potentially shouldn’t have access based on the user’s activity and their role in the organization. This gives organizations the ability to proactively eliminate high-risk artifacts like global group access and protect sensitive information without affecting end users.
Next we implement least privilege on the data to prevent users from moving laterally to unauthorized resources, which is a key aspect of zero trust. Then we leverage user behavior analytics and advanced analytics to identify abnormal activity and automate threat detection. IT teams no longer need to comb through raw log information in different formats. We normalize the data and automatically notify organizations when they need to take action. The outcome is that sensitive data is locked down, protected, and monitored for signs of misuse or abuse.
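The behavior analytics step can be illustrated with a deliberately simple baseline model: flag a user whose activity deviates sharply from their own history. Real detection of the threats named above uses much richer signals; the threshold, metric, and numbers here are assumptions for the sketch.

```python
from statistics import mean, stdev

def is_abnormal(history, today, threshold=3.0):
    """Flag activity more than `threshold` standard deviations above the
    user's historical mean daily access count. A toy stand-in for the
    user behavior analytics described in the interview."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return today > mu
    return (today - mu) / sigma > threshold

baseline = [40, 55, 48, 52, 45, 50, 47]   # files touched per day (invented)
print(is_abnormal(baseline, 49))     # a typical day
print(is_abnormal(baseline, 5000))   # a possible mass-access event
```

The value of normalizing logs first, as described above, is that a single baseline model like this can run across every platform's activity stream instead of one detector per log format.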
MeriTalk: Where are agencies having the biggest successes as they implement zero trust? And what are their biggest obstacles?
Wilkins: Organizations that fully commit to a zero trust architecture tend to be the most successful. They build teams dedicated to the process. They’re looking at everything holistically to understand the interconnectivity and synergistic value between the different components.
One of the biggest challenges with zero trust is that it completely changes the security paradigm. Organizations traditionally focused on the perimeter, leaving everything inside of the network relatively unprotected. At Varonis, we’ve always referred to this as the candy bar approach – a hardened perimeter with a very soft and vulnerable inside. Now organizations must pivot and focus not only on the perimeter but also on ensuring that they have visibility into how users are accessing resources, systems, and data inside of the network. Monitoring users and data at scale, while also performing the analytics required to detect misuse or abuse of privileges, is a very challenging problem to solve.
Performing large-scale data classification is another big challenge for most organizations. If you can’t identify your most sensitive data, you can’t secure it.
MeriTalk: How does the Varonis data security platform differ from other data security solutions on the market?
Wilkins: Varonis has always taken a data-centric approach to security. Varonis correlates critical metadata streams – like access rights, sensitive data, activity, and more – into a unified platform. This gives us the ability to automatically remediate overexposed data at a very large scale without impacting operations.
Because we’ve been doing this for a very long time, we also have a proven operational method for taking customers from a state of chaos to a secure zero trust architecture for their critical data.