By Dr. Jim Matney, Vice President and General Manager, DISA and Enterprise Services, at General Dynamics Information Technology (GDIT)
As almost any cybersecurity professional will tell you, you can’t reliably predict which vulnerability a hacker will find and exploit. To fend off attacks, your defenses must be right 100 percent of the time. The hacker only has to be right once.
Quantum computing turns that all on its head. Why? Two reasons.
First, quantum computers are exponentially more efficient than classical computers at certain problems and can support more advanced and complex computing applications. The emergence of quantum as a compute resource for solving specific computing challenges is at once full of promise and peril.
Second, some encryption algorithms in use today can be broken by quantum algorithms that already exist, such as Shor’s algorithm. For a long time, that was fine, because no quantum computer was big enough or fast enough to run them at scale (i.e., a cryptanalytically relevant quantum computer, or CRQC). But that’s changing. American adversaries are investing heavily in quantum computing, making the threat of quantum-based attacks against our encryption algorithms far more compelling.
Moreover, we rely on encryption because data traversing networks can be intercepted. A persistent fear is that, even if bad actors can’t yet do anything with our data, they can harvest it now, store it, and decrypt it later once a CRQC exists.
So, what are Federal agencies to do in the face of this reality?
As the National Institute of Standards and Technology (NIST), the Cybersecurity and Infrastructure Security Agency (CISA), and the National Security Agency (NSA) all advise, agencies should continue to practice good cyber hygiene and should not yet purchase enterprise quantum-resistant algorithm solutions, except for piloting and testing. NSA expects the transition to quantum-resistant algorithms to be complete by 2035, once a national standard is adopted. In the meantime, the Commercial National Security Algorithm (CNSA) 1.0 and 2.0 suites provide the benchmarks for national security systems.
Additionally, agencies should take stock of their cryptographic assets and confirm that their scans are not missing devices, data stores, or connections to internet of things (IoT) and peripheral devices. Agencies should also develop a clear understanding and rating of the criticality and sensitivity of each set of data within their organization. While all data is important, bandwidth and resource limitations do exist, so agencies must develop a road map that secures the highest-sensitivity data first.
These are good practices in line with a zero trust approach, but the quantum threat provides extra impetus for getting this done as soon as possible. In addition, the White House issued a national security memorandum that sets the requirements for Federal agencies to prepare for this threat.
GDIT has developed a framework to help agencies prepare for the quantum threat and navigate the new standards, expected from NIST by 2024, that agencies will ultimately be required to implement. The GDIT Quantum Resilience Framework is a risk-based approach to implementing post-quantum cryptography. It includes these steps:
Assess Your Risk Profile
Begin by examining your overall encryption risk profile. Some algorithms will be severely impacted by quantum computing; others less so. Agencies should know where they stand relative to the latest NIST encryption standards and understand the risks of falling behind.
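As context for that assessment, here is a minimal, illustrative sketch of the kind of lookup an agency might build when profiling its algorithms. The names and impact ratings are assumptions drawn from public guidance on Shor’s and Grover’s algorithms, not an official assessment:

```python
# Illustrative only: relate common algorithms to their expected quantum impact.
QUANTUM_IMPACT = {
    # Public-key algorithms: broken outright by Shor's algorithm on a CRQC
    "RSA-2048": "severe",
    "ECDSA-P256": "severe",
    "DH-2048": "severe",
    # Symmetric ciphers and hashes: weakened by Grover's algorithm,
    # but larger keys/digests restore the security margin
    "AES-128": "moderate",
    "AES-256": "low",
    "SHA-256": "low",
}

def flag_at_risk(inventory):
    """Return the subset of an algorithm inventory rated 'severe'."""
    return [a for a in inventory if QUANTUM_IMPACT.get(a, "unknown") == "severe"]

print(flag_at_risk(["RSA-2048", "AES-256", "ECDSA-P256"]))
# -> ['RSA-2048', 'ECDSA-P256']
```

Even a simple mapping like this makes the first risk conversation concrete: it separates the asymmetric algorithms that must be replaced from the symmetric ones that mainly need larger key sizes.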
Inventory Your Encryption

Examine which encryption algorithms are used throughout your organization and which services they support. These can include web browsing, email, digital signatures, message digests, key exchanges, VPNs, enterprise data center transport, and data at rest. Determine which are most important to protect and understand the impact of a potential quantum-enabled attack on each.
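One small slice of such an inventory can be automated. The sketch below, assuming only Python’s standard `ssl` and `socket` libraries, probes a TLS endpoint and records the negotiated protocol version and cipher suite; the hostname is a placeholder, and a real inventory would also span VPNs, email, and data at rest:

```python
import socket
import ssl

def probe_tls(host, port=443, timeout=5):
    """Connect to a TLS endpoint and report what was negotiated.

    Returns (protocol version, cipher suite name, secret bits), e.g. a
    tuple like ('TLSv1.3', 'TLS_AES_256_GCM_SHA384', 256).
    """
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=timeout) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cipher_name, _proto, bits = tls.cipher()
            return tls.version(), cipher_name, bits

# Example usage (requires network access):
# print(probe_tls("example.gov"))
```

Results like these feed directly into the inventory: each endpoint’s cipher suite names the key exchange and symmetric algorithm in use, which can then be checked against the agency’s risk profile.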
Develop a Risk Response Strategy

With the impact assessment complete, create a risk response strategy (e.g., accept, avoid, transfer, or mitigate). Then prioritize your risk categories (critical, high, medium, low) and define risk tolerance statements for each, articulating what actions you will take, in what order, who will perform them, and on what time horizon.
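To make that step concrete, here is a hypothetical sketch of a risk register in Python; the field names and example services are illustrative assumptions, not part of GDIT’s actual framework:

```python
from dataclasses import dataclass

# Lower number = addressed sooner
PRIORITY_ORDER = {"critical": 0, "high": 1, "medium": 2, "low": 3}

@dataclass
class RiskEntry:
    service: str    # e.g. "VPN", "email"
    response: str   # accept | avoid | transfer | mitigate
    priority: str   # critical | high | medium | low
    tolerance: str  # who acts, and on what time horizon

def prioritize(register):
    """Order entries so the most critical services are addressed first."""
    return sorted(register, key=lambda e: PRIORITY_ORDER[e.priority])

register = [
    RiskEntry("email", "mitigate", "medium", "Upgrade within 18 months"),
    RiskEntry("VPN", "mitigate", "critical", "Pilot quantum-resistant key exchange this year"),
    RiskEntry("web portal", "accept", "low", "Revisit once NIST standards are final"),
]
print([e.service for e in prioritize(register)])
# -> ['VPN', 'email', 'web portal']
```

Keeping the register as structured data rather than prose makes the prioritization auditable and easy to re-sort as ratings change.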
Identify Solutions

Once you’ve prioritized the actions needed to protect critical services, examine the available solutions for each. There should be no appetite for accepting risk where an approved quantum-resistant solution exists. Agencies should explore viable near-term solutions to counter the harvest-now, decrypt-later threat; long-term solutions should be based on approved NIST and/or NSA standards.
Implement

Once NIST has fully vetted the candidate algorithms and issued guidance, implementing quantum-resistant algorithms that drive resiliency is the logical next step. Part of the implementation process involves following the risk prioritization schedule and being clear about which solutions will be implemented in what sequence.
Track to Completion
Agencies should be sure to track and document their solution implementation. This will provide a roadmap for future updates when new standards are released.
Monitor Continuously

As with anything cybersecurity-related, agencies should continuously monitor their encryption risk. Expect standards and solutions to be updated frequently in line with advancements in quantum computing and in the sophistication of hacking techniques.
To be quantum resilient across the enterprise, agencies should plan and budget for these activities now so they can implement new solutions as soon as the new standards are released. The goal is proactive planning that drives future security; improves trust in data confidentiality and integrity; lowers the risk the pending quantum threat poses to current encryption algorithms; and consistently broadens awareness of quantum’s impact on cybersecurity across the enterprise.