Failure to migrate to new quantum-safe cryptographic algorithms can leave application and data security at risk of compromise. Security and risk management leaders must anticipate “harvest now, decrypt later” attacks by preparing for a move to quantum-safe alternatives immediately.
Overview
Key Findings
- Quantum computing will cause existing asymmetric cryptography to weaken as we approach the end of this decade, leading to its replacement with quantum-safe cryptography.
- Most IT organizations do not know what cryptography they are using, including which applications use it, how it is used or who makes decisions about it.
- Developers are often blind to the details of cryptographic and hash function libraries, and sometimes even to hardware security modules (HSMs) or hard-coded cryptographic dependencies.
- New, quantum-safe algorithms have different performance metrics for standard operations like key generation and encryption/decryption times. Key/data sizes that need to be stored/processed may also be affected. Testing in applications that require specific performance characteristics will need to be a high priority. This will affect data retention and security policies.
- Upgrading cryptographic protocols in most enterprises takes several years and requires IT leaders to start immediately and provide the leadership necessary to deliver crypto transformation.
Recommendations
To help ensure the security of applications and data, security and risk management (SRM) leaders should:
- Prepare for changes to cryptographic algorithms by building an inventory of metadata for applications that use cryptography. This will give your organization a way to scope the impact of new cryptography, determine the risk to specific applications and prioritize incident response plans accordingly.
- Ask the vendors identified in the cryptographic inventory about their plans for becoming quantum-safe, their implementation roadmap and any performance impact you will need to account for.
- Work with security operations and software engineering leaders planning upgrades to cryptographic dependencies by creating centralized policies governing replacements. The goal is to allow applications to quickly adopt and test new quantum-safe algorithms as they become vetted over the next decade. This is best done through policy, rather than on an ad hoc basis.
- Evaluate your data retention policies and make appropriate changes with the expected postquantum cryptography (PQC) life cycle in mind. You’ll need to create and manage policies around algorithm substitution, data retention and the mechanics of how you’re going to swap out or modify your existing use of cryptography.
- Start your quantum crypto assessments today by creating a crypto center of excellence (CCoE) to coordinate cryptographic policy, retain valuable metadata about how algorithms are used and provide expertise to development teams.
Strategic Planning Assumption
By 2029, advances in quantum computing will make conventional asymmetric cryptography unsafe to use.
Introduction
The Rivest-Shamir-Adleman (RSA) algorithm has been the software developer’s workhorse for over 30 years. It has been used in almost every aspect of security, including X.509 certificates, digital signatures and identity and access management (IAM). Unlike cryptographic hash functions, which get broken and replaced relatively often, neither RSA nor elliptic curve cryptography (ECC) has ever suffered a successful compromise of its core algorithm. However, key cracking is just one of a small set of mathematically approachable problems that the new generation of commercial quantum computers is positioned to solve. Gartner predicts that by 2029, quantum computing will be able to weaken existing systems to the point that they are considered unsafe to use cryptographically.
Gartner expects the deprecation of RSA and ECC to unfold much in the way that the Secure Hash Algorithm 1 (SHA-1) hash function was progressively weakened over time. The first theoretical attacks on SHA-1 occurred in 2005,1 and by 2010, replacement programs were already well underway, with the National Institute of Standards and Technology (NIST) formally deprecating the algorithm in 2011. As a cautionary tale, according to Venafi, 35% of websites were still using SHA-1 in 2017, although it had been completely compromised by that point.2 This is not a mistake we can afford to repeat.
Replacement of existing algorithms has begun and is expected to accelerate in the wake of the fourth round of the NIST PQC contest.3
NIST has identified the following replacement algorithms (a brief, illustrative usage sketch follows the list):
- CRYSTALS-KYBER (key-establishment) and CRYSTALS-Dilithium (digital signatures): Both selected for their strong security and excellent performance.
- FALCON: For use cases where CRYSTALS-Dilithium signatures are too large.
- SPHINCS+: Standardized to avoid relying only on the security of lattices for signatures.
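For teams that want early, hands-on familiarity with these algorithms, the following is a minimal sketch of a CRYSTALS-KYBER key-encapsulation round trip. It assumes the open-source liboqs-python bindings (the `oqs` package) are installed and that your build exposes the `Kyber768` mechanism name; the package, mechanism name and method names are assumptions to verify against your installed version, not a statement of the standard’s API.

```python
# Minimal CRYSTALS-KYBER key-encapsulation sketch (illustrative only).
# Assumes the liboqs-python bindings are installed and built with the
# "Kyber768" mechanism enabled; newer builds may expose "ML-KEM-768" instead.
import oqs

KEM_ALG = "Kyber768"  # assumed mechanism name

with oqs.KeyEncapsulation(KEM_ALG) as receiver, oqs.KeyEncapsulation(KEM_ALG) as sender:
    # Receiver generates a key pair and publishes the public key.
    public_key = receiver.generate_keypair()

    # Sender encapsulates a shared secret against the receiver's public key.
    ciphertext, sender_secret = sender.encap_secret(public_key)

    # Receiver decapsulates the ciphertext to recover the same shared secret.
    receiver_secret = receiver.decap_secret(ciphertext)

    assert sender_secret == receiver_secret
    print(f"{KEM_ALG}: public key {len(public_key)} bytes, "
          f"ciphertext {len(ciphertext)} bytes, shared secret {len(sender_secret)} bytes")
```

Even this small exercise makes the larger key and ciphertext sizes noted in the Key Findings visible, which is useful input for early performance testing.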
As security and risk management leaders ponder their next steps, our research highlights the course of action they can take early in the timeline while planning for new cryptography algorithms. It’s too early to begin large-scale replacement, but we do want to be prepared for the changes and, where possible, use the new NIST-approved replacement algorithms for new business opportunities (see Figure 1).
Figure 1: NIST Standardization and Round 4 Selections
Security and risk management leaders need to begin planning for their move to postquantum cryptography now, due to the wide and deep impact of replacing cryptographically dependent systems.
Gartner defines three phases of activity over the next 10 years as quantum computing advances (see Figure 2).
Figure 2: Crypto-Agility Timeline
Analysis
Build an Inventory of Cryptographic Metadata Across Existing Applications
Treat the cryptographic technical debt raised by PQC like any other risk management exercise. Building a database of your existing cryptography is the first step toward an actionable transition plan to postquantum algorithms. It will be used to establish policy for data security and to enable decision making around algorithm replacement.
Identifying where cryptography is currently employed can be challenging. Organizations with sufficient development maturity can build a tool to do this. This usually involves identifying the sources where keys and certificates are created, which can serve as a basis for finding the bulk of cryptographic use. The remaining systems, such as unmanaged assets and specialty or custom-made systems, can then be found using methods that identify the sources and sinks of network packets containing relevant cryptographic data.
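As an illustration of what a homegrown discovery tool might look like, the sketch below walks a directory of PEM-encoded certificates and records algorithm, key size, signature hash and expiry into a simple metadata inventory. It relies on the widely available Python `cryptography` package; the directory path and the fields captured are illustrative assumptions, not a complete inventory design.

```python
# Sketch: build a basic cryptographic metadata inventory from PEM certificates.
# Requires the "cryptography" package (pip install cryptography).
import json
from pathlib import Path

from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import rsa, ec

CERT_DIR = Path("/etc/ssl/certs")  # illustrative path; point at your own stores

def describe(cert: x509.Certificate) -> dict:
    public_key = cert.public_key()
    if isinstance(public_key, rsa.RSAPublicKey):
        algorithm, key_size = "RSA", public_key.key_size
    elif isinstance(public_key, ec.EllipticCurvePublicKey):
        algorithm, key_size = f"EC/{public_key.curve.name}", public_key.curve.key_size
    else:
        algorithm, key_size = type(public_key).__name__, None
    return {
        "subject": cert.subject.rfc4514_string(),
        "issuer": cert.issuer.rfc4514_string(),
        "algorithm": algorithm,
        "key_size_bits": key_size,
        "signature_hash": getattr(cert.signature_hash_algorithm, "name", None),
        "not_valid_after": cert.not_valid_after.isoformat(),
    }

inventory = []
for pem_file in CERT_DIR.glob("*.pem"):
    try:
        cert = x509.load_pem_x509_certificate(pem_file.read_bytes())
        inventory.append({"file": str(pem_file), **describe(cert)})
    except ValueError:
        pass  # file is not a single PEM certificate; skip it in this sketch

print(json.dumps(inventory, indent=2))
```

The same record structure can be extended with ownership, application and data-sensitivity fields so it doubles as the metadata database described above.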
To aid this process, many vendors in the public key infrastructure (PKI) and cryptography space have put together discovery tools that can identify your organization’s uses of cryptographic algorithms. These tools also inventory key lengths, list life cycle information on certificates and store all this information as a metadata database. Generally, these vendors will also provide consulting and expertise to help you manage the items identified as technical debt.
A sample list of vendors, arranged alphabetically, includes:
- AppViewX
- Capgemini
- Cryptomathic
- Cryptosense (part of SandboxAQ)
- DigiCert
- Entrust
- EVERTRUST
- IBM
- InfoSec Global
- ISARA
- Keyfactor
- Protiviti
- QuantumXchange
- Senetas
- Thales
- Tychon
- Tyto Athene
- Utimaco
- Venafi
- Venari
Please note that this list is representative only. It is subject to change without notice, and Gartner does not endorse any of these vendors.
Systems that should be included in a scan include:
- Cloud providers: Key management systems, algorithms, configurations
- Applications: Cryptographic libraries, APIs, WAAPs, cryptographic bills of materials (a subset of some SBOMs)
- Databases and data repositories
- Operating systems: Libraries, key management systems, IAM systems
- Hardware: Network equipment (e.g., routers, firewalls), HSMs, libraries, firmware
- Certificates (including root and SSH certificates) across OSs and hardware
- Blockchains
- Signed code and signed documents
All tools, whether built or bought, operate on a “best effort” basis and will need to be run on an ongoing basis to identify systems that may not be continuously available.
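Network-facing systems can be sampled in the same best-effort way. The sketch below, which assumes Python’s standard `ssl` and `socket` modules plus the `cryptography` package, connects to a list of hostnames (the hosts shown are placeholders) and records the negotiated protocol version, cipher suite and server key details for the inventory.

```python
# Sketch: best-effort scan of TLS endpoints to capture negotiated crypto metadata.
# Uses only the standard library plus the "cryptography" package.
import socket
import ssl

from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import rsa, ec

HOSTS = ["intranet.example.com", "api.example.com"]  # placeholder host list
PORT = 443

context = ssl.create_default_context()

for host in HOSTS:
    try:
        with socket.create_connection((host, PORT), timeout=5) as sock:
            with context.wrap_socket(sock, server_hostname=host) as tls:
                cipher_name, tls_version, _ = tls.cipher()
                der_cert = tls.getpeercert(binary_form=True)
        cert = x509.load_der_x509_certificate(der_cert)
        public_key = cert.public_key()
        if isinstance(public_key, rsa.RSAPublicKey):
            key_info = f"RSA-{public_key.key_size}"
        elif isinstance(public_key, ec.EllipticCurvePublicKey):
            key_info = f"EC-{public_key.curve.name}"
        else:
            key_info = type(public_key).__name__
        print(f"{host}: {tls_version}, {cipher_name}, server key {key_info}")
    except (OSError, ssl.SSLError) as exc:
        print(f"{host}: unreachable or handshake failed ({exc})")
```

Results from scans like this will drift as systems come and go, which is why the scan should be scheduled, not run once.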
Engage Proactively With Vendors on PQC
Ask the vendors you have identified in your cryptographic inventory the following questions:
- By when do they plan to be quantum-safe?
- What does their overall roadmap look like?
- What changes in performance would you need to take into account after the transition?
Once you have your inventory in place, you can quickly identify vendor products and systems that have classical cryptographic dependencies. While your organization generally doesn’t upgrade those products itself, the inventory should be used to ask vendors when they expect to be PQC-ready and what impact that might have on performance, upkeep and cost.
Also, PQC-capable vendor products should start to appear in your cryptographic inventory once they are installed and available online. The inventory also serves as a “burn down list” for replacing vendor products that have classical dependencies with their PQC versions.
Working with vendors will also help with your time and cost estimates for the project. As vendors share their PQC plans with clients, planning, staging and delivery exercises become easier, which will help in budget forecasting. It’s unlikely that vendors will isolate just the PQC components in a separate SKU, so be sure to ask what other upgrades and services they plan to bundle with their PQC upgrade.
Create and Centralize Cryptographic Policies
Because cryptography is easy to use and reliable, only the largest, most regulated organizations generally have a centralized policy for its use. For most organizations today, decisions are made on an ad hoc basis by product owners, IT security, vendors or developers, with little input from experts beyond what might be needed for certain regulatory compliance.
This scattered approach to cryptography has allowed many outages and facilitated numerous attacks. It has been tolerable until now because there were no real long-term consequences beyond the occasional patch. However, with new algorithms coming soon, organizations need to put in additional effort to establish metadata around their use of cryptography so that it can be evaluated and, if necessary, replaced. To be completely clear, one of the working assumptions most SRM leaders hold about replacement cryptography isn’t true: many believe the new algorithms are simply drop-in replacements for existing cryptography and that things can be fixed through simple patching. This does not reflect reality in most cases.
None of the new postquantum algorithms is a drop-in replacement for existing cryptography.
All of the new algorithms have completely different properties from the ones they are replacing (for example, key sizes, key generation, key exchange, and encryption and decryption times).
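One way to make a centralized policy enforceable in code is to route signing (or encryption) calls through a thin abstraction layer keyed by policy names that the CCoE controls, so that swapping an algorithm becomes a configuration change rather than an application rewrite. The sketch below is a hypothetical illustration using classical algorithms from the Python `cryptography` package; a quantum-safe signer would be registered the same way once a vetted implementation is available.

```python
# Sketch: a minimal crypto-agility layer. Applications request policy names,
# not algorithms, so the central team can swap implementations in one place.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec, padding, rsa

def _rsa_3072_signer():
    key = rsa.generate_private_key(public_exponent=65537, key_size=3072)
    return lambda data: key.sign(data, padding.PKCS1v15(), hashes.SHA384())

def _ecdsa_p384_signer():
    key = ec.generate_private_key(ec.SECP384R1())
    return lambda data: key.sign(data, ec.ECDSA(hashes.SHA384()))

# Centrally managed policy: which implementation backs each policy name.
# A quantum-safe signer (e.g., ML-DSA/Dilithium) would be added here later.
SIGNER_REGISTRY = {
    "document-signing": _rsa_3072_signer,
    "api-tokens": _ecdsa_p384_signer,
}

def get_signer(policy_name: str):
    """Return a signing callable for a policy name defined by central policy."""
    return SIGNER_REGISTRY[policy_name]()

sign = get_signer("document-signing")
signature = sign(b"example payload")
print(f"signature length: {len(signature)} bytes")
```

The design choice here is that applications never name an algorithm directly; they name a purpose, and the mapping from purpose to algorithm lives in one governed registry.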
Where Appropriate, Plan to Extend Current Keys and Hashes
Given the rate of progress in quantum computing and the complexity of the move to new algorithms, it’s possible (at least for the next few years) to simply extend the keys and hashes of existing applications. This is particularly true for low- or medium-risk apps. Longer key sizes in classical cryptography will offer some protection for asymmetric cryptography and hashes, and are expected to offer good protection for symmetric keys going forward. For example, cryptographic algorithms such as AES-256 should be robust against quantum computers for at least the next decade or so. Though longer asymmetric keys will help to an extent, we expect the benefit to fade over time and potentially end abruptly by the close of the decade. At best, this delays a harvest now, decrypt later (HNDL) attack, so only use this technique if the data will not contain any sensitive material beyond the year 2029.
It’s also important to keep in mind that symmetric and asymmetric use cases are very different, and AES-256 is not an appropriate substitute for almost any RSA or ECC application. This guidance applies beyond just software and applications; it encompasses all uses of cryptography, including HSMs, FIDO tokens, smart cards, firmware, hardware and more.
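As a hedged illustration of this stop-gap, the snippet below generates a larger RSA key, signs with a longer hash and uses AES-256-GCM for data at rest, all via the Python `cryptography` package. The key lengths and algorithm choices shown are examples for discussion, not a recommendation for any specific application.

```python
# Sketch: interim hardening with larger classical parameters.
# Requires the "cryptography" package.
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Asymmetric: a larger key and longer hash buy limited time, not safety.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=4096)
signature = private_key.sign(
    b"interim-protected data",
    padding.PSS(mgf=padding.MGF1(hashes.SHA384()),
                salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA384(),
)

# Symmetric: AES-256 is expected to remain robust against quantum attacks
# for the foreseeable future (Grover's algorithm roughly halves effective strength).
key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
ciphertext = AESGCM(key).encrypt(nonce, b"data at rest", None)

print(f"RSA-4096/SHA-384 signature: {len(signature)} bytes; "
      f"AES-256-GCM ciphertext: {len(ciphertext)} bytes")
```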
Evaluate Your Data Retention Policies With the Cryptography Life Cycle in Mind
As seen in Figure 2, data retention life cycles will need to be reassessed in light of the significant changes expected over the next five years or so. Never before have organizations had to change so many data security parameters in such a short amount of time. This is why maintaining a database of your cryptography metadata will be a key requirement going forward. You’ll need to create and manage policies around algorithm substitution and data retention and, from a development point of view, assess how you’re going to replace or modify your existing use of cryptography.
Given the inflation in the sizes of ciphertexts, Gartner also recommends conducting a review of your data retention and encryption policies. In most cases, it’s not as simple as reencrypting everything with a new algorithm. You’ll need to take the lifetime of the data into account and plan for key lengths and algorithms that match the data lifetime against the expected progress of quantum computing.
In previous cryptographic migrations, Mosca’s theorem has proven to be a good planning guide. It states that the amount of time you need your encrypted data to remain secure (X), plus how long it will take to upgrade your systems to PQC (Y), must be less than the time it will take state actors to develop a cryptographically relevant quantum computer (Z).
X+Y < Z
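As a hypothetical worked example of the inequality, with all three figures assumed purely for illustration:

```python
# Hypothetical worked example of Mosca's inequality (X + Y < Z).
shelf_life_years = 10       # X: how long the data must stay confidential (assumed)
migration_years = 5         # Y: how long the PQC migration will take (assumed)
quantum_threat_years = 12   # Z: estimated years until a cryptographically
                            #    relevant quantum computer exists (assumed)

if shelf_life_years + migration_years < quantum_threat_years:
    print("Classical cryptography may suffice for this data for now.")
else:
    gap = shelf_life_years + migration_years - quantum_threat_years
    print(f"At risk: exposure window of roughly {gap} years; prioritize PQC migration.")
```

Here X + Y = 15 exceeds Z = 12, leaving roughly a three-year exposure window; since X and Z are largely outside your control, shortening Y by starting the migration earlier is usually the only practical lever.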
Applying this to existing data life cycles, Gartner recommends the following courses of action (see Table 1):
Table 1: Data Life Cycle Planning Guidance
| Data Life Expectancy | Action | Long-Term Plan |
| --- | --- | --- |
| Data will no longer be used by 2026 | Leave as is | End of life or reevaluate in 2026 |
| Data will be used in 2026, but will no longer be used by 2028 | Move to larger classical keys or hashes for the transition | Reevaluate in 2028 |
| Data will be used in 2027, but will no longer be used by 2030 | Evaluate PQC | Reencrypt or end of life |
| Data will be used beyond 2031 | Plan for PQC migration | Migrate to protect against harvest now, decrypt later (HNDL) attacks |
Currently, the most pressing threat to data security is the HNDL attack. In this scenario, the adversary, often a state actor, leverages a variety of conventional and advanced persistent threat (APT) techniques to exfiltrate sensitive data that will continue to have value more than five years into the future. These data can be documents, material non-public information (MNPI), private customer data, M&A plans, intellectual property, or any form of information that can be successfully leveraged in the future. While the data is usually encrypted at rest, copies are stored by the adversary with the goal of reading them when they have a quantum computer or other technique for decrypting the information. Information that has value only in the next few years is considered relatively safe (for example, session data protected in current versions of TLS), since it’s unlikely to be worth the wait.
Source: Gartner (July 2024)
Obviously, this will vary according to your specific needs, regulatory and compliance framework, and existing data-retention and privacy policies. This activity can go hand-in-hand with establishing a cryptographic center of excellence for creating and administering policies for data life cycle and cryptography.
Establish a Cryptographic Center of Excellence
The most effective way to manage and control the use of cryptography is to establish a single team with the expertise needed to set effective policy for the organization. The CCoE helps avoid single-point decisions about security technology. While decisions like this are familiar territory for SRM leaders, there can be expensive, unexpected consequences if they are mismanaged.
A CCoE is a way of combining many of the functions described above into one coherent system that allows consistency and scale across the enterprise.
Gartner recommends establishing a CCoE to:
- Establish a consistent cryptographic policy across the enterprise. This will answer questions about what to do with impacted systems and the best ways to manage their life cycle before, during and after the transition to PQC.
- Create and maintain a database of cryptographic metadata. The CCoE would maintain and update the database and track progress from a central location as systems move to PQC.
- Track vendors’ products for PQC updates and compatibility. Most organizations have multiple vendor products and the CCoE is a good place to maintain tracking on where they are during the transition.
- Work with in-house developers to remediate custom-made applications. Most developers are not cryptography experts, especially with PQC. The CCoE can take policy and craft it into workable, crypto-agile guidance for development teams updating applications.
- Interface with the cryptographic community. NIST and other national-level organizations have already announced more contests for PQC, and Gartner expects many new cryptographic algorithms and formats to be standardized in the next decade. The CCoE will act as a clearinghouse for this information, updating policy and providing guidance for the organization as new and better choices for PQC are vetted.