PQC Drama: Have Kyber and Dilithium been broken?
What happened. Why it's important but not critical. What to do next. Plus a significant last-minute update.
Scroll to bottom for TL;DR
What’s happened?
There's been excitement and some anguish in the Post-Quantum Cryptography (PQC) realm since Yilei Chen, from Tsinghua University, posted a pre-print on the IACR ePrint server with the toned-down title "Quantum Algorithms for Lattice Problems". Under this academic-looking title lies what some see as a new "Shor moment" for cryptography: a claim that a (quantum) algorithm exists that, when run on a sufficiently powerful quantum computer, would be able to solve Learning With Errors (LWE) problems in polynomial time, in contrast to every other algorithm known so far, which requires exponential time.
Why is this important?
The computational hardness of solving LWE problems with either classical or quantum computers underpins the security of a family of cryptographic algorithms, dubbed "lattice-based algorithms", that NIST has pre-selected for standardization for key encapsulation (KEM) and digital signatures. The current timeline indicates that NIST will publish these standards by mid-2024.
If Chen's result is confirmed, it would mean that lattice-based PQC algorithms are not as robust against quantum attacks as previously thought. Indeed, the 'acceleration' from exponential to polynomial resolution time is qualitatively comparable to that offered by Shor's algorithm against the integer factorization problem (which underpins the security of RSA) and the discrete logarithm problem (which underpins Elliptic Curve Cryptography, ECC). Given that the quantum threat posed by Shor's algorithm against RSA and ECC is largely what led NIST to launch its PQC competition in 2016 in the first place, to seek substitutes for these ciphers, this could be a big deal in cryptography.
Moreover, there is no reason to think that Chen’s algorithm is the last word. On the contrary, it could well be an inspiration for cryptologists and quantum-algorithm developers to come up with even better algorithms, able to break lattice-based encryption with fewer resources.
Is this urgent? No
The points above may make you think that something critical has happened, and wonder if you should take any action. If you're thinking that, you're not alone. Some specialist voices have even wondered whether NIST should delay the much-anticipated official standardization of lattice-based algorithms for key establishment and digital signature applications until the true extent of this threat has been analysed. NIST has said they do not want to delay the standardization process. Is that the right decision? I think so. Here's why.
As Chen explicitly discusses in his work, the potential attack would still require an advanced quantum computer (a 'cryptographically relevant quantum computer', or CRQC), and no such machine currently exists.
If you think one needs to start taking action well before CRQCs are on the market (e.g., because of the 'harvest now, decrypt later' risk, or Mosca's theorem), you're absolutely right, as I discussed in my earlier post, "2024: A quantum leap year for Cybersecurity". (If you didn't have that in mind, now's the time to check that post!)
But Chen's algorithm running on a fully fledged CRQC would still take considerable time to run, roughly scaling like n^4.5 to solve an LWE problem in an n-dimensional lattice. In practice, this means that you could still protect your data with LWE-based algorithms for as long as you wish by setting a (very) long key, much as we have kept increasing the size of RSA and ECC keys in the past (remember when NIST recommended moving from 1024-bit to 2048-bit RSA keys in NIST SP 800-57 Part 3 rev. 1?).
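To get a feel for what that scaling means, here is a toy back-of-the-envelope comparison (a sketch, not a security estimate): the n**4.5 curve mirrors the scaling quoted above, while the 2**(0.292*n) curve is a commonly cited heuristic for classical lattice-sieving attacks, used here purely as an illustrative baseline. The dimensions chosen are simply round numbers in the general ballpark of lattice schemes.

```python
# Toy comparison (illustrative only): a polynomial-time attack scaling like
# n**4.5 versus the exponential cost heuristically assumed for the best known
# classical lattice attacks (~2**(0.292*n), a commonly quoted sieving figure).
# Neither curve is a precise security estimate.
import math

def polynomial_attack_cost(n: int) -> float:
    """Hypothetical Chen-style attack cost, ~ n**4.5."""
    return n ** 4.5

def exponential_attack_cost(n: int) -> float:
    """Heuristic classical lattice-sieving cost, ~ 2**(0.292*n)."""
    return 2 ** (0.292 * n)

for n in (256, 512, 768, 1024):  # illustrative lattice dimensions
    print(f"n={n:4d}  polynomial ~2^{math.log2(polynomial_attack_cost(n)):5.1f}"
          f"  exponential ~2^{math.log2(exponential_attack_cost(n)):6.1f}")
```

The point of the comparison is qualitative: even under the claimed attack, the cost still grows with the lattice dimension, which is why increasing parameters would remain a (costly) stopgap, much like growing RSA key sizes did.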
Lattice-based algorithms are not all we have at hand. While NIST plans to standardize only one PQC algorithm for key establishment (CRYSTALS-Kyber, a.k.a. ML-KEM), and this happens to be lattice-based, it is not the only kind of PQC algorithm on the table: there are hash-based, code-based, multivariate-based, isogeny-based, and symmetric-key algorithms out there too. NIST has been fairly transparent all along that it would be unwise to put all one's eggs in a single basket, being mindful that one of the (pre-)standardized algorithms could be shown to have a vulnerability. That this has happened now doesn't mean one should stop the migration process.
What should I do? Crypto-agility is the new black
This brings me to the key consideration: what could happen next and how do you best prepare? What actions should you take, if any? Let us put on a foresight hat and consider three potential future scenarios:
Best case
Someone quickly finds an error in Chen's work, LWE is again (believed to be) quantum-hard and, hence, lattice-based cryptography is again (believed to be) quantum-safe. In this case, there's no reason for NIST not to standardize Kyber/ML-KEM and Dilithium/ML-DSA; for network and cybersecurity vendors not to work on including them as options in their products as soon as possible (allowing them to comply with the CNSA 2.0 advisory); nor for end users not to consider adopting them as part of their suite of crypto-algorithms looking ahead.
Probable case
Chen's findings are confirmed and lattice-based cryptography is understood to have some vulnerability to quantum attacks. This means LWE-based algorithms are no longer quantum-safe, but organizations that have started critically analysing their cryptographic usage and needs, and possibly even upgraded their cybersecurity requirements and policies, have wasted neither time nor resources. In fact, they are well placed to pivot to alternative PQC solutions: the work they have done identifying and categorising their cybersecurity needs, inventorying their cryptographic solutions, and building a cryptographically agile framework puts them in an excellent position to mitigate any risks; certainly a much better position than that of organizations that have kept a rigid attachment to good ol' ciphers like RSA and ECC.
Worst case
Building on Chen’s work or otherwise, an algorithm is found that is able to break lattice-based cryptography with very limited computing resources (say, one that scales linearly with the length of the key and requires relatively few qubits).
Now, this potential 'worst' future has always been in the community's mind, as one would expect from cybersecurity experts. And that is why a broad range of solutions has been, and continues to be, under consideration (cf. NIST's call for additional algorithms).
Again, organizations that have analysed their cryptographic usage and needs, updated their internal policies and standards to be more cryptographically agile ('crypto-agile'), upgraded their network infrastructure as needed, and that maintain a fluent relationship with their network and cybersecurity providers, are in a good position to keep their data and communications secure.
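For the more hands-on reader, here is a minimal sketch of what crypto-agility can look like in code, under plenty of assumptions: the application talks to an abstract KEM interface and the concrete algorithm is picked from configuration, so swapping a lattice-based KEM for, say, a code-based one is a policy change rather than a rewrite. All names here (KEM, DummyKEM, KEM_REGISTRY) are hypothetical, and DummyKEM is a non-functional placeholder standing in for a real, vetted implementation.

```python
# A minimal crypto-agility sketch (hypothetical names throughout).
# The application depends only on the KEM interface; the concrete algorithm
# is selected from configuration via a registry.
from abc import ABC, abstractmethod
import os

class KEM(ABC):
    """Interface any key-encapsulation mechanism must implement."""
    @abstractmethod
    def generate_keypair(self) -> tuple[bytes, bytes]: ...
    @abstractmethod
    def encapsulate(self, public_key: bytes) -> tuple[bytes, bytes]: ...
    @abstractmethod
    def decapsulate(self, secret_key: bytes, ciphertext: bytes) -> bytes: ...

class DummyKEM(KEM):
    """NOT a real KEM: random placeholder values only, to show the plumbing.
    In practice this would wrap a vetted library implementation."""
    def generate_keypair(self):
        return os.urandom(32), os.urandom(32)   # (public key, secret key)
    def encapsulate(self, public_key):
        return os.urandom(32), os.urandom(32)   # (ciphertext, shared secret)
    def decapsulate(self, secret_key, ciphertext):
        return os.urandom(32)                   # shared secret

# Registry keyed by a policy string: changing the deployed algorithm means
# changing configuration, not application code.
KEM_REGISTRY: dict[str, type[KEM]] = {
    "lattice-placeholder": DummyKEM,
    "code-based-placeholder": DummyKEM,
}

def kem_from_policy(name: str) -> KEM:
    return KEM_REGISTRY[name]()

kem = kem_from_policy("lattice-placeholder")    # value read from config/policy
public_key, secret_key = kem.generate_keypair()
ciphertext, shared_secret = kem.encapsulate(public_key)
```

The design choice is the point: if the algorithm name lives in policy rather than in application code, pivoting away from a broken scheme becomes a configuration exercise, which is exactly the flexibility the scenarios above call for.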
Summary — TL;DR
A recent pre-print by Yilei Chen suggests lattice-based cryptography could be vulnerable to attacks by cryptographically relevant quantum computers. If confirmed, this would mean lattice-based algorithms like CRYSTALS-Kyber/ML-KEM (FIPS 203) and CRYSTALS-Dilithium/ML-DSA (FIPS 204) are no longer fully quantum-safe.
NIST is not planning to delay the final publication of the ML-KEM and ML-DSA standards. This is an important decision to properly kick-start the roll-out of certification procedures and the arrival of quantum-safe solutions on the market, so I agree with NIST's position.
As an organization, you should not delay taking steps to become cryptographically agile. First steps should include:
Analyse your current cryptographic usage and needs, bearing in mind the lifetime value of the data you wish to protect (a minimal inventory sketch follows this list).
Update your internal policies and standards to be cryptographically agile ('crypto-agile').
Discuss with your current network and cybersecurity providers their plans to roll out quantum-safe products and solutions. Consider talking with other vendors too, to get a broader view, e.g., by attending events like the "Quantum Computing for Telecommunications" session at RSA Conference 2024.
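As a concrete starting point for the first step above, here is a minimal inventory sketch. It assumes your certificates sit as PEM files under a hypothetical ./certs directory and that the third-party cryptography package is installed; it simply tallies which public-key algorithms and key sizes are in use.

```python
# Minimal certificate inventory sketch: tally public-key algorithms in use.
# Assumes PEM certificates under ./certs (hypothetical path) and the
# third-party 'cryptography' package (pip install cryptography).
from collections import Counter
from pathlib import Path

from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import ec, rsa

inventory = Counter()
for pem_file in Path("certs").glob("*.pem"):
    cert = x509.load_pem_x509_certificate(pem_file.read_bytes())
    key = cert.public_key()
    if isinstance(key, rsa.RSAPublicKey):
        inventory[f"RSA-{key.key_size}"] += 1
    elif isinstance(key, ec.EllipticCurvePublicKey):
        inventory[f"ECC-{key.curve.name}"] += 1
    else:
        inventory[type(key).__name__] += 1   # e.g., Ed25519PublicKey, DSAPublicKey

for algorithm, count in inventory.most_common():
    print(f"{algorithm}: {count} certificate(s)")
```

A real inventory will also need to cover TLS configurations, code signing, VPNs, stored data and third-party services, but even a simple tally like this quickly shows where RSA and ECC are still carrying the load.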
If you're totally lost regarding cryptographic agility, I suggest you start by checking out this excellent Guidance by the Canadian Centre for Cyber Security.
Last minute update
PS: On 18th April, Chen published an update to his paper following communications by two researchers pointing to a bug in his derivation (my boldface):
Update on April 18: Step 9 of the algorithm contains a bug, which I don’t know how to fix... I sincerely thank Hongxun Wu and (independently) Thomas Vidick for finding the bug today.
Now the claim of showing a polynomial time quantum algorithm for solving LWE with polynomial modulus-noise ratios does not hold. I leave the rest of the paper as it is... as a hope that ideas like Complex Gaussian and windowed QFT may find other applications in quantum computation, or tackle LWE in other ways.
It follows that, at least for now, we're in the 'best case' situation and LWE-based algorithms are still (believed to be) quantum-safe. I would like to praise Chen's transparency and adherence to science's best practices in keeping the rest of the pre-print available as a source of ideas for other researchers.