Protecting the Internet in the Quantum Age – Part 1
19/09/2024
By Pablo Casal, Netlabs Co-founder & CEO
Introduction
One of the first questions that comes to mind when talking about quantum computers is this: Is it true that Internet security will no longer exist? While the quick answer is ‘no’, the full answer is not so simple. It is clear, however, where this concern comes from. In 1994, American mathematician Peter Shor demonstrated that his algorithm, running on a quantum computer, could find the prime factors of an integer in polynomial time.
Although classical computers have great processing and computational power, certain problems remain beyond their reach, such as quickly factoring very large numbers. Many cryptographic systems leverage this difficulty to protect highly sensitive information on the Internet, for example, our bank accounts.
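To make the threat concrete, the following is a minimal, purely illustrative Python sketch of the classical reduction at the heart of Shor’s algorithm: factoring n reduces to finding the multiplicative order of a random base modulo n. The brute-force find_order below is exactly the step that the quantum part of the algorithm performs in polynomial time.

```python
import math
import random

def find_order(a: int, n: int) -> int:
    """Smallest r > 0 with a^r = 1 (mod n): exponential on classical hardware."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def factor_via_order(n: int) -> tuple[int, int]:
    """Shor's classical wrapper: reduce factoring n to order finding."""
    while True:
        a = random.randrange(2, n)
        g = math.gcd(a, n)
        if g > 1:                        # lucky guess already shares a factor
            return g, n // g
        r = find_order(a, n)             # the step a quantum computer speeds up
        if r % 2 == 0:
            p = math.gcd(pow(a, r // 2, n) - 1, n)
            if 1 < p < n:
                return p, n // p         # otherwise retry with another base

print(factor_via_order(3233))            # 3233 = 53 * 61
```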
And while quantum computers are currently in their infancy, it is very likely that the day will come when they become powerful and stable enough to achieve this.
Although a quantum computer with the required size and stability is still 10 to 30 years away, work has already begun on alternatives to traditional algorithms. These alternatives are known as post-quantum (PQ) algorithms. One of the organizations leading the standardization of PQ algorithms is the National Institute of Standards and Technology (NIST). On 13 August 2024, NIST published its first three finalized PQ standards, the outcome of a multi-round selection process that began in 2016. The selected algorithms are those derived from CRYSTALS-Kyber (FIPS 203), CRYSTALS-Dilithium (FIPS 204), and SPHINCS+ (FIPS 205).
In the parallel universes interpretation, Shor’s algorithm works because the elements that model quantum states interfere with one another, allowing all possible solutions to be calculated simultaneously. (Image: DALL-E)
Description of the Vulnerabilities
The Transport Layer Security (TLS) protocol is one of the most widely used security protocols on the Internet today.
It is the security behind the padlock we see on every secure webpage we visit.
TLS protects every exchange of information between servers and their users. While highly secure in the current context dominated by classical computers, the asymmetric cryptography used in TLS is vulnerable to potential attacks from future quantum computers.
Both RSA and ECDH (Elliptic Curve Diffie-Hellman), the algorithms typically responsible for establishing the master secret from which, for instance, the symmetric encryption keys are derived, would be vulnerable to Shor’s algorithm if it were run on a sufficiently large and stable quantum computer. The same applies to the authentication phase, when either the server or the client proves its identity to the other using Public Key Infrastructure (PKI), a process that currently involves signature algorithms such as RSA or ECDSA.
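To ground this, here is a minimal sketch of the kind of ephemeral ECDH exchange a TLS handshake performs today, written with the pyca/cryptography package (the curve and KDF parameters are illustrative choices, not a prescription). Only the public keys travel over the wire, and those are precisely what a future quantum computer running Shor’s algorithm would need.

```python
# Ephemeral ECDH as used in a TLS 1.3-style handshake. The curve (P-256)
# and KDF parameters are illustrative assumptions for this sketch.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each party generates an ephemeral key pair for this session.
client_key = ec.generate_private_key(ec.SECP256R1())
server_key = ec.generate_private_key(ec.SECP256R1())

# Only the public halves are exchanged; a quantum adversary who records
# them could later solve the discrete logarithm and recover the secret.
client_secret = client_key.exchange(ec.ECDH(), server_key.public_key())
server_secret = server_key.exchange(ec.ECDH(), client_key.public_key())
assert client_secret == server_secret

# A KDF then turns the raw shared secret into symmetric session keys.
session_key = HKDF(
    algorithm=hashes.SHA256(), length=32, salt=None, info=b"tls sketch",
).derive(client_secret)
print(session_key.hex())
```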
Symmetric cryptography (the algorithms used to encrypt communication in TLS once the parties have authenticated themselves with their digital signatures and negotiated a symmetric key) remains relatively unaffected. The relevant quantum attack here was proposed by Lov Grover, who proved in 1996 that his algorithm could brute-force a search space of N possibilities in on the order of the square root of N steps, a quadratic speedup. In other words, a 256-bit key would still require approximately 2^128 iterations. Thus, to maintain security in symmetric encryption, the solution would be to increase the key size used with AES and the output size of hash functions such as SHA-2.
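The arithmetic behind that claim is easy to check; a minimal snippet (plain Python, just for illustration) makes the halving of the effective security level explicit:

```python
import math

# Grover's search cuts brute force on a k-bit key from 2^k tries to
# roughly sqrt(2^k) = 2^(k/2) quantum iterations.
for bits in (128, 256):
    iterations = math.isqrt(2 ** bits)        # exact: 2^(bits/2)
    print(f"{bits}-bit key: ~2^{iterations.bit_length() - 1} Grover iterations")
# Output:
# 128-bit key: ~2^64 Grover iterations
# 256-bit key: ~2^128 Grover iterations
```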
A First Practical Approach to the Challenge
Beyond purely theoretical considerations, a more practical analysis of the concrete technological measures needed to transition to this new reality would probably reveal the following:
New quantum-resistant PQ algorithms are required, particularly for authentication and key negotiation. The certificates themselves are not vulnerable: the X.509 format and the TLS protocol could continue to be used. In principle, only the asymmetric algorithms, that is, key exchange and digital signatures, would need to be changed.
NIST and other community initiatives are working on standardizing these quantum-resistant algorithms, which should make certificates that use them available in the near future. These will be known as Quantum-Resistant Certificates.
A concept increasingly worth becoming familiar with is crypto-agility. An infrastructure is considered crypto-agile if there is a detailed catalog of its cryptographic components, if its encryption algorithms can be easily replaced, and if this replacement process is at least partially automated.
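As a hypothetical illustration (the catalog structure and algorithm entries below are invented for the example), an application that resolves its cryptographic schemes through a registry rather than hard-coding them satisfies all three conditions in miniature: the catalog is enumerable, and replacing a vulnerable scheme is a one-line configuration change.

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical registry: the names and entries here are invented.
@dataclass
class SignatureScheme:
    name: str
    quantum_safe: bool
    sign: Callable[[bytes, bytes], bytes]   # (key, message) -> signature

CATALOG: dict[str, SignatureScheme] = {}

def register(scheme: SignatureScheme) -> None:
    CATALOG[scheme.name] = scheme

def inventory() -> list[str]:
    """The 'detailed catalog' an audit needs: what runs, and is it PQ?"""
    return [
        f"{s.name}: {'quantum-resistant' if s.quantum_safe else 'quantum-vulnerable'}"
        for s in CATALOG.values()
    ]

# Replacing ECDSA with a PQ scheme becomes a registry update, not a rewrite.
register(SignatureScheme("ecdsa-p256", False, lambda key, msg: b"<stub>"))
register(SignatureScheme("ml-dsa-65", True, lambda key, msg: b"<stub>"))
print(inventory())
```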
The challenge lies in estimating the effort needed to migrate the existing infrastructure of the Internet to this new Quantum-Resistant Certificate format. How would this be handled? Would the entire Internet be shut down for a week and then switched back on once all certificates had been migrated? What if half the systems then failed to connect due to errors? What if, after switching everything back on, half the websites and services no longer worked?
At first glance, this ‘Big Bang’ approach seems far from ideal. Changing overnight an entire Internet security infrastructure that took more than 25 years to build doesn’t seem possible. However, it is clear that the more crypto-agile the infrastructure is, the easier and more reliable this transition will be.
In some ways, this situation is similar to what we continue to experience with IPv4. The migration to IPv6 did not happen overnight. Instead, it is still happening today and will likely continue in the future.
As with the migration to IPv6, the solution seems to lie in supporting both stacks. Today, modern operating systems can handle both IPv4 and IPv6 connections, so a device initiating a connection can reach the server whether it supports only IPv4, only IPv6, or both.
Similarly, it would be desirable to have a temporary transition scenario at the certificate and Internet-security level, where the use of PQ algorithms is not required to connect to a remote server. If both client and server support PQ algorithms, a quantum-resistant, secure connection to a service or website would be established. If not, the connection would still be possible using traditional, quantum-vulnerable algorithms, which should probably (and ideally) trigger a warning.
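A hypothetical sketch of that negotiation logic (the group names echo current hybrid key-exchange conventions for TLS but are assumptions here, not a standard) could look like this:

```python
import warnings

# Key-exchange groups each side advertises. "X25519MLKEM768" stands in
# for a hybrid classical+PQ group; treat these names as illustrative.
PQ_GROUPS = {"X25519MLKEM768"}
CLASSICAL_GROUPS = {"x25519", "secp256r1"}

def negotiate(client: set[str], server: set[str]) -> str:
    """Prefer a quantum-resistant group; fall back with a warning."""
    common = client & server
    if common & PQ_GROUPS:
        return (common & PQ_GROUPS).pop()
    if common & CLASSICAL_GROUPS:
        warnings.warn("Falling back to a quantum-vulnerable key exchange")
        return (common & CLASSICAL_GROUPS).pop()
    raise ConnectionError("no key-exchange group in common")

# A legacy client still connects; a PQ-capable pair gets the PQ group.
print(negotiate({"x25519"}, PQ_GROUPS | CLASSICAL_GROUPS))
print(negotiate(PQ_GROUPS | {"x25519"}, PQ_GROUPS | CLASSICAL_GROUPS))
```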
The views expressed by the authors of this blog are their own and do not necessarily reflect the views of LACNIC.