Most of today’s cryptographic applications, including fundamental ones such as the secure web, secure email, DRM and even Bitcoin, rely on a rather small set of well-established algorithms that could be broken by a future class of computers leveraging quantum technology.
Although practical quantum computers don’t exist yet, they are clearly on the horizon. Out of both curiosity and professional necessity as security experts in the content protection field, we have looked into the status, threats and remedies, and found that the battle between breaking crypto and hardening algorithms against this attack is in full force.
Today’s cryptography relies on a few data transformations that are used to secure communication:
- Secret-key (symmetric) encryption “obfuscates” data between parties using a previously and securely exchanged secret (e.g. a password); AES (the Advanced Encryption Standard) is the standard encryption algorithm for this.
- A hash function makes it easy to calculate a number (the hash) from a document, but hard to find a document that matches a given hash. SHA256 is an example of a hash function.
- Asymmetric-key algorithms use a key for encryption that is different from the key used for decryption. They are used both to communicate securely and to let the owner of a private key prove ownership without revealing it; RSA, for example, relies on the hardness of the Integer Factorization Problem.
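As a quick illustration of the one-way property of hash functions, the sketch below (Python, standard library only) computes a SHA256 digest; going forward from document to hash is instant, while inverting the function would require a brute-force search:

```python
import hashlib

# Computing a hash is easy and fast...
digest = hashlib.sha256(b"hello").hexdigest()
print(digest)  # 2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824

# ...but finding *any* input that maps to a given digest means searching an
# astronomically large space: there is no better generic method than trying
# candidate documents one by one.
```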
These algorithms base their security on a simple fact: the data is secure because trying all possible keys would take too much processing power, time, energy and money… with today’s computers.
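To make the “trying all possible keys” argument concrete, here is a toy sketch in Python (the 16-bit XOR “cipher” is purely illustrative, not a real algorithm) showing that an exhaustive search must visit up to 2^k candidates, doubling with every added key bit:

```python
def toy_encrypt(key: int, plaintext: int) -> int:
    """A deliberately weak 16-bit 'cipher' (plain XOR), used only to count work."""
    return plaintext ^ key

def brute_force(ciphertext: int, known_plaintext: int, key_bits: int):
    """Try every key in order; return (recovered_key, number_of_trials)."""
    for trials, key in enumerate(range(2 ** key_bits), start=1):
        if toy_encrypt(key, known_plaintext) == ciphertext:
            return key, trials
    raise ValueError("key not found")

secret_key = 0xBEEF
ct = toy_encrypt(secret_key, 0x1234)
key, trials = brute_force(ct, 0x1234, 16)
# Worst case is 2**16 trials for a 16-bit key; for a 128-bit key the same
# loop would need up to 2**128 trials, which is infeasible classically.
```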
The concept of a quantum computer was suggested around 1982, notably by the physicist Richard Feynman. Essentially, a quantum computer is a novel type of computing machine in which computations are governed by quantum-mechanical processes, allowing a “massive parallelism at the physical level” based on the superposition principle of quantum states. This would dramatically speed up algorithms that, on classical computers, take decades to run or would dissipate enormous heat, making the calculation unstable. One limiting factor of classical machines is that the clock speed of transistors cannot be raised beyond certain technical and physical limits. Another is the difficulty of densely packing gates in two- or three-dimensional structures, which limits the maximal “wiring” complexity of a circuit.
The idea behind quantum computers is the ability to run many operations in parallel, a “quantum leap” that enables an entirely new class of algorithms, one of which was proposed by Peter Shor in the mid-1990s. For now, this remains an idea: currently the main obstacle to building a sufficiently powerful quantum computer is providing fault-tolerant computation. Quantum power is measured not in bits but in qubits. A powerful system would need more than, say, 50 qubits, and 1000 qubits are completely out of reach at present. Such a quantum system has to cope with the fundamental phenomenon of quantum decoherence, which arises from the difficulty of shielding the internals of a quantum system against interaction with the physical environment, which could perturb the quantum states. Remarkably, the number of practical qubits is steadily increasing. A quantum computer operating with 16 qubits (see IBM_16_qubit_quantum) has already been built by a team of researchers at IBM, and announcements of 20-qubit products and a 50-qubit prototype followed shortly after. For more information, see also the links at the bottom of this post, which give an overview of IBM’s quantum computing development.
Whether a quantum computer capable of breaking, say, RSA-2048 (using Shor’s algorithm) will exist in the near future, no one currently knows.
But the most exciting applications of quantum computing are certainly not breaking crypto, but constructive ones that require extraordinary processing power, such as modelling a chemical reaction or discovering a drug by finding a molecule with specific properties.
How is this Attack Applied?
As mentioned above, there are two different classes of algorithms: symmetric-key algorithms and hash functions on the one hand, and asymmetric algorithms (e.g. RSA, ECDH) on the other.
Up to now, the main quantum computing threat to the security of symmetric-key algorithms and hash functions is the so-called Grover’s algorithm, a generic quantum search algorithm. It speeds up an exhaustive key search in a way that effectively halves the key length: the security of a block cipher like AES with a 128-bit key is reduced to only 64 bits. A similar speed-up also holds for pre-image search on generic hash functions (like SHA256). The security loss from quantum computing attacks can therefore be mitigated with a fairly easy fix: double the key size for symmetric ciphers and double the output length for one-way hash functions.
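The effect of Grover’s quadratic speed-up on key sizes can be sketched numerically; the Python lines below simply restate the square-root argument above:

```python
import math

def classical_search_cost(key_bits: int) -> int:
    """Exhaustive classical search over a 2**key_bits key space."""
    return 2 ** key_bits

def grover_search_cost(key_bits: int) -> int:
    """Grover's algorithm needs only about sqrt(N) quantum evaluations."""
    return math.isqrt(2 ** key_bits)

assert grover_search_cost(128) == 2 ** 64   # AES-128 drops to 64-bit security
assert grover_search_cost(256) == 2 ** 128  # doubling the key restores 128 bits
```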
For the commonly used asymmetric algorithms the situation is completely different: quantum computers will lead to an “exponential speed-up” compared to the known break/reversion algorithms when trying to determine the secret key, given the public key.
Up to now, the best known algorithm for factoring large integers is the General Number Field Sieve, which factors an integer N in expected time roughly exp(c · (log N)^(1/3) · (log log N)^(2/3)) for a constant c ≈ 1.9, and a similar bound also holds for the time complexity of the classical discrete logarithm problem in finite fields. Even harder is the algorithmic complexity of the Elliptic Curve Discrete Logarithm Problem (ECDLP): if the point group of the underlying elliptic curve (in which the discrete logarithm problem has to be solved) has order N, there is no known attack which runs in time substantially less than the square root √N. For a state-of-the-art 256-bit elliptic curve (meaning that the finite field defining the curve has about 2^256 elements) the attack complexity (or attack run time) is about 2^128. An overview of the “performance gain” that quantum computers will achieve on classical algorithmic problems from mathematics, computer science and cryptography can be found under Quantum_Algorithm_Zoo.
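For a rough feel of these complexity figures, the sketch below (Python; the constant c ≈ 1.923 is the usual heuristic GNFS constant, and the results are order-of-magnitude estimates only) converts the sub-exponential GNFS running time and the square-root ECDLP attack into approximate “bits of security”:

```python
import math

def gnfs_security_bits(modulus_bits: int) -> float:
    """Heuristic GNFS cost exp(c * (ln N)^(1/3) * (ln ln N)^(2/3)), in bits."""
    ln_n = modulus_bits * math.log(2)
    cost = 1.923 * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3)
    return cost / math.log(2)

def ecdlp_security_bits(curve_bits: int) -> float:
    """Generic square-root attack on a group of order about 2**curve_bits."""
    return curve_bits / 2

print(round(gnfs_security_bits(2048)))  # roughly 110-120 bits for RSA-2048
print(ecdlp_security_bits(256))         # 128.0 for a 256-bit curve
```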
In theory, a quantum algorithm like Shor’s simply removes the time-intensive calculations that protect today’s asymmetric cryptography, breaking all RSA-based schemes as well as classical ElGamal public-key encryption and Diffie-Hellman key agreement. On the other hand, attacking large keys requires a powerful quantum computer, and it is not at all clear at the moment whether a machine with, say, more than 1000 qubits capable of running Shor’s algorithm will ever exist.
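To illustrate what Shor’s algorithm actually computes, the sketch below (Python) runs the classical reduction from factoring to order-finding, with the order found by brute force; the quantum computer’s only job is to perform that period search exponentially faster:

```python
import math
import random

def shor_classical(N: int):
    """Factor N via order-finding, simulating the quantum step classically."""
    while True:
        a = random.randrange(2, N)
        g = math.gcd(a, N)
        if g > 1:                     # lucky guess already shares a factor
            return g, N // g
        # Find the order r of a mod N by brute force -- this is the step a
        # quantum computer performs in polynomial time in Shor's algorithm.
        r, x = 1, a % N
        while x != 1:
            x = (x * a) % N
            r += 1
        if r % 2 == 0:
            y = pow(a, r // 2, N)
            p = math.gcd(y - 1, N)
            if 1 < p < N:
                return p, N // p      # non-trivial factors recovered

print(sorted(shor_classical(15)))  # [3, 5]
```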
How is this Going to Hurt?
We at Verimatrix use these algorithms to protect studio content and operators’ revenue, and however important that seems to us, we also realize that this is not the biggest damage that breaking encryption would cause.
One of the bigger threats for us stems from the fact that not only future communication is at risk once quantum computers exist, but also today’s communication. Specifically, information encrypted today and transmitted over the internet can be stored (or harvested) now and decrypted later, if and when the ability becomes available.
It seems that now is the time to track this progress and to protect today’s data with a vision that extends as far into the future as your security requirements do. Looking into this future means envisioning what is called Post-Quantum Cryptography, a term introduced by the cryptographer Daniel Bernstein. It refers to a rapidly growing research area of cryptography whose goal is to develop cryptographic algorithms and security protocols that stay safe even if powerful quantum computers are built. To maintain the security of the current crypto infrastructure once quantum computers are available, a portfolio of algorithms should already be well-established (ready for operational use), allowing a smooth migration of the currently used crypto protocols to quantum-safe versions. The project page PQCrypto, devoted to post-quantum cryptography, collects some of the proposed algorithms and also lists many research papers in the field.
Who Can Save Crypto from Quantum Computing?
Aware of the looming sea change and the need to prepare, several groups aim to identify useful and safe algorithms and ready cryptography for the post-quantum era. One of them is the Open Quantum Safe Project, which provides an open research platform to promote the development of future quantum-safe algorithms, evaluating aspects like security, integration into established security protocols, and implementation and computation efficiency. This includes security proofs for the proposed algorithms, analysis of the time it would take to break them, and testing implementations of the schemes on various platforms to assess real-world suitability.
Currently there are several different proposals, mainly for unauthenticated raw key exchange (roughly comparable to “raw” Diffie-Hellman key exchange) and for public-key encryption. Still missing from the portfolio are “true” digital signature schemes (comparable to ECDSA in elliptic curve cryptography), which will probably be added in the near future. One of the current proposals submitted to the Open Quantum Safe Project (listed under Kyber in the table below) also contains a construction for authenticated (i.e. digitally signed) key exchange, and several further candidates are under development (e.g. the schemes BLISS and ring_TESLA). It should be noted that quite mature post-quantum signature schemes based on hash functions already exist (so-called hash-based signature schemes built on hash trees, like XMSS and the recent proposal SPHINCS; see e.g. PQHashSignatures or XMSS_SPHINCS_Sketch for rough descriptions of these schemes).
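Hash-based signatures are simple enough to sketch directly. Below is a toy Lamport one-time signature in Python (standard library only); it is the simplest building block behind tree-based schemes like XMSS and SPHINCS, and an illustration rather than production code:

```python
import hashlib
import secrets

def H(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def keygen():
    """256 pairs of random secrets; the public key is their hashes."""
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def msg_bits(msg: bytes):
    digest = int.from_bytes(H(msg), "big")
    return [(digest >> i) & 1 for i in range(256)]

def sign(sk, msg: bytes):
    """Reveal one secret per message-hash bit -- hence strictly one-time use."""
    return [sk[i][bit] for i, bit in enumerate(msg_bits(msg))]

def verify(pk, msg: bytes, sig) -> bool:
    return all(H(sig[i]) == pk[i][bit] for i, bit in enumerate(msg_bits(msg)))

sk, pk = keygen()
sig = sign(sk, b"quantum-safe")
assert verify(pk, b"quantum-safe", sig)
assert not verify(pk, b"tampered", sig)
```

Security rests only on the one-wayness of the hash function, which Grover’s algorithm merely weakens rather than breaks; the price is large keys and signatures and the one-time restriction, which XMSS/SPHINCS-style trees lift.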
Below is a list of promising candidates submitted to the Open Quantum Safe Project:
| Scheme | Purpose | Hard algorithmic problem | Authors |
|---|---|---|---|
| NewHope | (Raw) key exchange | RLWE (Ring Learning With Errors) | Erdem Alkim, Léo Ducas, Thomas Pöppelmann, Peter Schwabe |
| BCNS15 | (Raw) key exchange | RLWE (Ring Learning With Errors) | Joppe W. Bos, Craig Costello, Michael Naehrig, Douglas Stebila |
| Frodo | (Raw) key exchange | LWE (Learning With Errors) | Joppe Bos, Craig Costello, Léo Ducas, Ilya Mironov, Michael Naehrig, Valeria Nikolaenko, Ananth Raghunathan, Douglas Stebila |
| Kyber | Several (public-key encryption, authenticated key exchange) | MLWE (Module Learning With Errors) | Joppe Bos, Léo Ducas, Eike Kiltz, Tancrède Lepoint, Vadim Lyubashevsky, John M. Schanck, Peter Schwabe, Damien Stehlé |
| NTRU | Public-key encryption (NTRUEncrypt), digital signature (NTRUSign) | CVP (closest vector problem in lattice theory) | Jeffrey Hoffstein, Jill Pipher, Joseph H. Silverman (patent-pending until 2021, owned by NTRU); new variant by Damien Stehlé and Ron Steinfeld |
| SIDH | (Raw) key exchange | Isogenies of super-singular elliptic curves: finding isogenies between super-singular curves | Luca De Feo, David Jao, Jerome Plut |
| McBits | Public-key encryption | Coding theory (error-correcting codes): decoding “noisy” code vectors in a binary code | Daniel J. Bernstein, Tung Chou, Peter Schwabe |
The first four schemes listed above, NewHope, BCNS15, Frodo and Kyber, are all based on recent constructions from an area of cryptography known as “lattice cryptography” (see the Wikipedia article on lattice cryptography), which can be traced back to late-nineteenth-century mathematics (an area today known as the Geometry_Of_Numbers). In particular, the LWE (Learning With Errors), RLWE (Ring Learning With Errors) and MLWE (Module Learning With Errors) problems mentioned in the table are different basic algorithmically hard problems in this area that are assumed to stay hard even for powerful quantum computers (they are thus considered “quantum-safe”). The last scheme, McBits, is a modern variant of a public-key encryption algorithm based on coding theory, invented by Robert McEliece in 1978 and known as the McEliece_cryptosystem.
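To give a flavour of the LWE problem that underpins these lattice schemes, here is a toy Regev-style single-bit encryption in Python (parameters chosen only so that decryption is always correct; an illustration of the idea, not a secure implementation):

```python
import random

# Toy LWE parameters: modulus q, secret dimension n, number of samples m.
# Error magnitude <= 4 over at most m = 32 samples keeps the accumulated
# noise (<= 128) well below q/4, so decryption never fails here.
q, n, m = 12289, 16, 32

def keygen():
    s = [random.randrange(q) for _ in range(n)]                  # secret vector
    A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]
    e = [random.randrange(-4, 5) for _ in range(m)]              # small errors
    b = [(sum(A[i][j] * s[j] for j in range(n)) + e[i]) % q for i in range(m)]
    return s, (A, b)   # recovering s from (A, b) is the LWE problem

def encrypt(pk, bit):
    A, b = pk
    subset = [i for i in range(m) if random.random() < 0.5]      # random rows
    u = [sum(A[i][j] for i in subset) % q for j in range(n)]
    v = (sum(b[i] for i in subset) + bit * (q // 2)) % q
    return u, v

def decrypt(s, ct):
    u, v = ct
    d = (v - sum(u[j] * s[j] for j in range(n))) % q
    return 1 if q // 4 < d < 3 * q // 4 else 0   # round away the noise

s, pk = keygen()
assert decrypt(s, encrypt(pk, 0)) == 0
assert decrypt(s, encrypt(pk, 1)) == 1
```

Without the small errors e, the public key would be a plain linear system solvable by Gaussian elimination; it is precisely the added noise that makes recovering s hard, and this hardness is conjectured to survive quantum attacks.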
What Does this Mean for Us?
It seems that cryptographers ‘have it covered’, and while this may be true to a large extent, a lot of software will have to be rewritten and many concepts reconsidered when this computing revolution becomes reality. This concerns us at Verimatrix, who must maintain state-of-the-art content security, as well as the many others who rely on the technology to keep their data, identity and communication confidential. Though this may only become relevant at some more or less distant time in the future, it is already a reasonable assumption that any data encrypted now can be stored and kept to be decrypted at some point. For a lot of information this may not matter; for content protection, for example, it may be acceptable that the protection level decreases over time, as the value of the content itself does too. But some information needs to stay confidential for a much longer time and should be treated as such…
For more information, see also a 1996 article by David DiVincenzo entitled “Topics in Quantum Computers”, which contains a list of “five engineering design requirements” for building a larger-scale quantum computer: http://arxiv.org/pdf/cond-mat/9612126v2.pdf