The Impact of Quantum Computing on Present Cryptography
The aim of this paper is to elucidate the implications of quantum computing for present cryptography and to introduce the reader to basic post-quantum algorithms. In particular, the reader can delve into the following subjects: present cryptographic schemes (symmetric and asymmetric), differences between quantum and classical computing, challenges in quantum computing, quantum algorithms (Shor’s and Grover’s), public-key encryption schemes affected, symmetric schemes affected, the impact on hash functions, and post-quantum cryptography. Specifically, the section on Post-Quantum Cryptography deals with different quantum key distribution methods and mathematically based solutions, such as the BB84 protocol, lattice-based cryptography, multivariate-based cryptography, hash-based signatures, and code-based cryptography.
Index Terms: quantum computers; post-quantum cryptography; Shor’s algorithm; Grover’s algorithm; asymmetric cryptography; symmetric cryptography
There is no doubt that advancements in technology, and particularly in electronic communications, have become one of the main technological pillars of the modern age. The need for confidentiality, integrity, authenticity, and non-repudiation in data transmission and data storage makes the science of cryptography one of the most important disciplines in information technology. Cryptography, etymologically derived from the Greek words for hidden and writing, is the process of protecting data, in transit or at rest, from third-party adversaries. There are two kinds of cryptosystems: symmetric and asymmetric.
Quantum computing theory, first introduced as a concept in 1982 by Richard Feynman, has been researched extensively and is widely regarded as a threat capable of breaking present asymmetric cryptography. In addition, it is a fact that symmetric cryptography can also be affected by specific quantum algorithms; however, its security can be increased with the use of larger key spaces. Furthermore, algorithms have been introduced that can break the present asymmetric cryptoschemes whose security is based on the difficulty of factoring large numbers into their prime factors and on the discrete logarithm problem. It appears that even elliptic curve cryptography, which is presently considered the most secure and efficient scheme, is weak against quantum computers. Consequently, a need for cryptographic algorithms robust against quantum computations arose.
The rest of the paper deals initially with the analysis of symmetric cryptography, asymmetric cryptography, and hash functions. Specifically, an emphasis is given to algorithms that take advantage of the difficulty of factoring large numbers, as well as the discrete logarithm problem. We move on by giving an introduction to quantum mechanics and the challenge of building a true quantum computer. Furthermore, we introduce two important quantum algorithms that can have a huge impact on asymmetric cryptography and a lesser one on symmetric cryptography, namely Shor’s algorithm and Grover’s algorithm respectively. Finally, post-quantum cryptography is presented. In particular, an emphasis is given to the analysis of quantum key distribution and some mathematically based solutions such as lattice-based cryptography, multivariate-based cryptography, hash-based signatures, and code-based cryptography.
II Present Cryptography
In this chapter we explain briefly the role of symmetric algorithms, asymmetric algorithms and hash functions in modern cryptography. We analyze the difficulty of factorizing large numbers, as well as the discrete logarithm problem which is the basis of strong asymmetric ciphers.
II-A Symmetric Cryptography
In symmetric cryptography, the sender and the receiver use the same secret key and the same cryptographic algorithm to encrypt and decrypt data. For example, Alice can encrypt a plaintext message using her shared secret key, and Bob can decrypt the message using the same cryptographic algorithm Alice used and the same shared secret key. The key needs to be kept secret, meaning that only Alice and Bob should know it; therefore, an efficient way of exchanging secret keys over public networks is needed. Asymmetric cryptography was introduced to solve the problem of key distribution in symmetric cryptography. Popular symmetric algorithms include the advanced encryption standard (AES) and the triple data encryption standard (3DES).
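The shared-key principle can be sketched with a toy XOR cipher. This is purely illustrative (a repeating-key XOR is trivially breakable and is not AES or 3DES); the point is only that the same key and the same operation serve both encryption and decryption:

```python
import os

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """XOR each byte of data with the repeating key.

    Applying the same function twice with the same key restores the
    original data, mirroring how Alice and Bob use one shared secret."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

shared_key = os.urandom(16)                      # known only to Alice and Bob
plaintext = b"meet me at noon"

ciphertext = xor_cipher(plaintext, shared_key)   # Alice encrypts
recovered = xor_cipher(ciphertext, shared_key)   # Bob decrypts

assert recovered == plaintext
```

A real system would use an authenticated mode of a vetted cipher such as AES; the distribution of `shared_key` over a public network is exactly the problem asymmetric cryptography addresses.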
II-B Asymmetric Cryptography
Asymmetric cryptography or public key cryptography (PKC) is a form of encryption where the keys come in pairs. Each party should have its own private and public key. For instance, if Bob wants to encrypt a message, Alice would send her public key to Bob and then Bob can encrypt the message with Alice’s public key. Next, Bob would transmit the encrypted message to Alice who is able to decrypt the message with her private key. Thus, we encrypt the message with a public key and only the person who owns the private key can decrypt the message.
Asymmetric cryptography is additionally used for digital signatures. For example, Alice can sign a document digitally with her private key, and Bob can verify the signature with Alice’s known public key. The security of PKC rests on computational problems such as the difficulty of factoring large numbers into their prime factors and the discrete logarithm problem. Such algorithms are called one-way functions because they are easy to compute in one direction but difficult to invert [Dusek2006381].
II-B1 Factorization Problem - RSA Cryptosystem
One of the most important public-key schemes is RSA, invented by Ronald Rivest, Adi Shamir, and Leonard Adleman in 1977. RSA exploits the difficulty of factoring bi-prime numbers (products of two large primes). According to Paar and Pelzl [Paar2010], RSA, and asymmetric algorithms in general, are not meant to replace symmetric algorithms because they are computationally costly. RSA is mainly used for secure key exchange between end nodes and is often used together with symmetric algorithms such as AES, where the symmetric algorithm does the actual data encryption and decryption. Kirsch [Kirsch2015] stated that RSA is theoretically vulnerable if a fast factoring algorithm is introduced or if a huge increase in computation power materializes. The latter could be achieved with computers that exploit quantum mechanics, known as quantum computers.
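The relationship between the bi-prime modulus and the key pair can be seen in a textbook RSA sketch with deliberately tiny primes (no padding, no real security; the values are the classic worked example with p = 61, q = 53):

```python
# Textbook RSA with toy primes -- illustrative only, never use in practice.
p, q = 61, 53
n = p * q                  # 3233: the public bi-prime modulus
phi = (p - 1) * (q - 1)    # 3120: Euler's totient of n
e = 17                     # public exponent, coprime to phi
d = pow(e, -1, phi)        # private exponent: e*d = 1 (mod phi), here 2753

m = 65                     # message encoded as an integer smaller than n
c = pow(m, e, n)           # encryption: c = m^e mod n
assert pow(c, d, n) == m   # decryption: m = c^d mod n
```

An attacker who can factor n into p and q recovers phi and hence d immediately; with 61 and 53 this is instant, but for the 2048-bit bi-primes used in practice no known classical algorithm finishes in reasonable time, which is exactly the assumption Shor’s algorithm undermines.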
II-B2 Discrete Logarithm Problem (DLP)
Asymmetric cryptographic systems such as Diffie-Hellman (DH) and Elliptic Curve Cryptography (ECC) are based on the DLP. The difficulty of breaking these cryptosystems rests on the difficulty of determining the integer x such that α^x ≡ β (mod p). The integer x is called the discrete logarithm of β to the base α, and we can write it as x = log_α β. The discrete logarithm problem is a very hard problem to compute if the parameters are large enough.
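The asymmetry of the two directions can be demonstrated on a toy group. Computing β from x is a single modular exponentiation, while recovering x from β, sketched below as an exhaustive search, takes time proportional to the group size, which becomes astronomical for the group sizes used in practice:

```python
def discrete_log(alpha: int, beta: int, p: int) -> int:
    """Recover x with alpha^x = beta (mod p) by brute force.

    Feasible only for tiny p; the cost grows with the group order,
    which is the whole basis of DLP-based cryptography."""
    for x in range(p):
        if pow(alpha, x, p) == beta:
            return x
    raise ValueError("no solution")

# Toy parameters: p = 23 with generator alpha = 5.
p, alpha = 23, 5
x = 13
beta = pow(alpha, x, p)                    # easy direction: one exponentiation
assert discrete_log(alpha, beta, p) == x   # hard direction: exhaustive search
```

Real systems use primes of thousands of bits, so the search space is far beyond exhaustive enumeration; Shor’s algorithm, discussed later, solves this same problem in polynomial time on a quantum computer.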
Diffie-Hellman is an asymmetric cipher that uses the aforementioned property to establish keys securely over a public network. Currently, keys of at least 2048 bits are recommended for secure key exchange. In addition, another family of public-key algorithms, known as Elliptic Curve Cryptography, is extensively used. ECC provides the same level of security as RSA and DLP systems with shorter key operands, which makes it convenient for systems with low computational resources. ECC uses pairs (x, y) that satisfy the equation y^2 ≡ x^3 + ax + b (mod p), together with an imaginary point θ (theta) at infinity, where a, b ∈ Z_p and 4a^3 + 27b^2 ≠ 0 (mod p) [Paar2010]. ECC needs a cyclic group G, and the primitive element used as a generator must be of order |G|. ECC is considered the most secure and efficient asymmetric cryptosystem, but this tends to change with the introduction of quantum computers, as explained in the next sections.
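The Diffie-Hellman exchange itself fits in a few lines. The sketch below uses the same toy parameters as above (real deployments use moduli of at least 2048 bits); only A and B ever cross the public channel, yet both parties arrive at the same secret:

```python
import secrets

# Toy Diffie-Hellman key exchange -- parameters far too small for real use.
p = 23   # public prime modulus
g = 5    # public generator

a = secrets.randbelow(p - 2) + 1   # Alice's private exponent, kept secret
b = secrets.randbelow(p - 2) + 1   # Bob's private exponent, kept secret

A = pow(g, a, p)   # Alice transmits A = g^a mod p over the public channel
B = pow(g, b, p)   # Bob transmits B = g^b mod p over the public channel

# Each side combines its own secret with the other's public value:
assert pow(B, a, p) == pow(A, b, p)   # shared secret g^(a*b) mod p
```

An eavesdropper sees p, g, A, and B, and recovering a or b from them is precisely the discrete logarithm problem of the previous paragraph.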
III Quantum Computing vs Classical Computing
In 1982, Richard Feynman came up with the idea of a quantum computer, a computer that uses the effects of quantum mechanics to its advantage. Quantum mechanics is related to microscopic physical phenomena and their strange behavior. In a traditional computer the fundamental blocks are called bits and can be observed in only two states: 0 and 1. Quantum computers instead use quantum bits, usually referred to as qubits [Nielsen:2011:QCQ:1972505]. In a sense, qubits are particles that can exist not only in the 0 and 1 states but in both simultaneously, a property known as superposition. A particle collapses into one of these states when it is inspected. Quantum computers take advantage of this property to solve complex problems. An operation on a qubit in superposition acts on both values at the same time. Another physical phenomenon used in quantum computing is quantum entanglement. When two qubits are entangled, their quantum state can no longer be described independently of each other, but only as a single object with four different states. In addition, if the state of one of the two qubits changes, the entangled qubit will change too, regardless of the distance between them. This leads to true parallel processing power [Quantum_entanglement]. The combination of the aforementioned phenomena results in an exponential increase in the number of values that can be processed in one operation as the number of entangled qubits increases. Therefore, an n-qubit quantum computer can process 2^n operations in parallel.
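Superposition and collapse can be made concrete with a minimal classical simulation of a single qubit as a pair of amplitudes. This sketch (plain Python, no quantum library) applies a Hadamard gate to |0⟩ and then samples measurements; the 50/50 outcome statistics illustrate why inspection destroys the superposition rather than revealing both values:

```python
import math
import random

# One qubit as a pair of amplitudes (a, b) for the states |0> and |1>.
def hadamard(state):
    """Hadamard gate: maps |0> to an equal superposition of |0> and |1>."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def measure(state):
    """Collapse the qubit: 0 or 1 with probability |amplitude|^2."""
    a, _ = state
    return 0 if random.random() < abs(a) ** 2 else 1

qubit = (1.0, 0.0)        # start in |0>
qubit = hadamard(qubit)   # amplitudes (1/sqrt(2), 1/sqrt(2)): both states at once
ones = sum(measure(qubit) for _ in range(10000))
# ones is close to 5000: each inspection yields 0 or 1 with equal probability
```

Simulating n qubits this way requires tracking 2^n amplitudes, which is exactly the exponential state space a real quantum computer manipulates natively.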
Two kinds of quantum computers exist: universal and non-universal. The main difference between the two is that universal quantum computers are developed to perform any given task, whereas non-universal quantum computers are developed for a given purpose (e.g., optimization of machine learning algorithms). Examples are D-Wave’s 2000-qubit non-universal quantum computer [D-wave_interview] and IBM’s 17-qubit universal quantum computer with proper error correction. IBM’s quantum computer is currently the state of the art among universal quantum computers [soeken2018programming]. Both D-Wave and IBM have quantum computers accessible online for research purposes. Additionally, in October 2017, Intel, in collaboration with QuTech, announced their own 17-qubit universal quantum computer [soeken2018programming].
Bone and Castro [Bone1997] stated that a quantum computer is completely different in design from a classical computer, which uses traditional transistors and diodes. Researchers have experimented with many different designs, such as quantum dots, which are basically electrons held in a superposition state, and computing liquids. Moreover, they remarked that quantum computers can show their superiority over classical computers only when used with algorithms that exploit the power of quantum parallelism. For example, a quantum computer would not be any faster than a traditional computer at multiplication.
III-A Challenges in Quantum Computing
There are many challenges in quantum computing that researchers are currently working on.
Quantum algorithms are mainly probabilistic. This means that in one operation a quantum computer returns many solutions, only one of which is correct. The trial and error involved in measuring and verifying the correct answer weakens the speed advantage of quantum computing [Kirsch2015].
Qubits are susceptible to errors. They can be affected by heat, noise in the environment, as well as stray electromagnetic couplings. Classical computers are susceptible to bit-flips (a zero can become a one and vice versa). Qubits suffer from bit-flips as well as phase errors. Direct inspection for errors should be avoided, as it causes the value to collapse out of its superposition state.
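The difference between the two error types can be shown in the same amplitude picture used above. A bit-flip swaps the |0⟩ and |1⟩ amplitudes, while a phase error negates the |1⟩ amplitude, a corruption that has no classical counterpart and, as the sketch shows, does not even change the measurement probabilities:

```python
import math

# One qubit as a pair of amplitudes (a, b) for the states |0> and |1>.
def bit_flip(state):
    """Bit-flip error (Pauli-X): swaps the |0> and |1> amplitudes."""
    a, b = state
    return (b, a)

def phase_flip(state):
    """Phase error (Pauli-Z): negates the |1> amplitude."""
    a, b = state
    return (a, -b)

s = 1 / math.sqrt(2)
plus = (s, s)                 # equal superposition of |0> and |1>
damaged = phase_flip(plus)    # now (s, -s): a genuinely different state

# Measurement probabilities |a|^2 and |b|^2 are unchanged, so a direct
# classical-style readout cannot detect that the phase was corrupted:
assert abs(damaged[0]) ** 2 == abs(plus[0]) ** 2
assert abs(damaged[1]) ** 2 == abs(plus[1]) ** 2
```

This is why quantum error correction must detect both flip types indirectly, via ancilla qubits, rather than by inspecting the data qubits themselves.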
Another challenge is the difficulty of maintaining coherence: qubits can retain their quantum state for only a short period of time. Researchers at the University of New South Wales in Australia created two different types of qubits (a phosphorus atom and an artificial atom) and, by embedding them in a thin layer of specially purified silicon (silicon-28), were able to eliminate the magnetic noise that makes qubits prone to errors. Additionally, they stated that the phosphorus-atom qubit has 99.99% accuracy, which amounts to 1 error every 10,000 quantum operations [Muhonen2014]. Their qubits can remain in superposition for a total of 35 seconds, which is considered a world record [D-Wave2016]. Moreover, to achieve long coherence, qubits need not only to be isolated from the external world but also to be kept at temperatures approaching absolute zero. However, this isolation makes it difficult to control them without contributing additional noise [Kirsch2015].
IBM, in 2017, introduced the notion of quantum volume. Quantum volume is a metric that measures how powerful a quantum computer is based on how many qubits it has, how good the error correction on those qubits is, and the number of operations that can be done in parallel. Increasing the number of qubits does not improve a quantum computer if the error rate is high. However, improving the error rate does result in a more powerful quantum computer [Quantum_volume].
IV Cryptosystems Vulnerable to Quantum Algorithms
This section discusses the impact of quantum algorithms on present cryptography and gives an introduction to Shor’s algorithm and Grover’s algorithm. Note that Shor’s algorithm explained in the following subsection makes the algorithms that rely on the difficulty of factorizing or computing discrete logarithms vulnerable.
Cryptography plays an important role in every electronic communication system today. For example, the security of emails, passwords, financial transactions, and even electronic voting systems requires the same security objectives, such as confidentiality and integrity [Campagna2015]. Cryptography makes sure that only parties that have exchanged keys (also called authentic parties) can read the encrypted message. Quantum computers threaten the main goal of every secure and authentic communication because they are able to do computations that classical (conventional) computers cannot. Consequently, quantum computers could break cryptographic keys quickly by calculating or exhaustively searching all secret keys, allowing an eavesdropper to intercept the communication channel between authentic parties (sender/receiver). This task is considered computationally infeasible for a conventional computer [Buchanan2016].
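The exhaustive key search mentioned above can be sketched classically. The toy below (the 16-bit key and the SHA-256 "verification" oracle are illustrative choices, not any real protocol) tries every candidate key until one passes verification; a classical machine needs on the order of 2^k trials for a k-bit key, which is the workload Grover’s algorithm reduces to roughly its square root:

```python
import hashlib
from itertools import product

def crack(target_digest: bytes, key_len: int) -> bytes:
    """Classical brute force: test every key_len-byte key in order.

    The verification step here is a hash comparison; in an attack on a
    cipher it would be a trial decryption of known ciphertext."""
    for candidate in product(range(256), repeat=key_len):
        key = bytes(candidate)
        if hashlib.sha256(key).digest() == target_digest:
            return key
    raise ValueError("key not found")

secret = b"\x01\x02"                        # toy 16-bit key: 65,536 candidates
digest = hashlib.sha256(secret).digest()
assert crack(digest, 2) == secret
```

A 16-bit search finishes instantly, but each added key bit doubles the classical cost; a quadratic quantum speed-up effectively halves the exponent, which is why doubling symmetric key lengths is the standard countermeasure discussed later in this paper.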