
Towards Post-Quantum Cryptography

The quantum computing revolution will make many current concepts and devices obsolete, raising serious security problems. The National Institute of Standards and Technology in the US has organised an international competition to establish new cryptographic standards. The researcher Adeline Roux-Langlois sheds light on the issues involved.

In what ways do quantum technologies pose a threat to cryptography?
Adeline Roux-Langlois:1 Cryptography is based on mathematical problems that are extremely difficult for conventional computers to solve or circumvent. However, the quantum machines of the future will be able to do so more easily, making our current protection systems obsolete. For now, quantum computers are not powerful or advanced enough to defeat today’s cryptographic protocols, but it is important to prepare for them.

The US National Institute of Standards and Technology (NIST), which is in charge of establishing various technological and measurement standards in the United States, launched an international competition in 2017 to build scientific consensus regarding post-quantum cryptography. This process has entered its third and final phase, with both academic and industrial researchers contributing to the effort. Among the sixty-nine initial submissions, the NIST selected those that would make it to the following stage of the competition based on criteria such as security, performance, and the characteristics of the implementation. It also took into consideration studies published by the scientific community, in addition to possible attacks against each scheme.

What is cryptography currently based on?
A. R.-L.: There are two approaches to encrypting data: private-key encryption and public-key encryption. In private-key encryption, users share a single secret key. This approach is more secure and less vulnerable to quantum technology, but it is also less practical to use in many cases. Public-key encryption is based on two keys, one that is kept secret and another that is available to all. For example, anyone can send encrypted emails to a recipient, who is the only one able to read them. It is nevertheless important to be confident that the problem from which the keys are derived is sufficiently complex, as any algorithm that can solve it in a reasonable amount of time would provide access to the protected data. Ensuring that these underlying problems are difficult enough is the very foundation of security.
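As an illustration of the public-key principle described above (not of any specific NIST candidate), here is a minimal, deliberately insecure "textbook RSA" sketch in Python. The tiny primes and the message value are purely illustrative; the point is only that the public pair (n, e) lets anyone encrypt, while decryption needs the private exponent d, which is derived from the secret primes.

```python
# Toy, insecure "textbook RSA" sketch illustrating the public/private key idea.
# The primes are tiny on purpose; real deployments use keys of 2048 bits or more.
p, q = 61, 53                      # secret primes (the hard-to-recover part)
n = p * q                          # public modulus
e = 17                             # public exponent
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent, derived from p and q

message = 65
ciphertext = pow(message, e, n)    # anyone holding (n, e) can encrypt
recovered = pow(ciphertext, d, n)  # only the holder of d can decrypt
assert recovered == message
print(n, e, "->", ciphertext, "->", recovered)
```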

Today there are two major families of hard problems: factorisation and the discrete logarithm. Factorisation involves decomposing a number into a product of two prime numbers, which is far trickier than it seems when dealing with very large numbers. Similarly, for the time being no algorithm can efficiently compute a discrete logarithm. The NIST competition is not limited to encryption. Other algorithms will have to handle signatures, in other words authenticate the source of a message in a way that cannot be forged. In both cases the criteria obviously include security, but also the system’s speed and efficiency.
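To give a flavour of the asymmetry involved (a toy illustration, not a description of any deployed system), the Python sketch below computes a modular exponentiation instantly in one direction, while recovering an exponent naively means stepping through candidates one by one. The modulus, base, and exponent are arbitrary demo values; at real-world sizes this kind of search becomes utterly infeasible.

```python
# Toy illustration of the discrete-logarithm asymmetry (not a real protocol):
# computing h = g**x mod p is fast, but recovering a suitable exponent from h
# by naive exhaustive search means trying candidates one by one.
def discrete_log_bruteforce(g, h, p):
    """Return some exponent x with g**x % p == h, found by exhaustive search."""
    value = 1
    for x in range(p):
        if value == h:
            return x
        value = (value * g) % p
    return None

p, g, secret_x = 999_983, 5, 123_456        # small demo parameters, chosen arbitrarily
h = pow(g, secret_x, p)                     # easy direction: one fast exponentiation
found_x = discrete_log_bruteforce(g, h, p)  # hard direction: up to ~10**6 steps here
assert pow(g, found_x, p) == h              # the recovered exponent reproduces h
print(found_x)
```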

What approaches have been proposed for post-quantum cryptography?
A. R.-L.: Various avenues are being pursued. One of them, cryptography based on Euclidean lattices, involves finding the shortest vectors in a lattice, a regular grid of points in a space with hundreds of dimensions. Each vector therefore has a huge number of coordinates, and the problem becomes extremely arduous to solve. Another solution, multivariate cryptography, is based on a somewhat similar principle: it proposes solving systems of polynomial equations with so many variables that no calculation can be completed within a reasonable time frame. A third approach relies on error-correcting codes, which are used to repair degraded communications, for instance restoring a video burned onto a DVD that has been damaged. Some of these codes provide a very effective framework for encryption, but work quite poorly when it comes to signatures.
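As a rough intuition for the lattice problem mentioned above, here is a two-dimensional toy in Python, whereas real schemes work in hundreds of dimensions. The basis vectors and the search bounds are invented for illustration; the sketch simply enumerates integer combinations of a deliberately "skewed" basis and keeps the shortest non-zero vector, a search that stops being feasible as the dimension grows.

```python
# Toy illustration of the shortest-vector problem behind lattice-based
# ("Euclidean lattice") cryptography: given a basis, find the shortest
# non-zero integer combination. In 2 dimensions exhaustive search works;
# in the hundreds of dimensions used in cryptography it does not.
from itertools import product
from math import hypot

basis = [(201, 37), (1648, 297)]   # a deliberately "skewed" 2-D basis (illustrative)

best = None
for a, b in product(range(-50, 51), repeat=2):
    if (a, b) == (0, 0):
        continue                                    # skip the zero vector
    x = a * basis[0][0] + b * basis[1][0]
    y = a * basis[0][1] + b * basis[1][1]
    length = hypot(x, y)
    if best is None or length < best[0]:
        best = (length, (x, y), (a, b))

print(best)   # the shortest vector found, and the coefficients that produce it
```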

Lattice-based constructions are well represented among the finalists, as they work equally well for encryption and signatures. However, not everyone in the community agrees that they should be the sole focus of attention. The NIST prefers to explore a broad spectrum of approaches for establishing its standards. This way, if one particular solution is successfully attacked, the others will remain secure. For example, the SPHINCS+ algorithm from the Eindhoven University of Technology in the Netherlands is based on hash functions: a digital fingerprint is assigned to the data, an operation that is extremely difficult to reverse, even using a quantum algorithm. Still, signatures obtained in this way are resource-intensive.
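To illustrate the one-way idea behind hash-based signatures (a generic sketch, not the actual SPHINCS+ construction), the Python snippet below computes a SHA-256 fingerprint in one call, then "inverts" it only by trying candidate inputs. The candidate format and tiny search space are deliberate simplifications; for a genuine 256-bit digest such a search is hopeless.

```python
# Minimal sketch of the hash-function idea: computing a fingerprint is cheap,
# but finding an input that matches a given fingerprint requires brute force.
import hashlib

def fingerprint(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

target = fingerprint(b"secret-42")
print(target)                     # easy direction: one hash call

# "Inverting" the hash: try candidate inputs until one matches the target.
# The search space here is tiny on purpose; for a real 256-bit hash it is hopeless.
for i in range(100):
    candidate = f"secret-{i}".encode()
    if fingerprint(candidate) == target:
        print("found preimage:", candidate)
        break
```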

Which French submissions are still in the running for the competition?
A. R.-L.: There are seven algorithms involving researchers based in France. With regard to encryption, Crystals-Kyber, NTRU, and Saber are based on Euclidean lattices, while Classic McEliece relies on error-correcting codes. Damien Stehlé, a professor at ENS Lyon and a member of the LIP Computer Science Laboratory,2 and Nicolas Sendrier, from the INRIA research centre in Paris, are taking part in the competition. In the signature category, the Crystals-Dilithium and Falcon algorithms use Euclidean lattices, while Rainbow opts for a multivariate system. Stehlé is once again part of the team, as are Pierre-Alain Fouque, a professor at the Université de Rennes 1 and a member of the IRISA laboratory, and Jacques Patarin, a professor at the UVSQ (Université de Versailles-Saint-Quentin-en-Yvelines).

What research are you conducting in these areas?
A. R.-L.: I focus on cryptography based on Euclidean lattices, for which I provide theoretical proofs that the security of cryptographic constructions rests on problems that are hard enough to be reliable. Encryption and signatures are the first applications that come to mind, but this work can also serve the very particular confidentiality requirements of electronic voting, where the authenticity of votes must be verified before counting without revealing who voted for whom. I am also working on anonymous authentication, which for example enables individuals to prove that they belong to a group without disclosing any other information, or that they are adults without giving their age or date of birth.

Footnotes

1. CNRS Researcher at the IRISA (Institut de Recherche en Informatique et Systèmes Aléatoires – CNRS / Université de Rennes 1 / ENS Rennes / INSA Rennes / Université de Bretagne-Sud / INRIA / IMT Atlantique).
2. Laboratoire de l’informatique du parallélisme (CNRS / ENS Lyon / Université Claude Bernard Lyon 1).


