Publication

Quantum Cryptography: On the Security of the BB84 Key-Exchange Protocol

Abstract

In 1984, C. H. Bennett and G. Brassard proposed a new protocol aimed at solving the problem of symmetric cryptographic key exchange. The protocol was called BB84 after its authors. While traditional methods rely on public-key cryptography (such as RSA), the BB84 protocol exploits the laws of quantum mechanics, for example the fact that any measurement can perturb a quantum system. The security of traditional public-key algorithms typically relies on a hard mathematical problem: it is well known, for instance, that the ability to factor arbitrary integers efficiently would make RSA completely insecure. The security of Quantum Key Exchange (QKE) protocols cannot be proved in the same way. In this work, we give an overview of security proofs for quantum key-exchange protocols, focusing on the BB84 protocol.
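
To make the basis-sifting step of BB84 concrete, here is a minimal classical simulation of the protocol; this is a sketch only, assuming an ideal channel and no eavesdropper, with measurement outcomes modelled by classical randomness and all names and parameters chosen for illustration.

```python
import secrets

def bb84_sift(n_qubits: int = 256):
    """Minimal classical simulation of BB84 sifting (ideal channel, no eavesdropper).

    Alice encodes random bits in randomly chosen bases (0 = rectilinear, 1 = diagonal);
    Bob measures in his own random bases.  Only the positions where the two bases
    match contribute to the sifted key.
    """
    alice_bits  = [secrets.randbelow(2) for _ in range(n_qubits)]
    alice_bases = [secrets.randbelow(2) for _ in range(n_qubits)]
    bob_bases   = [secrets.randbelow(2) for _ in range(n_qubits)]

    # If Bob's basis matches Alice's, he recovers her bit exactly; otherwise his
    # outcome is uniformly random and the position will be discarded anyway.
    bob_results = [
        a_bit if a_basis == b_basis else secrets.randbelow(2)
        for a_bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases)
    ]

    # Public discussion: compare the bases (never the bits) and keep matching positions.
    sifted_key = [
        bob_results[i] for i in range(n_qubits) if alice_bases[i] == bob_bases[i]
    ]
    return sifted_key

if __name__ == "__main__":
    key = bb84_sift()
    print(f"sifted key length: {len(key)} (expected about 128 of 256)")
```

On average half of the positions survive sifting; in a real run, Alice and Bob would additionally sacrifice part of the sifted key to estimate the error rate and detect eavesdropping.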

Related concepts (23)

Public-key cryptography

Public-key cryptography, or asymmetric cryptography, is the field of cryptographic systems that use pairs of related keys. Each key pair consists of a public key and a corresponding private key.

Security

Security is protection from, or resilience against, potential harm (or other unwanted coercion) caused by others, by restraining the freedom of others to act.

Computer security

Computer security, cyber security, digital security or information technology security (IT security) is the protection of computer systems and networks from attacks by malicious actors.

Related publications (52)

Modern cryptography has pushed forward the need for provable security. Whereas ancient cryptography relied only on heuristic assumptions and the secrecy of the designs, researchers nowadays try to make the security of schemes rely on mathematical problems that are believed hard to solve. When writing such proofs, the capabilities of potential adversaries are modelled formally. For instance, the black-box model assumes that an adversary learns nothing from the inner state of a construction. While this assumption makes sense in some practical scenarios, it was shown that one can sometimes learn information by other means, e.g., by timing how long the computation takes. In this thesis, we focus on two different areas of cryptography. In both parts, we first take a theoretical point of view to obtain a result; we then adapt our results so that they are easily usable by implementers and by researchers working in practical cryptography.

In the first part of this thesis, we look at post-quantum cryptography, i.e., at cryptographic primitives that are believed secure even if (reasonably big) quantum computers are built. We introduce HELEN, a new public-key cryptosystem based on the hardness of the learning parity with noise (LPN) problem. To make our results more concrete, we suggest practical instances which make the system easily implementable. As stated above, the design of cryptographic primitives usually relies on well-studied hard problems. However, to suggest concrete parameters for these primitives, one needs to know the precise complexity of the algorithms solving the underlying hard problem. In this thesis, we focus on two recent hard problems that have become very popular in post-quantum cryptography: the learning with errors (LWE) problem and the learning with rounding (LWR) problem. We introduce a new algorithm that solves both problems and provide a careful complexity analysis, so that these problems can be used to construct practical cryptographic primitives.

In the second part, we look at leakage-resilient cryptography, which studies adversaries able to obtain side-channel information from a cryptographic primitive. In the past, two main disjoint models were considered. The first, the threshold probing model, assumes that the adversary can place a limited number of probes in a circuit and then learns all the values going through these probes. This model has been used mostly by theoreticians, as it allows very elegant and convenient proofs. The second, the noisy-leakage model, assumes that every component of the circuit leaks, but that the observed signal is noisy; typically, some Gaussian noise is added to it. According to experiments, this model closely depicts the real behaviour of circuits, and it is therefore favoured by the practical cryptographic community. In this thesis, we show that a proof in the first model implies a proof in the second model, which unifies the two models and reconciles both communities. We then look at this result from a more practical point of view and show how it can help in evaluating the security of a chip based solely on the more standard mutual-information metric.
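
As an illustration of the kind of problem a cryptosystem such as HELEN builds on, the following sketch generates LPN samples (a, ⟨a, s⟩ ⊕ e) over GF(2); the dimension and noise rate below are arbitrary illustrative values, not parameters proposed in the thesis.

```python
import random

def lpn_samples(secret, num_samples, tau=0.125, rng=random):
    """Generate LPN samples (a, <a, s> XOR e) over GF(2).

    secret : list of 0/1 bits (the secret s)
    tau    : Bernoulli noise rate of the error bit e (illustrative value)
    """
    n = len(secret)
    samples = []
    for _ in range(num_samples):
        a = [rng.randrange(2) for _ in range(n)]
        noiseless = sum(ai & si for ai, si in zip(a, secret)) & 1  # inner product mod 2
        e = 1 if rng.random() < tau else 0                         # Bernoulli(tau) noise bit
        samples.append((a, noiseless ^ e))
    return samples

if __name__ == "__main__":
    s = [random.randrange(2) for _ in range(16)]
    for a, b in lpn_samples(s, 4):
        print(a, b)
```

Similarly, an observation in the noisy-leakage model is often modelled as a simple function of an intermediate value plus Gaussian noise; the Hamming-weight leakage function below is a common textbook choice and is given only as a sketch, not as the model used in the thesis.

```python
import random

def noisy_hamming_weight_leakage(value: int, sigma: float = 1.0, rng=random) -> float:
    """One noisy-leakage observation of an 8-bit intermediate value.

    The device is assumed to leak the Hamming weight of the value, with additive
    Gaussian noise of standard deviation sigma.
    """
    hamming_weight = bin(value & 0xFF).count("1")
    return hamming_weight + rng.gauss(0.0, sigma)

if __name__ == "__main__":
    secret_byte = 0x3C
    print([round(noisy_hamming_weight_leakage(secret_byte), 2) for _ in range(5)])
```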

Since the late 70s, several public-key cryptographic algorithms have been proposed. Diffie and Hellman first introduced the concept in 1976. Since then, several other public-key cryptosystems have been invented, such as the well-known RSA, ElGamal or Rabin cryptosystems. Roughly, the purpose of these algorithms is to allow the secure exchange of a secret key that will later be used to encrypt a larger amount of data. All these algorithms share one particularity, namely that their security relies on some mathematical problem which is supposed to be hard (computationally speaking) to solve. For example, it is well known that the ability to easily factorize the product of two large primes, without any indication about the primes, would break the RSA cryptosystem. Since those early days, most of the asymmetric encryption algorithms that have been proposed rely on the same hard mathematical problems. Yet, it is well accepted that we should not put all the cryptographic eggs in one basket. This is the reason why Goldreich, Goldwasser, and Halevi proposed in 1997 (twenty years after RSA) a new public-key cryptosystem whose security was based on lattice reduction problems. This cryptographic scheme is known as the GGH cryptosystem. Unfortunately, only two years later, Nguyen proposed an attack against GGH which showed that, in practice, GGH would not provide the security it originally claimed. The attack involved a so-called lattice reduction algorithm, known as LLL.

The aim of this work is to provide the necessary background to understand the GGH cryptosystem and the attack against it. The basis reduction algorithm LLL has many other applications in cryptography. As an example, we will review a very recent attack against the GNU Privacy Guard software, which is a widely used free implementation of Zimmermann's famous software. The necessary background about lattices, LLL, etc. will be recalled in Sections 2 and 3. Section 4 will review the GGH cryptosystem and Nguyen's attack against it. Finally, Section 5 will show how lattice reduction techniques can break the ElGamal digital signature scheme implementation in GPG v1.2.3.
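
To give a feel for how GGH-style encryption relies on a "good" private basis and a "bad" public basis of the same lattice, here is a toy two-dimensional sketch using Babai's rounding for decryption; the matrices, dimension and error are purely illustrative and far too small to be secure, and the Nguyen attack itself is not shown.

```python
import numpy as np

def ggh_toy_demo():
    """Toy GGH-style encryption in dimension 2 (illustrative only, insecure)."""
    R = np.array([[7, 1],
                  [1, 8]])           # private ("good") basis: short, nearly orthogonal
    U = np.array([[2, 3],
                  [3, 5]])           # unimodular transform (det = 1)
    B = U @ R                        # public ("bad") basis of the same lattice

    m = np.array([3, -2])            # message encoded as lattice coordinates
    e = np.array([1, -1])            # small error vector
    c = m @ B + e                    # ciphertext: lattice point plus small perturbation

    # Decryption: Babai's rounding with the good basis recovers the lattice point,
    # then a change of basis recovers m.  Rounding with the skewed public basis
    # alone would generally fail, which is what keeps the scheme (heuristically) secret.
    lattice_point = np.rint(c @ np.linalg.inv(R)) @ R
    m_rec = np.rint(lattice_point @ np.linalg.inv(B)).astype(int)
    print("recovered message:", m_rec)   # [ 3 -2]

if __name__ == "__main__":
    ggh_toy_demo()
```

Nguyen's attack, reviewed in the work itself, exploits structural weaknesses in the GGH error vectors together with LLL-reduced bases to recover the plaintext without the private basis.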

Our main motivation is to design more user-friendly security protocols. Indeed, if the use of a protocol is tedious, most users will not behave correctly and, consequently, security issues occur. An example is the actual behaviour of a user faced with an SSH certificate validation: while this task is of the utmost importance, about 99% of SSH users accept the received certificate without checking it. Designing more user-friendly protocols can be difficult, since security should not decrease at the same time. Interestingly, insecure channels coexist with channels ensuring authentication. In practice, the latter may be used for a string comparison or a string copy, e.g., by spelling over voice over IP. The shorter the authenticated string is, the less human interaction the protocol requires, and the more user-friendly the protocol is. This leads to the notion of SAS-based cryptography, where SAS stands for Short Authenticated String.

In the first part of this thesis, we analyse and propose optimal SAS-based message authentication protocols. Using these protocols, we show how to construct optimal SAS-based authenticated key agreements. Such a protocol enables any group of users to agree on a shared secret key. SAS-based cryptography requires no pre-shared key, no trusted third party, and no public-key infrastructure; however, it requires the users to exchange a short SAS, e.g., five decimal digits. Using the freshly agreed secret key, the group can then achieve secure communication based on symmetric cryptography. SAS-based authentication protocols are often used to authenticate the protocol messages of a key agreement; hence, each new secure communication requires the interaction of the users to agree on the SAS. One solution to reduce user interaction is to use digital signature schemes. Indeed, in a setup phase, the users can use a SAS-based authentication protocol to exchange long-term verification keys. Then, using digital signatures, users are able to run several key agreements, and the authentication of protocol messages is done by the signatures. In the case where no authenticated channel is available but a public-key infrastructure is in place, the SAS-based setup phase is avoided since verification keys are already authenticated by the infrastructure.

In the second part of this thesis, we study two problems related to digital signatures: (1) the insecurity of digital signature schemes that use weak hash functions and (2) the privacy issues arising from signed documents. Digital signatures are often proven secure in the random oracle model, where random oracles model ideal hash functions. However, real hash functions deviate more and more from this idealization: weaknesses in hash functions have already been discovered, and we expect new ones. One question is how to fix the existing signature constructions based on these weak hash functions. In this thesis, we first try to find a better way to model weak hash functions. We then propose a (randomized) pre-processing of the input message which transforms any weak signature implementation into a strong signature scheme. One drawback remains, due to the randomization: the random coins must be sent, and thus the signature becomes longer. We therefore also propose a method to avoid the increase in signature length by reusing signing coins.

Digital signatures may also lead to privacy issues. Indeed, given a message and its signature, anyone can publish the pair, which confirms the authenticity of the message. In certain applications, such as electronic passports (e-passports), publishing the authenticated data leads to serious privacy issues. In this thesis, we define the security properties required to protect data privacy, especially in the case of e-passport verification. The main idea is for the e-passport to keep the signature secret: the e-passport should only prove that it knows a valid signature instead of revealing it. We propose a new primitive, called Offline Non-Transferable Authentication Protocol (ONTAP), as well as efficient implementations that are compatible with the e-passport standard signature schemes.
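
As a rough illustration of how a short, user-comparable check value can be derived from a protocol run, the sketch below hashes a transcript down to five decimal digits. This shows only the digest-to-SAS step: the protocols analysed in the thesis additionally rely on commitments so that neither party (nor an attacker) can bias the string, and all names and inputs below are illustrative.

```python
import hashlib

def short_authenticated_string(transcript: bytes, digits: int = 5) -> str:
    """Derive a short authenticated string (SAS) from a protocol transcript.

    Both users compute this value over their own view of the exchanged messages
    and compare the results over an authenticated (e.g. voice) channel.
    """
    digest = hashlib.sha256(transcript).digest()
    value = int.from_bytes(digest[:8], "big") % (10 ** digits)
    return str(value).zfill(digits)        # e.g. "04217", short enough to read aloud

if __name__ == "__main__":
    # If an attacker tampered with the insecure channel, the two parties' transcripts
    # differ, and so do their SAS values with high probability.
    transcript = b"alice_pk || bob_pk || nonce"
    print("SAS:", short_authenticated_string(transcript))
```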