Lattice-based cryptography is the generic term for constructions of cryptographic primitives that involve lattices, either in the construction itself or in the security proof. Lattice-based constructions are currently important candidates for post-quantum cryptography. Unlike more widely used and known public-key schemes such as the RSA, Diffie-Hellman or elliptic-curve cryptosystems — which could, theoretically, be defeated using Shor's algorithm on a quantum computer — some lattice-based constructions appear to be resistant to attack by both classical and quantum computers. Furthermore, many lattice-based constructions are considered to be secure under the assumption that certain well-studied computational lattice problems cannot be solved efficiently.
In 1996, Miklós Ajtai introduced the first lattice-based cryptographic construction whose security could be based on the hardness of well-studied lattice problems. He showed that a certain average-case lattice problem, known as the Short Integer Solution (SIS) problem, is at least as hard to solve as a worst-case lattice problem, and obtained from it a cryptographic hash function whose security is equivalent to the computational hardness of SIS.
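To make the SIS-to-hashing connection concrete, the following Python sketch implements an Ajtai-style compression function h_A(x) = A·x mod q on binary inputs; the parameters n, m, q are illustrative placeholders, far too small for any real security, and the sketch is only a didactic approximation of the construction described above.

```python
import numpy as np

# Toy Ajtai-style SIS hash: h_A(x) = A @ x mod q for x in {0,1}^m.
# Parameters are illustrative placeholders, far too small for real security.
n, m, q = 8, 32, 97            # digest dimension, input length, modulus

rng = np.random.default_rng(0)
A = rng.integers(0, q, size=(n, m))    # public random matrix defining the hash


def sis_hash(x_bits):
    """Compress an m-bit input into a vector in (Z_q)^n."""
    x = np.asarray(x_bits, dtype=np.int64)
    assert x.shape == (m,) and set(np.unique(x)) <= {0, 1}
    return tuple(int(c) for c in (A @ x) % q)


message = rng.integers(0, 2, size=m)   # a random 32-bit message
print(sis_hash(message))
```

The point of the construction is that any collision x ≠ x' yields a short nonzero vector z = x - x' with A·z ≡ 0 (mod q), i.e. a solution to the SIS instance defined by A.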
In 1998, Jeffrey Hoffstein, Jill Pipher, and Joseph H. Silverman introduced a lattice-based public-key encryption scheme, known as NTRU. However, breaking their scheme is not known to be at least as hard as solving a worst-case lattice problem.
The first lattice-based public-key encryption scheme whose security was proven under worst-case hardness assumptions was introduced by Oded Regev in 2005, together with the Learning with Errors (LWE) problem. Since then, much follow-up work has focused on tightening Regev's security proof and improving the efficiency of the original scheme, and much more has been devoted to constructing additional cryptographic primitives based on LWE and related problems. For example, in 2009, Craig Gentry introduced the first fully homomorphic encryption scheme, whose security was based on a lattice problem.
The goal of the course is to introduce basic notions from public-key cryptography (PKC) as well as basic number-theoretic methods and algorithms for the cryptanalysis of protocols and schemes based on PKC.
The course aims to introduce the basic concepts and results of integer optimization, with special emphasis on algorithmic problems on lattices that have proved to be important in theoretical computer science ...
This advanced course will provide students with the knowledge to tackle the design of privacy-preserving ICT systems. Students will learn about existing technologies to protect privacy, and how to evaluate ...
In cryptography, post-quantum cryptography (PQC), sometimes referred to as quantum-proof, quantum-safe or quantum-resistant cryptography, refers to cryptographic algorithms (usually public-key algorithms) that are thought to be secure against a cryptanalytic attack by a quantum computer. The problem with currently popular algorithms is that their security relies on one of three hard mathematical problems: the integer factorization problem, the discrete logarithm problem or the elliptic-curve discrete logarithm problem, all of which can be solved efficiently by Shor's algorithm on a sufficiently powerful quantum computer.
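Stated informally, and with notation chosen here purely for illustration, the three problems are:

```latex
\begin{align*}
  \textbf{Integer factorization:} \quad & \text{given } N = pq \text{ with } p, q \text{ large primes, recover } p \text{ and } q, \\
  \textbf{Discrete logarithm:}    \quad & \text{given a group generator } g \text{ and } h = g^x, \text{ recover } x, \\
  \textbf{Elliptic-curve DLP:}    \quad & \text{given curve points } P \text{ and } Q = xP, \text{ recover } x.
\end{align*}
```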
In computer science, lattice problems are a class of optimization problems related to mathematical objects called lattices. The conjectured intractability of such problems is central to the construction of secure lattice-based cryptosystems: several lattice problems are NP-hard in the worst case, and related approximation problems admit worst-case to average-case reductions, providing a rare test case for basing the security of cryptographic algorithms on worst-case hardness. In particular, some lattice problems that are hard in the worst case can be used as a basis for cryptographic schemes with strong security guarantees.
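For concreteness, here is a brief sketch of the standard definitions (the notation is chosen here for illustration): a lattice is the set of all integer combinations of linearly independent basis vectors, and the shortest vector problem (SVP), a canonical lattice problem, asks for a shortest nonzero lattice point.

```latex
% Lattice generated by linearly independent basis vectors b_1, ..., b_n in R^m:
\mathcal{L}(b_1, \ldots, b_n) = \Big\{ \textstyle\sum_{i=1}^{n} z_i b_i \;:\; z_1, \ldots, z_n \in \mathbb{Z} \Big\}

% Shortest vector problem (SVP): given a basis, find a shortest nonzero lattice vector,
% whose length is the first successive minimum
\lambda_1(\mathcal{L}) = \min_{v \in \mathcal{L} \setminus \{0\}} \lVert v \rVert .
```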
In cryptography, learning with errors (LWE) is a mathematical problem that is widely used to build secure encryption algorithms. It is based on the idea of representing secret information as a set of linear equations perturbed by errors; in other words, LWE hides the value of a secret by adding noise to it. In more technical terms, it is the computational problem of inferring a linear n-ary function over a finite ring from given samples, some of which may be erroneous.
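As a concrete illustration, the Python sketch below generates LWE samples of the form b = A·s + e (mod q) and uses them in a Regev-style encryption of a single bit; the parameters, the uniform small-error distribution, and the subset-sum encryption are simplified placeholders chosen for readability, not a secure instantiation.

```python
import numpy as np

# Toy LWE / Regev-style bit encryption. Parameters are illustrative only,
# far too small (and the error distribution too crude) for real security.
n, m, q = 16, 64, 3329   # secret dimension, number of samples, modulus
rng = np.random.default_rng(1)

# Key generation: secret s, public LWE samples (A, b = A s + e mod q).
s = rng.integers(0, q, size=n)
A = rng.integers(0, q, size=(m, n))
e = rng.integers(-2, 3, size=m)              # small noise; each |e_i| <= 2
b = (A @ s + e) % q


def encrypt(bit):
    """Encrypt one bit by summing a random subset of the LWE samples."""
    r = rng.integers(0, 2, size=m)           # random 0/1 subset-selection vector
    u = (r @ A) % q
    v = (r @ b + bit * (q // 2)) % q         # embed the bit in the high-order part
    return u, v


def decrypt(u, v):
    """Recover the bit: v - <u, s> is close to 0 for 0, close to q/2 for 1."""
    d = (v - u @ s) % q
    return int(min(d, q - d) > q // 4)


for bit in (0, 1):
    u, v = encrypt(bit)
    assert decrypt(u, v) == bit
print("toy Regev encryption round-trips both bits")
```

Decryption works because v - <u, s> equals the accumulated noise r·e when the bit is 0 and roughly q/2 plus that noise when the bit is 1, and for these parameters the noise stays well below q/4.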
Since the advent of the internet and mass communication, two public-key cryptographic algorithms have shared the monopoly of data encryption and authentication: Diffie-Hellman and RSA. However, in the last few years, progress made in quantum physics -- and more ...
Current cryptographic solutions will become obsolete with the arrival of large-scale universal quantum computers. As a result, the National Institute of Standards and Technology supervises a post-quantum standardization process which involves evaluating ca ...
Isogeny-based cryptography is an instance of post-quantum cryptography whose fundamental problem consists of finding an isogeny between two (isogenous) elliptic curves E and E′. This problem is closely related to that of computing the endomorphism ring of ...