In theoretical computer science and mathematics, the theory of computation is the branch that deals with which problems can be solved on a model of computation using an algorithm, how efficiently they can be solved, and to what degree (e.g., approximate solutions versus precise ones). The field is divided into three major branches: automata theory and formal languages, computability theory, and computational complexity theory, which are linked by the question: "What are the fundamental capabilities and limitations of computers?".
In order to perform a rigorous study of computation, computer scientists work with a mathematical abstraction of computers called a model of computation. There are several models in use, but the most commonly examined is the Turing machine. Computer scientists study the Turing machine because it is simple to formulate, can be analyzed and used to prove results, and because it represents what many consider the most powerful possible "reasonable" model of computation (see Church–Turing thesis). It might seem that the potentially infinite memory capacity is an unrealizable attribute, but any decidable problem solved by a Turing machine will always require only a finite amount of memory. So in principle, any problem that can be solved (decided) by a Turing machine can be solved by a computer that has a finite amount of memory.
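To make the model concrete, the following is a minimal sketch of a single-tape Turing machine simulator in Python; the state names and the transition table (a machine accepting strings of the form 0*1) are hypothetical choices for illustration, not a formalization taken from any particular source.

```python
# Minimal single-tape Turing machine simulator (illustrative sketch only).
def run_turing_machine(transitions, start, accept, reject, tape_input, blank="_"):
    """Simulate a deterministic Turing machine on a finite input string.

    transitions: dict mapping (state, symbol) -> (new_state, write_symbol, move),
                 where move is 'L' or 'R'.  Returns True if the machine accepts.
    """
    tape = dict(enumerate(tape_input))   # sparse tape: position -> symbol
    state, head = start, 0
    while state not in (accept, reject):
        symbol = tape.get(head, blank)
        state, write, move = transitions[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    return state == accept

# Hypothetical machine: accept strings matching 0*1 (zero or more 0s, then one 1).
delta = {
    ("q0", "0"): ("q0", "0", "R"),    # skip over leading zeros
    ("q0", "1"): ("q1", "1", "R"),    # saw the single 1
    ("q0", "_"): ("qrej", "_", "R"),
    ("q1", "_"): ("qacc", "_", "R"),  # nothing may follow the 1
    ("q1", "0"): ("qrej", "0", "R"),
    ("q1", "1"): ("qrej", "1", "R"),
}

print(run_turing_machine(delta, "q0", "qacc", "qrej", "0001"))  # True
print(run_turing_machine(delta, "q0", "qacc", "qrej", "0010"))  # False
```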
The theory of computation can be regarded as the creation of models of all kinds in the field of computer science, and it therefore draws heavily on mathematics and logic. In the 20th century it became an independent academic discipline, separate from mathematics.
Some pioneers of the theory of computation were Ramon Llull, Alonzo Church, Kurt Gödel, Alan Turing, Stephen Kleene, Rózsa Péter, John von Neumann and Claude Shannon.
Automata theory
Automata theory is the study of abstract machines (or more appropriately, abstract 'mathematical' machines or systems) and the computational problems that can be solved using these machines. These abstract machines are called automata.
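One of the simplest automata is the deterministic finite automaton (DFA). The sketch below, with arbitrarily chosen state names, recognizes binary strings containing an even number of 1s; it is an illustration of the kind of machine automata theory studies, not a construction from any particular text.

```python
# Minimal deterministic finite automaton (DFA); states and language are
# hypothetical choices for the example.
def dfa_accepts(string, transitions, start, accepting):
    state = start
    for symbol in string:
        state = transitions[(state, symbol)]
    return state in accepting

# Accepts binary strings with an even number of '1's.
delta = {
    ("even", "0"): "even", ("even", "1"): "odd",
    ("odd", "0"): "odd",   ("odd", "1"): "even",
}

print(dfa_accepts("0110", delta, "even", {"even"}))  # True  (two 1s)
print(dfa_accepts("0111", delta, "even", {"even"}))  # False (three 1s)
```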
The Bell states, or EPR pairs, are specific quantum states of two qubits that represent the simplest examples of quantum entanglement; conceptually, they fall under the study of quantum information science. The Bell states are a form of entangled and normalized basis vectors. This normalization implies that the overall probability of the particle being in one of the mentioned states is 1. Entanglement is a basis-independent result of superposition.
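As a sketch of these two properties, the snippet below builds the Bell state |Φ+⟩ = (|00⟩ + |11⟩)/√2 with NumPy, checks its normalization, and checks that it cannot factor into a product of single-qubit states; the construction is illustrative and assumes the standard computational basis.

```python
# Bell state |Phi+> = (|00> + |11>)/sqrt(2): normalization and entanglement check.
import numpy as np

ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])

# |Phi+> as a vector in the 4-dimensional two-qubit Hilbert space
phi_plus = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)

# Normalization: <Phi+|Phi+> = 1, i.e. the probabilities of all outcomes sum to 1
print(np.isclose(np.vdot(phi_plus, phi_plus), 1.0))   # True

# Entanglement: the 2x2 coefficient matrix has Schmidt rank 2 (two nonzero
# singular values), so the state is not a product of single-qubit states.
coeffs = phi_plus.reshape(2, 2)
print(np.linalg.matrix_rank(coeffs))                  # 2
```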
In quantum information theory, a quantum channel is a communication channel which can transmit quantum information, as well as classical information. An example of quantum information is the state of a qubit. An example of classical information is a text document transmitted over the Internet. More formally, quantum channels are completely positive (CP) trace-preserving maps between spaces of operators. In other words, a quantum channel is just a quantum operation viewed not merely as the reduced dynamics of a system but as a pipeline intended to carry quantum information.
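A common way to write down such a completely positive, trace-preserving map is through Kraus operators, E(ρ) = Σ_k K_k ρ K_k†. The sketch below uses the single-qubit depolarizing channel as an example; the depolarization probability is an arbitrary illustrative value.

```python
# A quantum channel as a CP trace-preserving map given by Kraus operators.
import numpy as np

I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
Z = np.array([[1, 0], [0, -1]], dtype=complex)

p = 0.1  # hypothetical depolarization probability
kraus = [np.sqrt(1 - 3 * p / 4) * I,
         np.sqrt(p / 4) * X,
         np.sqrt(p / 4) * Y,
         np.sqrt(p / 4) * Z]

def apply_channel(rho, kraus_ops):
    """E(rho) = sum_k K_k rho K_k^dagger."""
    return sum(K @ rho @ K.conj().T for K in kraus_ops)

# Trace preservation requires sum_k K_k^dagger K_k = I
completeness = sum(K.conj().T @ K for K in kraus)
print(np.allclose(completeness, np.eye(2)))   # True

# Acting on |0><0| mixes it slightly toward the maximally mixed state
rho0 = np.array([[1, 0], [0, 0]], dtype=complex)
print(np.round(apply_channel(rho0, kraus).real, 3))   # diag(0.95, 0.05) for p = 0.1
```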
In mathematics, particularly in functional analysis, a projection-valued measure (PVM) is a function defined on certain subsets of a fixed set and whose values are self-adjoint projections on a fixed Hilbert space. Projection-valued measures are formally similar to real-valued measures, except that their values are self-adjoint projections rather than real numbers. As in the case of ordinary measures, it is possible to integrate complex-valued functions with respect to a PVM; the result of such an integration is a linear operator on the given Hilbert space.
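In finite dimensions this reduces to the spectral projections of a Hermitian matrix, which gives a small, concrete illustration; the matrix below is an arbitrary example, and "integration" of the identity function against the resulting measure simply recovers the operator.

```python
# Finite-dimensional illustration of a projection-valued measure: spectral
# projections of a Hermitian matrix are self-adjoint, idempotent, sum to the
# identity, and integrating f(lambda) = lambda against them recovers the matrix.
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])          # Hermitian; eigenvalues 1 and 3

eigvals, eigvecs = np.linalg.eigh(A)

# projections[i] projects onto the eigenspace of eigvals[i]
projections = [np.outer(v, v.conj()) for v in eigvecs.T]

for P in projections:
    assert np.allclose(P, P.conj().T)   # self-adjoint
    assert np.allclose(P @ P, P)        # idempotent (a projection)

# The projections over the full spectrum sum to the identity ...
print(np.allclose(sum(projections), np.eye(2)))                          # True
# ... and integrating lambda -> lambda against the measure gives back A
print(np.allclose(sum(l * P for l, P in zip(eigvals, projections)), A))  # True
```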
A quantum computer is a computer that exploits quantum mechanical phenomena. At small scales, physical matter exhibits properties of both particles and waves, and quantum computing leverages this behavior, specifically quantum superposition and entanglement, using specialized hardware that supports the preparation and manipulation of quantum states. Classical physics cannot explain the operation of these quantum devices, and a scalable quantum computer could perform some calculations exponentially faster than any modern "classical" computer.
In theoretical computer science and mathematics, computational complexity theory focuses on classifying computational problems according to their resource usage, and relating these classes to each other. A computational problem is a task solved by a computer; it is solvable by mechanical application of mathematical steps, such as an algorithm. A problem is regarded as inherently difficult if its solution requires significant resources, whatever the algorithm used.
Computability theory, also known as recursion theory, is a branch of mathematical logic, computer science, and the theory of computation that originated in the 1930s with the study of computable functions and Turing degrees. The field has since expanded to include the study of generalized computability and definability. In these areas, computability theory overlaps with proof theory and effective descriptive set theory.
After introducing the foundations of classical and quantum information theory, and quantum measurement, the course will address the theory and practice of digital quantum computing, covering fundament ...
The course introduces the paradigm of quantum computation in an axiomatic way. We introduce the notion of quantum bit, gates, circuits and we treat the most important quantum algorithms. We also touch ...
Information is processed in physical devices. In the quantum regime the concept of classical bit is replaced by the quantum bit. We introduce quantum principles, and then quantum communications, key d ...
Active in photonics, silicon nitride and integrated circuits. Ligentec is a B2B company manufacturing advanced photonic integrated circuits with low loss and high efficiency, enabling applications in communication, quantum technologies, LiDAR, and biosensors.
Active in photonic sensing, quantum computing and fiber optic sensors. Miraex specializes in photonic sensing and quantum computing solutions, offering safe and efficient monitoring of critical assets and processes using light-based fiber optic sensors.
Adaptive signal processing, A/D and D/A. This module provides the basic tools for adaptive filtering and a solid mathematical framework for sampling and quantization.
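As a sketch of the kind of tools such a module covers, the snippet below implements a least-mean-squares (LMS) adaptive filter identifying an unknown FIR system, together with a uniform quantizer standing in for A/D conversion; the filter length, step size, and signals are arbitrary choices made for this illustration.

```python
# LMS adaptive filtering and uniform quantization (illustrative sketch).
import numpy as np

rng = np.random.default_rng(0)

# Unknown system to identify (hypothetical 4-tap FIR filter)
h_true = np.array([0.5, -0.3, 0.2, 0.1])

x = rng.standard_normal(5000)                     # input signal
d = np.convolve(x, h_true, mode="full")[:len(x)]  # desired signal (system output)

# LMS adaptation: w <- w + mu * e * x_window
mu, w = 0.01, np.zeros(4)
for n in range(len(w), len(x)):
    x_win = x[n - len(w) + 1:n + 1][::-1]         # most recent samples first
    e = d[n] - w @ x_win
    w = w + mu * e * x_win

print(np.round(w, 3))    # approaches h_true

# Uniform mid-rise quantizer with b bits over [lo, hi), as in A/D conversion
def quantize(signal, bits=4, lo=-1.0, hi=1.0):
    levels = 2 ** bits
    step = (hi - lo) / levels
    idx = np.floor((np.clip(signal, lo, hi - 1e-12) - lo) / step)
    return lo + step * (idx + 0.5)

print(quantize(np.array([-0.9, 0.0, 0.7]), bits=3))   # [-0.875, 0.125, 0.625]
```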
Covers the concept of quantum computation delegation and the relationship between MIP and RE, addressing common FAQs and discussing helpful materials and interactions with quantum devices.
In this thesis, we give new protocols that offer a quantum advantage for problems in ML, Physics, and Finance. Quantum mechanics gives predictions that are inconsistent with local realism. The experiment proving this fact (Bell, 1964) gives a quantum protoco ...
Quantum computers have the potential to surpass conventional computing, but they are hindered by noise, which induces errors that ultimately lead to the loss of quantum information. This necessitates the development of quantum error correction strategies fo ...
EPFL, 2024
Quantum computing not only holds the potential to solve long-standing problems in quantum physics, but also to offer speed-ups across a broad spectrum of other fields. Access to a computational space that incorporates quantum effects, such as superposition ...