Quantum indeterminacy is the apparent necessary incompleteness in the description of a physical system, which has become one of the characteristics of the standard description of quantum physics. Prior to quantum physics, it was thought that a physical system had a determinate state that uniquely determined all the values of its measurable properties, and conversely, that the values of its measurable properties uniquely determined the state.
Quantum indeterminacy can be quantitatively characterized by a probability distribution on the set of outcomes of measurements of an observable. The distribution is uniquely determined by the system state, and moreover quantum mechanics provides a recipe for calculating this probability distribution.
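That recipe is the Born rule: diagonalize the observable, and the probability of each eigenvalue is the squared magnitude of the state's overlap with the corresponding eigenvector. A minimal numerical sketch (the state and observable below are arbitrary illustrative choices, not tied to any particular experiment):

```python
import numpy as np

# State: |0>, a unit vector in C^2 (an illustrative choice)
psi = np.array([1, 0], dtype=complex)

# Observable: the Pauli-X matrix, a self-adjoint (Hermitian) operator
A = np.array([[0, 1], [1, 0]], dtype=complex)

# Born rule: the possible outcomes are the eigenvalues of A, and
# P(outcome k) = |<v_k|psi>|^2 for the corresponding eigenvector v_k
eigvals, eigvecs = np.linalg.eigh(A)
probs = np.abs(eigvecs.conj().T @ psi) ** 2

assert np.isclose(probs.sum(), 1.0)  # the distribution is normalized
for outcome, p in zip(eigvals, probs):
    print(f"outcome {outcome:+.0f}: probability {p:.2f}")
```

Here the outcome is maximally indeterminate: both eigenvalues of X occur with probability 1/2, even though the state itself is known exactly.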
Indeterminacy in measurement was not an innovation of quantum mechanics, since it had been established early on by experimentalists that errors in measurement may lead to indeterminate outcomes. By the latter half of the 18th century, measurement errors were well understood, and it was known that they could either be reduced by better equipment or accounted for by statistical error models. In quantum mechanics, however, indeterminacy is of a much more fundamental nature, having nothing to do with errors or disturbance.
An adequate account of quantum indeterminacy requires a theory of measurement. Many theories have been proposed since the beginning of quantum mechanics and quantum measurement continues to be an active research area in both theoretical and experimental physics. Possibly the first systematic attempt at a mathematical theory was developed by John von Neumann. The kinds of measurements he investigated are now called projective measurements. That theory was based in turn on the theory of projection-valued measures for self-adjoint operators which had been recently developed (by von Neumann and independently by Marshall Stone) and the Hilbert space formulation of quantum mechanics (attributed by von Neumann to Paul Dirac).
In this formulation, the state of a physical system corresponds to a vector of length 1 in a Hilbert space H over the complex numbers. An observable is represented by a self-adjoint (i.e. Hermitian) operator A on H.
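In this picture a projective (von Neumann) measurement can be simulated directly: group the eigenvectors of A into projectors onto its eigenspaces, sample an outcome with the Born-rule weights, and renormalize the projected vector as the post-measurement state. A sketch under the same illustrative numpy conventions as above:

```python
import numpy as np

rng = np.random.default_rng(0)

def projective_measurement(A, psi):
    """Sample one projective measurement of observable A on state psi.

    Returns the observed eigenvalue and the collapsed post-measurement state.
    """
    eigvals, eigvecs = np.linalg.eigh(A)
    # Group eigenvectors into projectors P_k onto each distinct eigenspace
    outcomes = np.unique(np.round(eigvals, 10))
    projectors = []
    for lam in outcomes:
        V = eigvecs[:, np.isclose(eigvals, lam)]
        projectors.append(V @ V.conj().T)
    # Born rule: p_k = <psi| P_k |psi>
    probs = np.real([psi.conj() @ P @ psi for P in projectors])
    k = rng.choice(len(outcomes), p=probs / probs.sum())
    # Collapse: psi -> P_k psi / ||P_k psi||
    post = projectors[k] @ psi
    return outcomes[k], post / np.linalg.norm(post)

# Example: measure Pauli-Z on an equal superposition (illustrative choice)
Z = np.diag([1.0, -1.0]).astype(complex)
psi = np.array([1, 1], dtype=complex) / np.sqrt(2)
outcome, post = projective_measurement(Z, psi)
print(outcome, post)
```

Repeating the call yields +1 or -1 at random with equal frequency, and the returned state is the corresponding basis vector, which is the collapse behavior von Neumann's projective scheme describes.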
In physics, hidden-variable theories are proposals to provide explanations of quantum mechanical phenomena through the introduction of (possibly unobservable) hypothetical entities. The existence of fundamental indeterminacy for some measurements is assumed as part of the mathematical formulation of quantum mechanics; moreover, bounds for indeterminacy can be expressed in a quantitative form by the Heisenberg uncertainty principle.
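The quantitative form referred to here is, in Robertson's generalization, the bound σ_A σ_B ≥ ½ |⟨ψ|[A, B]|ψ⟩| for any state ψ and observables A, B. A quick numerical check of this bound, using Pauli matrices and random states as illustrative choices:

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)

def stddev(A, psi):
    """Standard deviation of observable A in state psi."""
    mean = np.real(psi.conj() @ A @ psi)
    mean_sq = np.real(psi.conj() @ (A @ A) @ psi)
    return np.sqrt(mean_sq - mean**2)

rng = np.random.default_rng(1)
for _ in range(5):
    # Random normalized state in C^2
    v = rng.normal(size=2) + 1j * rng.normal(size=2)
    psi = v / np.linalg.norm(v)
    lhs = stddev(X, psi) * stddev(Y, psi)
    comm = X @ Y - Y @ X  # commutator [X, Y]
    rhs = 0.5 * abs(psi.conj() @ comm @ psi)
    assert lhs >= rhs - 1e-12  # Robertson bound holds
    print(f"{lhs:.4f} >= {rhs:.4f}")
```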
A Bell test, also known as Bell inequality test or Bell experiment, is a real-world physics experiment designed to test the theory of quantum mechanics in relation to Albert Einstein's concept of local realism. Named for John Stewart Bell, the experiments test whether or not the real world satisfies local realism, which requires the presence of some additional local variables (called "hidden" because they are not a feature of quantum theory) to explain the behavior of particles like photons and electrons.
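The canonical example is the CHSH form of Bell's inequality: any local hidden-variable model obeys |S| ≤ 2, while quantum mechanics predicts |S| = 2√2 for a spin singlet at suitably chosen measurement angles. A sketch of that quantum prediction (the state, angles, and correlator construction are the standard textbook choices):

```python
import numpy as np

Z = np.array([[1, 0], [0, -1]], dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)

def spin(theta):
    """Spin observable along an axis at angle theta in the x-z plane."""
    return np.cos(theta) * Z + np.sin(theta) * X

# Singlet state (|01> - |10>) / sqrt(2)
psi = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def E(a, b):
    """Correlator <psi| A(a) (x) B(b) |psi> for the two particles."""
    return np.real(psi.conj() @ np.kron(spin(a), spin(b)) @ psi)

# CHSH combination at the angles that maximize the quantum violation
a, ap, b, bp = 0, np.pi / 2, np.pi / 4, -np.pi / 4
S = E(a, b) + E(ap, b) + E(a, bp) - E(ap, bp)
print(abs(S))  # ~2.828 > 2: violates the local-realist CHSH bound
```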
Theoretical physics is a branch of physics that employs mathematical models and abstractions of physical objects and systems to rationalize, explain and predict natural phenomena. This is in contrast to experimental physics, which uses experimental tools to probe these phenomena. The advancement of science generally depends on the interplay between experimental studies and theory. In some cases, theoretical physics adheres to standards of mathematical rigour while giving little weight to experiments and observations.
This lecture describes advanced concepts and applications of quantum optics. It emphasizes the connection with ongoing research and with the fast-growing field of quantum technologies.
Information is processed in physical devices. In the quantum regime the concept of the classical bit is replaced by the quantum bit. We introduce quantum principles, and then quantum communications and key distribution.
Quantum optics studies how photons interact with other forms of matter, the understanding of which was crucial for the development of quantum mechanics as a whole. Starting from the photoelectric effect, the quantum property of light has led to the development of ...
Randomized measurement protocols such as classical shadows represent powerful resources for quantum technologies, with applications ranging from quantum state characterization and process tomography to machine learning and error mitigation.
The sequence-dependent statistical mechanics of double-stranded nucleic acid, or dsNA, is believed to be essential to its biological functions. In turn, the equilibrium statistical mechanics behaviour of dsNA depends strongly both on sequence-dependent ...