In probability theory, the theory of large deviations concerns the asymptotic behaviour of remote tails of sequences of probability distributions. While some basic ideas of the theory can be traced to Laplace, the formalization started with insurance mathematics, namely ruin theory with Cramér and Lundberg. A unified formalization of large deviation theory was developed in 1966, in a paper by Varadhan. Large deviations theory formalizes the heuristic ideas of concentration of measures and widely generalizes the notion of convergence of probability measures.
Roughly speaking, large deviations theory concerns itself with the exponential decline of the probability measures of certain kinds of extreme or tail events.
Consider a sequence of independent tosses of a fair coin. The possible outcomes could be heads or tails. Let us denote the possible outcome of the i-th trial by $X_i$, where we encode head as 1 and tail as 0. Now let $M_N$ denote the mean value after $N$ trials, namely
$$ M_N = \frac{1}{N} \sum_{i=1}^{N} X_i . $$
Then $M_N$ lies between 0 and 1. From the law of large numbers it follows that as $N$ grows, the distribution of $M_N$ concentrates at $1/2$ (the expected value of a single coin toss).
Moreover, by the central limit theorem, it follows that $M_N$ is approximately normally distributed for large $N$. The central limit theorem can provide more detailed information about the behavior of $M_N$ than the law of large numbers. For example, it lets us approximate a tail probability of $M_N$, namely $P(M_N > x)$, the probability that $M_N$ is greater than $x$, for a fixed value of $N$. However, the approximation by the central limit theorem may not be accurate if $x$ is far from the mean $1/2$ unless $N$ is sufficiently large. Also, it does not provide information about the convergence of the tail probabilities as $N \to \infty$. Large deviations theory can provide answers for such problems.
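As an illustrative sketch (not part of the original text), the following Python snippet compares the exact binomial tail probability $P(M_N > x)$ for a fair coin with the normal approximation suggested by the central limit theorem; the function names exact_tail and clt_tail are chosen here for illustration. The comparison shows how the normal approximation degrades when $x$ is far from $1/2$.

```python
# Sketch: exact fair-coin tail P(M_N > x) vs. the CLT (normal) approximation.
# Uses only the Python standard library.
from math import comb, erfc, sqrt, floor

def exact_tail(N: int, x: float) -> float:
    """P(M_N > x) computed exactly from the binomial distribution."""
    threshold = floor(N * x)          # M_N > x  <=>  number of heads > N*x
    return sum(comb(N, k) for k in range(threshold + 1, N + 1)) / 2**N

def clt_tail(N: int, x: float) -> float:
    """Normal approximation: M_N is roughly Normal(1/2, 1/(4N))."""
    z = (x - 0.5) / (0.5 / sqrt(N))
    return 0.5 * erfc(z / sqrt(2))    # P(Z > z) for a standard normal Z

for x in (0.55, 0.7, 0.9):
    for N in (50, 200, 1000):
        print(f"x={x:4.2f} N={N:5d}  exact={exact_tail(N, x):.3e}  CLT={clt_tail(N, x):.3e}")
```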
Let us make this statement more precise. For a given value $1/2 < x < 1$, let us compute the tail probability $P(M_N > x)$. Define
$$ I(x) = x \ln x + (1 - x) \ln(1 - x) + \ln 2 . $$
Then, by Chernoff's inequality, it can be shown that $P(M_N > x) \le \exp(-N\, I(x))$.
Note that the function $I(x)$ is a convex, nonnegative function that is zero at $x = 1/2$ and increases as $x$ approaches $1$.
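The following Python sketch (again not part of the original text) evaluates this rate function and checks numerically, for a fixed $x$, that the exact tail probability stays below the Chernoff bound $\exp(-N I(x))$ and that $-\frac{1}{N} \ln P(M_N > x)$ approaches $I(x)$ as $N$ grows; the function names rate and exact_tail are ours.

```python
# Sketch: rate function I(x) = x ln x + (1-x) ln(1-x) + ln 2 for the fair coin,
# with a numerical check of the Chernoff bound and the exponential decay rate.
from math import comb, log, exp, floor

def rate(x: float) -> float:
    """Rate function of the fair-coin sample mean, for 0 < x < 1."""
    return x * log(x) + (1 - x) * log(1 - x) + log(2)

def exact_tail(N: int, x: float) -> float:
    """P(M_N > x), exact binomial tail for a fair coin."""
    threshold = floor(N * x)
    return sum(comb(N, k) for k in range(threshold + 1, N + 1)) / 2**N

x = 0.6
print(f"I({x}) = {rate(x):.6f}")
for N in (100, 1000, 5000):
    p = exact_tail(N, x)
    print(f"N={N:5d}  P={p:.3e}  Chernoff bound={exp(-N * rate(x)):.3e}  "
          f"-(1/N) ln P = {-log(p) / N:.6f}")
```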
In mathematics, specifically in large deviations theory, a rate function is a function used to quantify the probabilities of rare events. Such functions are used to formulate the large deviation principle, which quantifies the asymptotic probability of rare events for a sequence of probability measures. A rate function is also called a Cramér function, after the Swedish probabilist Harald Cramér. Formally, an extended real-valued function I : X → [0, +∞] defined on a Hausdorff topological space X is said to be a rate function if it is not identically +∞ and is lower semi-continuous, i.e. all the sublevel sets { x ∈ X : I(x) ≤ c } are closed in X.
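For reference, a standard textbook formulation of the large deviation principle that such a rate function enters (with speed $N$) can be stated as follows; the notation $\mu_N$ for the sequence of probability measures is chosen here for illustration.

```latex
% Standard statement of the large deviation principle with speed N.
A sequence of probability measures $(\mu_N)_{N \ge 1}$ on $X$ satisfies the
large deviation principle with rate function $I$ if, for every closed set
$F \subseteq X$ and every open set $G \subseteq X$,
\[
  \limsup_{N \to \infty} \frac{1}{N} \log \mu_N(F) \le - \inf_{x \in F} I(x),
  \qquad
  \liminf_{N \to \infty} \frac{1}{N} \log \mu_N(G) \ge - \inf_{x \in G} I(x).
\]
```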
Explores the Large Deviations Principle, focusing on exponential tail decay and Laplace transform analysis.
Covers the Likelihood Ratio Test and hypothesis testing methods using Maximum Likelihood Estimators.
Explores quantum mechanics, emphasizing particles, interactions, spin, and wave functions.
In this work we consider solutions to stochastic partial differential equations with transport noise, which are known to converge, in a suitable scaling limit, to solutions of the corresponding deterministic PDE with an additional viscosity term. Large deviations ...
In this paper we investigate pointed (q, g, n)-Boltzmann loop-decorated maps with loops traversing only inner triangular faces. Using the peeling exploration of Budd (2018), modified to this setting, we show that its law in the non-generic critical phase can be cod ...
JET experiments using the fuel mixture envisaged for fusion power plants, deuterium and tritium (D-T), provide a unique opportunity to validate existing D-T fusion power prediction capabilities in support of future device design and operation preparation. ...