Summary
A Brownian bridge is a continuous-time stochastic process B(t) whose probability distribution is the conditional probability distribution of a standard Wiener process W(t) (a mathematical model of Brownian motion) subject to the condition (when standardized) that W(T) = 0, so that the process is pinned to the same value at both t = 0 and t = T. More precisely: the expected value of the bridge at any t in the interval [0, T] is zero, with variance t(T − t)/T, implying that the uncertainty is largest in the middle of the bridge, with zero uncertainty at the two pinned endpoints. The covariance of B(s) and B(t) is min(s, t) − st/T, or s(T − t)/T if s < t. The increments in a Brownian bridge are not independent.

If W(t) is a standard Wiener process (i.e., for t ≥ 0, W(t) is normally distributed with expected value 0 and variance t, and the increments are stationary and independent), then

B(t) = W(t) − (t/T) W(T)

is a Brownian bridge for t ∈ [0, T]. It is independent of W(T).

Conversely, if B(t) is a Brownian bridge on [0, 1] and Z is a standard normal random variable independent of B, then the process

W(t) = B(t) + tZ

is a Wiener process for t ∈ [0, 1]. More generally, a Wiener process W(t) for t ∈ [0, T] can be decomposed into

W(t) = B(t) + (t/T) W(T),

where B(t) is a Brownian bridge on [0, T] independent of W(T).

Another representation of the Brownian bridge based on the Brownian motion is, for t ∈ [0, T),

B(t) = ((T − t)/√T) W(t/(T − t)).

Conversely (taking T = 1), for t ∈ [0, ∞),

W(t) = (1 + t) B(t/(1 + t)).

The Brownian bridge may also be represented as a Fourier series with stochastic coefficients, as

B(t) = Σ_{k=1}^∞ Z_k √(2T) sin(kπt/T)/(kπ),

where Z_1, Z_2, ... are independent identically distributed standard normal random variables (see the Karhunen–Loève theorem).

A Brownian bridge arises as the limit process in Donsker's theorem in the area of empirical processes. It is also used in the Kolmogorov–Smirnov test in the area of statistical inference.

A standard Wiener process satisfies W(0) = 0 and is therefore "tied down" to the origin, but other points are not restricted. In a Brownian bridge process, on the other hand, not only is B(0) = 0 but we also require that B(T) = 0; that is, the process is "tied down" at t = T as well.
Just as a literal bridge is supported by pylons at both ends, a Brownian bridge is required to satisfy conditions at both ends of the interval [0, T].
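The pinned construction B(t) = W(t) − (t/T)W(T) and the variance formula t(T − t)/T can be checked by Monte Carlo simulation. A minimal sketch, assuming NumPy (the grid size and sample count are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)

T = 2.0          # endpoint of the interval [0, T]
n_steps = 400    # time discretisation
n_paths = 20000  # Monte Carlo sample size

dt = T / n_steps
t = np.linspace(0.0, T, n_steps + 1)

# Simulate standard Wiener paths: W(0) = 0, increments ~ N(0, dt).
dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
W = np.concatenate([np.zeros((n_paths, 1)), np.cumsum(dW, axis=1)], axis=1)

# Pin each path at both ends: B(t) = W(t) - (t / T) * W(T).
B = W - (t / T) * W[:, -1:]

# Both endpoints are exactly zero, and the empirical variance at the
# midpoint should be close to t(T - t)/T = T/4 = 0.5.
mid = n_steps // 2
print(B[:, 0].max(), B[:, -1].max())
print(B[:, mid].var())
```

The subtraction of (t/T)W(T) is what "ties down" the right end: at t = T the correction equals W(T) itself, forcing B(T) = 0 on every path.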
Related concepts (6)
Donsker's theorem
In probability theory, Donsker's theorem (also known as Donsker's invariance principle, or the functional central limit theorem), named after Monroe D. Donsker, is a functional extension of the central limit theorem. Let X_1, X_2, X_3, ... be a sequence of independent and identically distributed (i.i.d.) random variables with mean 0 and variance 1. Let S_n = X_1 + ... + X_n. The stochastic process S = (S_n)_{n ≥ 1} is known as a random walk. Define the diffusively rescaled random walk (partial-sum process) by W^(n)(t) = S_⌊nt⌋/√n, for t ∈ [0, 1]. The central limit theorem asserts that W^(n)(1) = S_n/√n converges in distribution to a standard Gaussian random variable as n → ∞.
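The endpoint case of the rescaled walk can be illustrated numerically. A sketch, assuming NumPy and using ±1 (Rademacher) steps, which have mean 0 and variance 1:

```python
import numpy as np

rng = np.random.default_rng(1)

n = 10000        # number of walk steps
n_paths = 5000   # Monte Carlo sample size

# i.i.d. steps with mean 0 and variance 1.
X = rng.choice([-1.0, 1.0], size=(n_paths, n))
S = np.cumsum(X, axis=1)

# The diffusively rescaled process evaluated at t = 1 is S_n / sqrt(n),
# exactly the CLT-rescaled sum; it should look standard normal.
W_n_at_1 = S[:, -1] / np.sqrt(n)

print(W_n_at_1.mean(), W_n_at_1.var())
```

Donsker's theorem strengthens this pointwise statement: the whole rescaled path (S_⌊nt⌋/√n for t ∈ [0, 1]) converges in distribution to a Brownian motion, not just its value at t = 1.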
Empirical distribution function
In statistics, an empirical distribution function (commonly also called an empirical cumulative distribution function, eCDF) is the distribution function associated with the empirical measure of a sample. This cumulative distribution function is a step function that jumps up by 1/n at each of the n data points. Its value at any specified value of the measured variable is the fraction of observations of the measured variable that are less than or equal to the specified value.
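The definition translates directly into code: the eCDF at x is the fraction of sample points at or below x. A minimal sketch, assuming NumPy (the function name and sample data are illustrative):

```python
import numpy as np

def ecdf(sample, x):
    """Fraction of observations <= x: the empirical distribution function."""
    sample = np.asarray(sample)
    return np.mean(sample <= x)

data = [3.0, 1.0, 4.0, 1.0, 5.0]
print(ecdf(data, 0.5))  # 0.0: no observations at or below 0.5
print(ecdf(data, 1.0))  # 0.4: two of five observations are <= 1
print(ecdf(data, 5.0))  # 1.0: all five observations are <= 5
```

Note the jump of 2/5 at x = 1: a value repeated k times contributes a single step of height k/n.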
Gaussian process
In probability theory and statistics, a Gaussian process is a stochastic process (a collection of random variables indexed by time or space), such that every finite collection of those random variables has a multivariate normal distribution, i.e. every finite linear combination of them is normally distributed. The distribution of a Gaussian process is the joint distribution of all those (infinitely many) random variables, and as such, it is a distribution over functions with a continuous domain, e.g. time or space.
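The Brownian bridge is itself a Gaussian process: on [0, 1] it has mean zero and covariance min(s, t) − st. Because any finite collection of its values is multivariate normal, sampling it on a grid reduces to one draw from N(0, K). A sketch, assuming NumPy (grid and sample size are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)

# Grid of times strictly inside (0, 1); the endpoints are pinned at 0.
t = np.linspace(0.05, 0.95, 19)

# Brownian bridge covariance on [0, 1]: Cov(B(s), B(t)) = min(s, t) - s * t.
K = np.minimum.outer(t, t) - np.outer(t, t)

# One multivariate-normal draw per path gives the finite-dimensional law.
n_paths = 50000
samples = rng.multivariate_normal(np.zeros(len(t)), K, size=n_paths)

# Empirical variance at each time should be close to t(1 - t).
print(np.abs(samples.var(axis=0) - t * (1 - t)).max())
```

This covariance-based route needs no underlying Wiener path; specifying the mean and covariance functions determines the Gaussian process completely.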