
Concept: Stationary distribution

Summary

Stationary distribution may refer to:

- A special distribution for a Markov chain such that, if the chain starts in its stationary distribution, the marginal distribution of the state at every later time remains the stationary distribution. Assuming irreducibility, the stationary distribution is unique whenever it exists, and positive recurrence of all states implies its existence. When the chain is irreducible and aperiodic, the stationary distribution is also the limiting distribution of the chain.
- The marginal distribution of a stationary process or stationary time series.
- The set of joint probability distributions of a stationary process or stationary time series.

In some fields of application, the term stable distribution is used for the equivalent of a stationary (marginal) distribution, although in probability and statistics that term has a rather different meaning: see stable distribution.

Crudely stated, all of the above are specific cases of a common general concept: a stationary distribution is an entity left unchanged by the action of some matrix or operator, and it need not be unique. Stationary distributions are therefore related to eigenvectors whose eigenvalue is one.
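The eigenvector connection can be made concrete for a finite-state Markov chain. The sketch below uses a made-up 3x3 transition matrix (any irreducible, aperiodic chain would do) and recovers the stationary distribution two ways: by power iteration, and as a left eigenvector of the transition matrix with eigenvalue one.

```python
import numpy as np

# Hypothetical 3-state transition matrix (each row sums to 1); the
# specific numbers are only an illustration, not from the text above.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.2, 0.6, 0.2],
    [0.1, 0.3, 0.6],
])

# Method 1: power iteration. Start from any distribution and apply P
# repeatedly; for an irreducible, aperiodic chain this converges to pi.
pi = np.array([1.0, 0.0, 0.0])
for _ in range(1000):
    pi = pi @ P

# Method 2: pi is a left eigenvector of P with eigenvalue 1, i.e. an
# ordinary eigenvector of P transposed, normalized to sum to 1.
vals, vecs = np.linalg.eig(P.T)
v = np.real(vecs[:, np.argmax(np.real(vals))])
pi_eig = v / v.sum()

print(np.allclose(pi, pi_eig))   # both methods find the same pi
print(np.allclose(pi, pi @ P))   # pi is unchanged by the chain: pi P = pi
```

Both checks confirm the defining property: the distribution is a fixed point of the transition operator.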

Official source

This page is automatically generated and may contain information that is not correct, complete, up-to-date, or relevant to your search query. The same applies to every other page on this website. Please make sure to verify the information with EPFL's official sources.


Related courses (4)

MATH-332: Stochastic processes

The course follows the text of Norris and the polycopié (lecture notes), which will be distributed chapter by chapter.

COM-300: Stochastic models in communication

The objective of this course is to master the tools of stochastic processes useful to an engineer working in the fields of communication systems, data science and …

COM-516: Markov chains and algorithmic applications

The study of random walks finds many applications in computer science and communications. The goal of the course is to get familiar with the theory of random walks and to get an overview of some applications.

Related lectures (45)

Ergodic Theory: Markov Chains

Explores ergodic theory in Markov chains, discussing irreducibility and unique stationary distributions.

Continuous-Time Markov Chains: Asymptotic Behavior

Explores the asymptotic behavior of continuous-time Markov chains and their convergence properties.

Coupling of Markov Chains: Ergodic Theorem

Explores the coupling of Markov chains and the proof of the ergodic theorem, emphasizing distribution convergence and chain properties.

Related concepts (2)

Stationary distribution

A special distribution for a Markov chain such that, if the chain starts in its stationary distribution, the marginal distribution of the state at every later time remains the stationary distribution (see the summary above).

Markov chain

A Markov chain or Markov process is a stochastic model describing a sequence of possible events in which the probability of each event depends only on the state attained in the previous event. Informally, this may be thought of as, "What happens next depends only on the state of affairs now." A countably infinite sequence, in which the chain moves state at discrete time steps, gives a discrete-time Markov chain (DTMC). A continuous-time process is called a continuous-time Markov chain (CTMC).
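The Markov property ("what happens next depends only on the state of affairs now") is easy to see in a minimal DTMC simulator. The two-state weather chain below is a hypothetical illustration; only the current state is ever consulted when sampling the next one.

```python
import random

# Hypothetical two-state chain: each state maps to (next_state, probability)
# pairs whose probabilities sum to 1. The numbers are made up.
transitions = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state):
    """Sample the next state using only the current state (Markov property)."""
    r = random.random()
    acc = 0.0
    for nxt, p in transitions[state]:
        acc += p
        if r < acc:
            return nxt
    return nxt  # guard against floating-point rounding at the boundary

def run(state, n):
    """Simulate n steps of the discrete-time chain, returning the path."""
    path = [state]
    for _ in range(n):
        state = step(state)
        path.append(state)
    return path

random.seed(0)
print(run("sunny", 5))
```

Over a long run, the fraction of time spent in each state of such an irreducible, aperiodic chain approaches its stationary distribution.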
