In probability theory, regular conditional probability is a concept that formalizes the notion of conditioning on the outcome of a random variable. The resulting conditional probability distribution is a parametrized family of probability measures called a Markov kernel.
Consider two random variables $X, Y$. The conditional probability distribution of $Y$ given $X$ is a two-variable function
$\kappa_{Y\mid X} \colon \mathbb{R} \times \mathcal{B}(\mathbb{R}) \to [0,1] .$
If the random variable $X$ is discrete,
$\kappa_{Y\mid X}(x, A) = P(Y \in A \mid X = x) = \begin{cases} \dfrac{P(Y \in A,\, X = x)}{P(X = x)} & \text{if } P(X = x) > 0 \\ \text{arbitrary value} & \text{otherwise.} \end{cases}$
If the random variables $X, Y$ are continuous with joint density $f_{X,Y}(x, y)$,
$\kappa_{Y\mid X}(x, A) = \begin{cases} \dfrac{\int_A f_{X,Y}(x, y)\, \mathrm{d}y}{f_X(x)} & \text{if } f_X(x) > 0 \\ \text{arbitrary value} & \text{otherwise,} \end{cases}$
where $f_X(x) = \int_{\mathbb{R}} f_{X,Y}(x, y)\, \mathrm{d}y$ is the marginal density of $X$.
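As a concrete numerical illustration of the continuous formula (not part of the definition itself), the sketch below assumes that $(X, Y)$ is standard bivariate normal with correlation $\rho$, so that the conditional law of $Y$ given $X = x$ is $N(\rho x,\, 1 - \rho^2)$, and compares the quadrature value of $\kappa_{Y\mid X}(x, A)$ for an interval $A = [a, b]$ with that closed form. The model, the interval, and the variable names are illustrative assumptions.

```python
# Illustrative sketch: evaluating kappa_{Y|X}(x, A) for a continuous pair (X, Y).
# Assumed model (not from the text): (X, Y) standard bivariate normal with correlation rho,
# so Y | X = x is N(rho * x, 1 - rho**2), which serves as a check.
import numpy as np
from scipy import stats
from scipy.integrate import quad

rho = 0.6
cov = np.array([[1.0, rho], [rho, 1.0]])
joint = stats.multivariate_normal(mean=[0.0, 0.0], cov=cov)   # assumed joint law of (X, Y)

def kappa(x, a, b):
    """kappa_{Y|X}(x, A) for A = [a, b]: integrate the joint density over A in y,
    then divide by the marginal density f_X(x)."""
    numerator, _ = quad(lambda y: joint.pdf([x, y]), a, b)
    return numerator / stats.norm(0.0, 1.0).pdf(x)            # marginal of X is N(0, 1)

# Closed-form check: Y | X = x is N(rho * x, 1 - rho**2) for this joint law.
x, a, b = 0.5, -1.0, 1.0
cond = stats.norm(rho * x, np.sqrt(1.0 - rho**2))
print(kappa(x, a, b), cond.cdf(b) - cond.cdf(a))   # the two values should agree closely
```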
A more general definition can be given in terms of conditional expectation. Consider a function $e_{Y \in A} \colon \mathbb{R} \to [0,1]$ satisfying
$e_{Y \in A}(X(\omega)) = P(Y \in A \mid X)(\omega)$
for almost all $\omega$.
Then the conditional probability distribution is given by
$\kappa_{Y\mid X}(x, A) = e_{Y \in A}(x) .$
As with conditional expectation, this can be further generalized to conditioning on a sigma-algebra $\mathcal{F}$. In that case the conditional distribution is a function $\Omega \times \mathcal{B}(\mathbb{R}) \to [0,1]$:
$\kappa_{Y\mid\mathcal{F}}(\omega, A) = E[1_{Y \in A} \mid \mathcal{F}](\omega) .$
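The defining property of $e_{Y \in A}$ is that of a conditional expectation of the indicator $1_{Y \in A}$: for every Borel set $B$, $E[\,e_{Y \in A}(X)\, 1_{X \in B}\,] = P(Y \in A,\, X \in B)$. The sketch below is a Monte Carlo illustration of this identity, assuming the same bivariate-normal pair as above and estimating $e_{Y \in A}$ crudely by binning $X$; it is an illustration only, not the construction used in the general theory.

```python
# Illustrative Monte Carlo check of the defining property
#   E[ e_{Y in A}(X) * 1_{X in B} ] = P(Y in A, X in B)
# using a crude, binned estimate of e_{Y in A}.  The bivariate-normal model and
# the sets A = [-1, 1], B = [0, inf) are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
rho, n = 0.6, 1_000_000

x = rng.standard_normal(n)
y = rho * x + np.sqrt(1.0 - rho**2) * rng.standard_normal(n)

in_A = (y >= -1.0) & (y <= 1.0)
in_B = x >= 0.0

# e_{Y in A}(x): estimate P(Y in A | X = x) on a grid of X-bins.
bins = np.linspace(-4.0, 4.0, 81)
idx = np.clip(np.digitize(x, bins) - 1, 0, len(bins) - 2)
e_hat = np.array([in_A[idx == k].mean() if np.any(idx == k) else 0.0
                  for k in range(len(bins) - 1)])

lhs = np.mean(e_hat[idx] * in_B)   # E[ e_{Y in A}(X) 1_{X in B} ]
rhs = np.mean(in_A & in_B)         # P(Y in A, X in B)
print(lhs, rhs)                    # the two estimates should be close
```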
For working with $\kappa_{Y\mid\mathcal{F}}$, it is important that it be regular, that is:
For almost all $\omega$, $A \mapsto \kappa_{Y\mid\mathcal{F}}(\omega, A)$ is a probability measure.
For all $A$, $\omega \mapsto \kappa_{Y\mid\mathcal{F}}(\omega, A)$ is a measurable function.
In other words, $\kappa_{Y\mid\mathcal{F}}$ is a Markov kernel.
The second condition holds trivially, but the proof of the first is more involved. It can be shown that if $Y$ is a random element in a Radon space $S$, there exists a $\kappa_{Y\mid\mathcal{F}}$ that satisfies the first condition. It is possible to construct more general spaces where a regular conditional probability distribution does not exist.
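On a finite sample space both conditions can be exhibited directly. The following sketch is a toy illustration (the uniform die, the partition generating $\mathcal{F}$, and $Y(\omega) = \omega$ are assumptions made for the example): it builds $\kappa_{Y\mid\mathcal{F}}(\omega, A) = P(A \cap B(\omega)) / P(B(\omega))$, where $B(\omega)$ is the partition block containing $\omega$, and checks that each $\kappa(\omega, \cdot)$ is a probability measure and that $\omega \mapsto \kappa(\omega, A)$ is constant on blocks, hence $\mathcal{F}$-measurable.

```python
# Finite illustration of a regular conditional probability / Markov kernel.
# Omega = {1,...,6} with the uniform measure, F generated by the partition
# {odd, even}, and Y(omega) = omega.  All modelling choices are illustrative.
from fractions import Fraction

omega = range(1, 7)
P = {w: Fraction(1, 6) for w in omega}                    # uniform measure on Omega
blocks = [frozenset({1, 3, 5}), frozenset({2, 4, 6})]     # partition generating F
block_of = {w: b for b in blocks for w in b}

def kappa(w, A):
    """kappa_{Y|F}(w, A) = P(A ∩ B(w)) / P(B(w)), with B(w) the block containing w."""
    B = block_of[w]
    return sum(P[u] for u in B if u in A) / sum(P[u] for u in B)

A = {1, 2, 3}
# Condition 1: for each omega, kappa(omega, .) is a probability measure.
assert all(kappa(w, set(omega)) == 1 for w in omega)
# Condition 2: for each A, omega -> kappa(omega, A) is F-measurable,
# i.e. constant on the blocks of the partition.
assert all(len({kappa(w, A) for w in b}) == 1 for b in blocks)
print({w: kappa(w, A) for w in omega})   # 2/3 on the odd outcomes, 1/3 on the even ones
```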
For discrete and continuous random variables, the conditional expectation can be expressed as
$E[Y \mid X = x] = \sum_{y} y \, P(Y = y \mid X = x)$
$E[Y \mid X = x] = \int y \, f_{Y\mid X}(x, y)\, \mathrm{d}y ,$
where $f_{Y\mid X}(x, y) = \dfrac{f_{X,Y}(x, y)}{f_X(x)}$ is the conditional density of $Y$ given $X$.
This result can be extended to measure-theoretic conditional expectation using the regular conditional probability distribution:
$E[Y \mid \mathcal{F}](\omega) = \int y \, \kappa_{Y\mid\mathcal{F}}(\omega, \mathrm{d}y) .$
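Continuing with the illustrative bivariate-normal pair assumed above, the sketch below evaluates $E[Y \mid X = x] = \int y\, f_{Y\mid X}(x, y)\,\mathrm{d}y$ by quadrature and compares it with the closed-form conditional mean $\rho x$.

```python
# Illustrative check of E[Y | X = x] = ∫ y f_{Y|X}(x, y) dy for the assumed
# standard bivariate normal with correlation rho: the closed form is rho * x.
import numpy as np
from scipy import stats
from scipy.integrate import quad

rho, x = 0.6, 0.5
cov = np.array([[1.0, rho], [rho, 1.0]])
joint = stats.multivariate_normal(mean=[0.0, 0.0], cov=cov)

def f_cond(y):
    """Conditional density f_{Y|X}(x, y) = f_{X,Y}(x, y) / f_X(x)."""
    return joint.pdf([x, y]) / stats.norm(0.0, 1.0).pdf(x)

cond_mean, _ = quad(lambda y: y * f_cond(y), -np.inf, np.inf)
print(cond_mean, rho * x)   # both values should be approximately 0.3
```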
Let $(\Omega, \mathcal{F}, P)$ be a probability space, and let $T \colon \Omega \to E$ be a random variable, defined as a Borel-measurable function from $\Omega$ to its state space $(E, \mathcal{E})$.
One should think of $T$ as a way to "disintegrate" the sample space $\Omega$ into the fibers $\{ T^{-1}(x) \}_{x \in E}$.
Using the disintegration theorem from measure theory, we can "disintegrate" the measure $P$ into a collection of measures, one for each $x \in E$. Formally, a regular conditional probability is defined as a function $\nu \colon E \times \mathcal{F} \to [0,1]$, called a "transition probability", where:
For every $x \in E$, $\nu(x, \cdot)$ is a probability measure on $(\Omega, \mathcal{F})$.
For every $A \in \mathcal{F}$, $x \mapsto \nu(x, A)$ is $\mathcal{E}$-measurable.
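In the finite, discrete case the disintegration is explicit: $\nu(x, A) = P(A \cap \{T = x\}) / P(T = x)$, and $P$ is recovered as the mixture $P(A) = \sum_x \nu(x, A)\, P(T = x)$. The sketch below illustrates this on a small toy example; the sample space, the measure, and the map $T$ are all illustrative assumptions.

```python
# Finite illustration of a transition probability nu: E x F -> [0, 1] obtained
# by disintegrating P along T.  Omega, P and T below are illustrative choices.
from fractions import Fraction

P = {"a": Fraction(1, 2), "b": Fraction(1, 4), "c": Fraction(1, 8), "d": Fraction(1, 8)}
T = {"a": 0, "b": 1, "c": 0, "d": 1}          # T: Omega -> E = {0, 1}

def P_T(x):
    """Pushforward measure: P(T = x)."""
    return sum(p for w, p in P.items() if T[w] == x)

def nu(x, A):
    """nu(x, A) = P(A ∩ {T = x}) / P(T = x): a probability measure on Omega for each x."""
    return sum(P[w] for w in A if T[w] == x) / P_T(x)

A = {"a", "b"}
# Each nu(x, .) is a probability measure on Omega ...
assert all(nu(x, set(P)) == 1 for x in {0, 1})
# ... and P disintegrates as a mixture of the conditional measures:
assert sum(nu(x, A) * P_T(x) for x in {0, 1}) == sum(P[w] for w in A)
print(nu(0, A), nu(1, A))   # 4/5 and 2/3 for this toy measure
```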
Beliefs depend on the available information. This idea is formalized in probability theory by conditioning. Conditional probabilities, conditional expectations, and conditional probability distributions are treated on three levels: discrete probabilities, probability density functions, and measure theory. Conditioning leads to a non-random result if the condition is completely specified; otherwise, if the condition is left random, the result of conditioning is also random.
In probability theory, the conditional expectation, conditional expected value, or conditional mean of a random variable is its expected value – the value it would take "on average" over an arbitrarily large number of occurrences – given that a certain set of "conditions" is known to occur. If the random variable can take on only a finite number of values, the "conditions" are that the variable can only take on a subset of those values.
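A minimal worked illustration of this idea (the fair die below is an assumed example, not taken from the text): conditioning a fair six-sided roll on the event that it is even gives the conditional mean $(2 + 4 + 6)/3 = 4$, which a short simulation reproduces.

```python
# Toy illustration: conditional mean of a fair die roll given that it is even.
# The exact value is (2 + 4 + 6) / 3 = 4.
import numpy as np

rng = np.random.default_rng(0)
rolls = rng.integers(1, 7, size=1_000_000)
print(rolls[rolls % 2 == 0].mean())   # approximately 4.0
```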