Summary
In probability theory, the conditional expectation, conditional expected value, or conditional mean of a random variable is its expected value – the value it would take "on average" over an arbitrarily large number of occurrences – given that a certain set of "conditions" is known to occur. If the random variable can take on only a finite number of values, the "conditions" are that the variable can only take on a subset of those values. More formally, in the case when the random variable is defined over a discrete probability space, the "conditions" are a partition of this probability space.

Depending on the context, the conditional expectation can be either a random variable or a function. The random variable is denoted E(X | Y), analogously to conditional probability. The function form is either denoted E(X | Y = y), or a separate function symbol such as g(y) is introduced with the meaning g(y) = E(X | Y = y).

Consider the roll of a fair die and let A = 1 if the number is even (i.e., 2, 4, or 6) and A = 0 otherwise. Furthermore, let B = 1 if the number is prime (i.e., 2, 3, or 5) and B = 0 otherwise. The unconditional expectation of A is E(A) = 3/6 = 1/2, but the expectation of A conditional on B = 1 (i.e., conditional on the die roll being 2, 3, or 5) is E(A | B = 1) = 1/3, and the expectation of A conditional on B = 0 (i.e., conditional on the die roll being 1, 4, or 6) is E(A | B = 0) = 2/3. Likewise, the expectation of B conditional on A = 1 is E(B | A = 1) = 1/3, and the expectation of B conditional on A = 0 is E(B | A = 0) = 2/3.

Suppose we have daily rainfall data (mm of rain each day) collected by a weather station on every day of the ten-year (3652-day) period from January 1, 1990, to December 31, 1999. The unconditional expectation of rainfall for an unspecified day is the average of the rainfall amounts for those 3652 days. The conditional expectation of rainfall for an otherwise unspecified day known to be (conditional on being) in the month of March is the average of daily rainfall over all 310 days of the ten-year period that fall in March. And the conditional expectation of rainfall conditional on days dated March 2 is the average of the rainfall amounts that occurred on the ten days with that specific date.
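The dice example above can be checked with a short computation. The sketch below assumes a uniform law on the six outcomes and simply averages the indicator values over the conditioning event; the helper `expectation` is introduced here for illustration only.

```python
from fractions import Fraction

# Fair six-sided die; A indicates "even", B indicates "prime", as in the text.
outcomes = [1, 2, 3, 4, 5, 6]
A = {w: 1 if w % 2 == 0 else 0 for w in outcomes}      # even: 2, 4, 6
B = {w: 1 if w in (2, 3, 5) else 0 for w in outcomes}  # prime: 2, 3, 5

def expectation(X, condition=lambda w: True):
    """E[X | condition] under the uniform distribution on the six outcomes."""
    ws = [w for w in outcomes if condition(w)]
    return Fraction(sum(X[w] for w in ws), len(ws))

print(expectation(A))                        # E(A) = 1/2
print(expectation(A, lambda w: B[w] == 1))   # E(A | B=1) = 1/3
print(expectation(A, lambda w: B[w] == 0))   # E(A | B=0) = 2/3
print(expectation(B, lambda w: A[w] == 1))   # E(B | A=1) = 1/3
print(expectation(B, lambda w: A[w] == 0))   # E(B | A=0) = 2/3
```

Using exact fractions avoids floating-point noise, so the outputs match the values stated in the text exactly.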
Related courses (32)
MATH-230: Probability
This course is an introduction to probability theory. The goal is to introduce the modern formalism (based on the notion of measure) and to connect it to the "intuitive" side of probability.
MATH-330: Martingales et mouvement brownien
Introduction to the theory of discrete-time martingales, in particular the convergence and stopping theorems. Application to branching processes. Introduction to Brownian motion and study …
COM-417: Advanced probability and applications
In this course, various aspects of probability theory are considered. The first part is devoted to the main theorems in the field (law of large numbers, central limit theorem, concentration inequalities …
Show more
Related lectures (120)
Conditional Expectation: Properties and Examples
Explores conditional expectation properties, variance, and examples in practical applications.
Conditional Probability
Covers conditional probability, probability distributions, and choosing the right distribution for specific scenarios.
Conditional Expectation: Properties & Jensen's Inequality
Covers the properties of conditional expectation and Jensen's inequality in probability theory.
Show more
Related publications (52)
An extension of the stochastic sewing lemma and applications to fractional stochastic calculus
Toyomu Matsuda
We give an extension of Le's stochastic sewing lemma. The stochastic sewing lemma proves convergence in L_m of Riemann-type sums ∑_{[s,t] ∈ π} A_{s,t} for an adapted two-parameter stochastic process A, under certain conditions on the moments o ...
Cambridge Univ Press, 2024
Show more
Related concepts (18)
Chebyshev's inequality
In probability theory, Chebyshev's inequality (also called the Bienaymé–Chebyshev inequality) guarantees that, for a wide class of probability distributions, no more than a certain fraction of values can be more than a certain distance from the mean. Specifically, no more than 1/k² of the distribution's values can be k or more standard deviations away from the mean (or equivalently, at least 1 − 1/k² of the distribution's values are less than k standard deviations away from the mean).
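Chebyshev's bound is easy to verify empirically. The sketch below checks it on a sample of uniform random variates (the distribution, sample size, and choice k = 1.5 are assumptions made here for illustration): the observed fraction of values at least k sample standard deviations from the sample mean must not exceed 1/k².

```python
import random

# Empirical check of Chebyshev's inequality on a uniform sample (toy setup).
random.seed(0)
xs = [random.random() for _ in range(100_000)]

mean = sum(xs) / len(xs)
var = sum((x - mean) ** 2 for x in xs) / len(xs)
std = var ** 0.5

k = 1.5
frac = sum(abs(x - mean) >= k * std for x in xs) / len(xs)
bound = 1 / k**2  # Chebyshev: at most 1/k^2 of the mass lies k std devs or more away

print(f"observed fraction = {frac:.4f}, Chebyshev bound = {bound:.4f}")
```

For a uniform distribution the bound is quite loose: the observed fraction is well below 1/k² ≈ 0.444, which is expected since Chebyshev's inequality holds for every distribution with finite variance.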
Jensen's inequality
In mathematics, Jensen's inequality, named after the Danish mathematician Johan Jensen, relates the value of a convex function of an integral to the integral of the convex function. It was proved by Jensen in 1906, building on an earlier proof of the same inequality for doubly-differentiable functions by Otto Hölder in 1889. Given its generality, the inequality appears in many forms depending on the context, some of which are presented below.
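In its probabilistic form, Jensen's inequality states that E[f(X)] ≥ f(E[X]) for a convex function f. A minimal numerical illustration, using the convex function f(x) = x² and a fair-die distribution chosen here as a toy example:

```python
from fractions import Fraction

# Jensen's inequality for the convex function f(x) = x^2 on a fair die roll.
outcomes = [Fraction(w) for w in range(1, 7)]

E_X = sum(outcomes) / 6               # E[X]   = 7/2
E_fX = sum(w**2 for w in outcomes) / 6  # E[X^2] = 91/6

print(E_fX, ">=", E_X**2)  # 91/6 >= 49/4, i.e. Jensen's inequality holds
```

The gap E[X²] − (E[X])² is exactly the variance of X, which is why Jensen's inequality with f(x) = x² reduces to the statement that variance is nonnegative.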
Law of total probability
In probability theory, the law (or formula) of total probability is a fundamental rule relating marginal probabilities to conditional probabilities. It expresses the total probability of an outcome which can be realized via several distinct events, hence the name. The law of total probability is a theorem that states, in its discrete case, if {B_n : n = 1, 2, 3, …} is a finite or countably infinite partition of a sample space (in other words, a set of pairwise disjoint events whose union is the entire sample space) and each event B_n is measurable, then for any event A of the same sample space: P(A) = ∑_n P(A ∩ B_n), or, alternatively, P(A) = ∑_n P(A | B_n) P(B_n), where, for any n for which P(B_n) = 0, these terms are simply omitted from the summation, because P(A | B_n) is finite.
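Both forms of the law can be checked on the die example from the summary. The sketch below uses the partition {1,2}, {3,4}, {5,6} (chosen here for illustration) and the event A = "the roll is even", and confirms that summing over the partition recovers P(A):

```python
from fractions import Fraction

# Law of total probability on a fair six-sided die (toy partition).
outcomes = {1, 2, 3, 4, 5, 6}
partition = [{1, 2}, {3, 4}, {5, 6}]  # pairwise disjoint, union is the sample space
A = {2, 4, 6}                         # event: the roll is even

def P(event):
    """Probability of an event under the uniform law on the die."""
    return Fraction(len(event & outcomes), len(outcomes))

# Form 1: P(A) = sum_n P(A ∩ B_n)
total_joint = sum(P(A & Bn) for Bn in partition)

# Form 2: P(A) = sum_n P(A | B_n) P(B_n), skipping any B_n with P(B_n) = 0
total_cond = sum((P(A & Bn) / P(Bn)) * P(Bn) for Bn in partition if P(Bn) > 0)

print(total_joint, total_cond, P(A))  # all three equal 1/2
```

Each cell of the partition contributes P(A ∩ B_n) = 1/6, and the three contributions sum to P(A) = 1/2, matching the direct computation.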
Show more