In probability theory and statistics, the law of the unconscious statistician, or LOTUS, is a theorem which expresses the expected value of a function g(X) of a random variable X in terms of g and the probability distribution of X.

The form of the law depends on the type of random variable X in question. If the distribution of X is discrete and one knows its probability mass function pX, then the expected value of g(X) is

    E[g(X)] = \sum_x g(x) p_X(x),

where the sum is over all possible values x of X. If instead the distribution of X is continuous with probability density function fX, then the expected value of g(X) is

    E[g(X)] = \int_{-\infty}^{\infty} g(x) f_X(x) \, dx.

Both of these special cases can be expressed in terms of the cumulative probability distribution function FX of X, with the expected value of g(X) now given by the Lebesgue–Stieltjes integral

    E[g(X)] = \int_{-\infty}^{\infty} g(x) \, dF_X(x).

In even greater generality, X could be a random element in any measurable space, in which case the law is given in terms of measure theory and the Lebesgue integral. In this setting, there is no need to restrict the context to probability measures, and the law becomes a general theorem of mathematical analysis on Lebesgue integration relative to a pushforward measure.

This proposition is sometimes known as the law of the unconscious statistician because of a purported tendency to think of the identity as the very definition of the expected value, rather than (more formally) as a consequence of its true definition. The naming is sometimes attributed to Sheldon Ross's textbook Introduction to Probability Models, although he removed the reference in later editions. Many statistics textbooks do present the result as the definition of expected value.

A similar property holds for joint distributions, or equivalently, for random vectors. For discrete random variables X and Y, a function of two variables g, and joint probability mass function p_{X,Y}:

    E[g(X, Y)] = \sum_y \sum_x g(x, y) p_{X,Y}(x, y).

In the absolutely continuous case, with f_{X,Y} being the joint probability density function,

    E[g(X, Y)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x, y) f_{X,Y}(x, y) \, dx \, dy.

A number of special cases are given here.
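The distinction the name alludes to can be made concrete in the discrete case. The following sketch (with a fair die as an illustrative choice of X, and g(x) = x²) computes E[g(X)] two ways: via LOTUS, summing g(x) p_X(x) over the distribution of X, and via the formal definition, first deriving the pushforward distribution of Y = g(X) and then taking E[Y]. Both routes must agree.

```python
from fractions import Fraction

# Illustrative pmf of a discrete random variable X: a fair six-sided die.
p_X = {x: Fraction(1, 6) for x in range(1, 7)}

def g(x):
    """The transformation applied to X."""
    return x * x

# LOTUS: E[g(X)] = sum over x of g(x) * p_X(x),
# working directly with the distribution of X.
lotus = sum(g(x) * p for x, p in p_X.items())

# Formal definition: first build the pmf of Y = g(X)
# (the pushforward distribution), then apply E[Y] = sum of y * p_Y(y).
p_Y = {}
for x, p in p_X.items():
    p_Y[g(x)] = p_Y.get(g(x), Fraction(0)) + p
direct = sum(y * p for y, p in p_Y.items())

# E[X^2] = (1 + 4 + 9 + 16 + 25 + 36) / 6 = 91/6.
assert lotus == direct == Fraction(91, 6)
```

The "unconscious" step is treating the first computation as the definition; the theorem is the statement that it always matches the second.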