Summary
In probability theory and statistics, the cumulants κn of a probability distribution are a set of quantities that provide an alternative to the moments of the distribution. Any two probability distributions whose moments are identical have identical cumulants as well, and vice versa. The first cumulant is the mean, the second cumulant is the variance, and the third cumulant equals the third central moment, but fourth and higher-order cumulants are not equal to central moments.

In some cases theoretical treatments of problems in terms of cumulants are simpler than those using moments. In particular, when two or more random variables are statistically independent, the n-th-order cumulant of their sum equals the sum of their n-th-order cumulants. Moreover, the third and higher-order cumulants of a normal distribution are zero, and it is the only distribution with this property. Just as joint moments are used for collections of random variables, it is possible to define joint cumulants.

The cumulants of a random variable X are defined using the cumulant-generating function K(t), which is the natural logarithm of the moment-generating function: K(t) = log E[e^(tX)]. The cumulants κn are obtained from the power series expansion of the cumulant-generating function, K(t) = κ1 t + κ2 t^2/2! + κ3 t^3/3! + ⋯ = Σ_{n≥1} κn t^n/n!. Since this expansion is a Maclaurin series, the n-th cumulant can be obtained by differentiating it n times and evaluating the result at zero: κn = K^(n)(0). If the moment-generating function does not exist, the cumulants can instead be defined through the relationship between cumulants and moments discussed later.

Some writers prefer to define the cumulant-generating function as the natural logarithm of the characteristic function, sometimes also called the second characteristic function: H(t) = log E[e^(itX)] = K(it). An advantage of H(t), which is in effect K(t) evaluated at purely imaginary arguments, is that E[e^(itX)] is well defined for all real values of t even when E[e^(tX)] is not, as can happen when there is "too much" probability that X has a large magnitude.
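These definitions translate directly into a few lines of computer algebra. The following is a minimal sketch, assuming SymPy is available (it is not part of this page's material), that recovers cumulants as derivatives of K(t) = log M(t) at t = 0, checks that the third and fourth cumulants of a normal distribution vanish, and illustrates the additivity of cumulants for a sum of independent variables; the distributions and symbol names are chosen only for illustration.

```python
# Minimal sketch (assumes SymPy): cumulants as derivatives of K(t) = log M(t) at t = 0.
import sympy as sp

t = sp.symbols('t', real=True)
mu, sigma, lam = sp.symbols('mu sigma lambda', positive=True)

# MGF of a normal N(mu, sigma^2): M(t) = exp(mu*t + sigma^2*t^2/2)
M_normal = sp.exp(mu*t + sigma**2 * t**2 / 2)
K_normal = sp.log(M_normal)                # cumulant-generating function

def cumulant(K, n):
    """n-th cumulant: n-th derivative of the CGF evaluated at t = 0."""
    return sp.simplify(sp.diff(K, t, n).subs(t, 0))

print([cumulant(K_normal, n) for n in range(1, 5)])
# -> [mu, sigma**2, 0, 0]: third and higher cumulants of a normal vanish

# Additivity for independent variables: MGFs multiply, so CGFs (and cumulants) add.
M_poisson = sp.exp(lam * (sp.exp(t) - 1))  # MGF of a Poisson(lambda) variable
K_sum = sp.log(M_normal * M_poisson)       # CGF of the sum of the two independent variables
print(cumulant(K_sum, 2))                  # -> lambda + sigma**2 (variances add)
```

The same helper works for any distribution whose moment-generating function can be written in closed form; only the expression assigned to the MGF changes.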
About this result
This page is automatically generated and may contain information that is not correct, complete, up-to-date, or relevant to your search query. The same applies to every other page on this website. Please make sure to verify the information with EPFL's official sources.
Related courses (9)
PHYS-316: Statistical physics II
Introduction to the theory of phase transitions
CS-101: Advanced information, computation, communication I
Discrete mathematics is a discipline with applications to almost all areas of study. It provides a set of indispensable tools to computer science in particular. This course reviews (familiar) topics a…
MATH-413: Statistics for data science
Statistics lies at the foundation of data science, providing a unifying theoretical and methodological backbone for the diverse tasks encountered in this emerging field. This course rigorously develops…
Related lectures (41)
Estimating R: Moments and Cumulants
Explores the estimation of R using moments and cumulants in extreme value analysis.
Generating Functions: Moments and Cumulants
Explores generating functions for moments and cumulants, showcasing their role in distribution analysis.
Moment Generating Function and Multivariate Normal Distribution
Explores moment generating functions and multivariate normal distributions in probability and statistics.
Related publications (53)
Related concepts (25)
Poisson distribution
In probability theory and statistics, the Poisson distribution is a discrete probability distribution that expresses the probability of a given number of events occurring in a fixed interval of time or space if these events occur with a known constant mean rate and independently of the time since the last event. It is named after French mathematician Siméon Denis Poisson (/ˈpwɑːsɒn/; French: [pwasɔ̃]). The Poisson distribution can also be used for the number of events in other specified interval types such as distance, area, or volume.
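Connecting this concept back to the summary above: the cumulant-generating function of a Poisson distribution with rate λ is K(t) = λ(e^t − 1), so every cumulant equals λ. The sketch below, assuming NumPy is available, checks the first three cumulants empirically using sample estimates; the seed, sample size, and the rate 3.5 are illustrative choices, not values from the source.

```python
# Numerical check (assumes NumPy): for Poisson(lam), mean, variance and third
# central moment all estimate the same value lam, since every cumulant is lam.
import numpy as np

rng = np.random.default_rng(0)
lam = 3.5
x = rng.poisson(lam, size=1_000_000).astype(float)

mean = x.mean()                   # first cumulant
var = x.var()                     # second cumulant
third = np.mean((x - mean) ** 3)  # third cumulant = third central moment

print(mean, var, third)           # all three should be close to lam = 3.5
```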
Eric Temple Bell
Eric Temple Bell (7 February 1883 – 21 December 1960) was a Scottish-born mathematician and science fiction writer who lived in the United States for most of his life. He published non-fiction using his given name and fiction as John Taine. Eric Temple Bell was born in Peterhead, Aberdeen, Scotland, the third of three children of Helen Jane Lyall and James Bell Jr. His father, a factor, relocated to San Jose, California, in 1884, when Eric was fifteen months old. After his father died on 4 January 1896, the family returned to Bedford, England.
Moment-generating function
In probability theory and statistics, the moment-generating function of a real-valued random variable is an alternative specification of its probability distribution. Thus, it provides the basis of an alternative route to analytical results compared with working directly with probability density functions or cumulative distribution functions. There are particularly simple results for the moment-generating functions of distributions defined by the weighted sums of random variables.
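To illustrate the remark about weighted sums, here is a small sketch assuming SymPy; the helper mgf_normal and the symbol names are illustrative, not from the source. For independent X and Y, the MGF of aX + bY factors as M_X(at)·M_Y(bt), and raw moments are read off as derivatives of the MGF at t = 0.

```python
# Sketch (assumes SymPy): weighted-sum rule M_{aX+bY}(t) = M_X(a t) * M_Y(b t)
# for independent X, Y, with raw moments as derivatives of the MGF at t = 0.
import sympy as sp

t = sp.symbols('t', real=True)
mu1, mu2, s1, s2, a, b = sp.symbols('mu1 mu2 s1 s2 a b', real=True)

def mgf_normal(mu, s, u):
    """MGF of N(mu, s^2) evaluated at the argument u."""
    return sp.exp(mu*u + s**2 * u**2 / 2)

# MGF of a*X + b*Y for independent normal X and Y
M_sum = mgf_normal(mu1, s1, a*t) * mgf_normal(mu2, s2, b*t)

first_moment = sp.expand(sp.diff(M_sum, t, 1).subs(t, 0))
second_moment = sp.expand(sp.diff(M_sum, t, 2).subs(t, 0))
print(first_moment)   # a*mu1 + b*mu2
print(second_moment)  # (a*mu1 + b*mu2)**2 + a**2*s1**2 + b**2*s2**2, in expanded form
```

Taking the logarithm of M_sum and differentiating instead would give the cumulants of the weighted sum directly, mirroring the definition in the summary above.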