L-moment
In statistics, L-moments are a sequence of statistics used to summarize the shape of a probability distribution. They are linear combinations of order statistics (L-statistics) analogous to conventional moments, and can be used to calculate quantities analogous to standard deviation, skewness and kurtosis, termed the L-scale, L-skewness and L-kurtosis respectively (the L-mean is identical to the conventional mean). Standardised L-moments are called L-moment ratios and are analogous to standardized moments.
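As a brief illustration of how these quantities can be estimated from data (a sketch, not part of the article), the following Python function computes the first four sample L-moments via the standard probability-weighted-moment formulas and then forms the L-skewness and L-kurtosis ratios; the function name and the exponential test sample are purely illustrative.

```python
# Minimal sketch: sample L-moments from probability-weighted moments b0..b3.
import numpy as np

def sample_l_moments(data):
    """Return (l1, l2, l3, l4): L-mean, L-scale, and the 3rd/4th L-moments."""
    x = np.sort(np.asarray(data, dtype=float))
    n = len(x)
    i = np.arange(1, n + 1)                       # ranks of the order statistics
    b0 = x.mean()
    b1 = np.sum((i - 1) / (n - 1) * x) / n
    b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
    b3 = np.sum((i - 1) * (i - 2) * (i - 3) / ((n - 1) * (n - 2) * (n - 3)) * x) / n
    l1 = b0                                       # L-mean (equals the ordinary mean)
    l2 = 2 * b1 - b0                              # L-scale
    l3 = 6 * b2 - 6 * b1 + b0
    l4 = 20 * b3 - 30 * b2 + 12 * b1 - b0
    return l1, l2, l3, l4

l1, l2, l3, l4 = sample_l_moments(np.random.default_rng(0).exponential(size=1000))
print("L-skewness:", l3 / l2, "L-kurtosis:", l4 / l2)   # L-moment ratios
```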
Normal distribution
In statistics, a normal distribution or Gaussian distribution is a type of continuous probability distribution for a real-valued random variable. The general form of its probability density function is
$$f(x) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\frac{1}{2}\left(\frac{x-\mu}{\sigma}\right)^{2}}.$$
The parameter $\mu$ is the mean or expectation of the distribution (and also its median and mode), while the parameter $\sigma$ is its standard deviation. The variance of the distribution is $\sigma^2$. A random variable with a Gaussian distribution is said to be normally distributed, and is called a normal deviate.
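A minimal sketch of the density formula above, written directly in Python; the comparison against scipy.stats.norm is only a sanity check and not part of the definition.

```python
import numpy as np
from scipy.stats import norm

def normal_pdf(x, mu=0.0, sigma=1.0):
    # Density of N(mu, sigma^2) evaluated elementwise at x.
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

x = np.linspace(-3.0, 3.0, 7)
print(np.allclose(normal_pdf(x, mu=1.0, sigma=2.0),
                  norm.pdf(x, loc=1.0, scale=2.0)))   # True
```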
Electrodermal activity
Electrodermal activity (EDA) is the property of the human body that causes continuous variation in the electrical characteristics of the skin. Historically, EDA has also been known as skin conductance, galvanic skin response (GSR), electrodermal response (EDR), psychogalvanic reflex (PGR), skin conductance response (SCR), sympathetic skin response (SSR) and skin conductance level (SCL). The long history of research into the active and passive electrical properties of the skin by a variety of disciplines has resulted in an excess of names, now standardized to electrodermal activity (EDA).
Plancherel theorem
In mathematics, the Plancherel theorem (sometimes called the Parseval–Plancherel identity) is a result in harmonic analysis, proven by Michel Plancherel in 1910. It states that the integral of a function's squared modulus is equal to the integral of the squared modulus of its frequency spectrum. That is, if $f(x)$ is a function on the real line, and $\hat{f}(\xi)$ is its frequency spectrum, then
$$\int_{-\infty}^{\infty} |f(x)|^2 \, dx = \int_{-\infty}^{\infty} |\hat{f}(\xi)|^2 \, d\xi.$$
A more precise formulation is that if a function is in both of the Lp spaces $L^1(\mathbb{R})$ and $L^2(\mathbb{R})$, then its Fourier transform is in $L^2(\mathbb{R})$, and the Fourier transform map is an isometry with respect to the $L^2$ norm.
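The continuous identity has a discrete counterpart that is easy to check numerically. The sketch below (an illustration, not part of the theorem's statement) verifies the discrete analogue $\sum_n |x[n]|^2 = \frac{1}{N}\sum_k |X[k]|^2$ for NumPy's unnormalized FFT convention.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(1024) + 1j * rng.standard_normal(1024)
X = np.fft.fft(x)                                # unnormalized forward transform

energy_time = np.sum(np.abs(x) ** 2)
energy_freq = np.sum(np.abs(X) ** 2) / len(x)    # 1/N factor from the FFT convention
print(np.isclose(energy_time, energy_freq))      # True
```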
Random measure
In probability theory, a random measure is a measure-valued random element. Random measures are for example used in the theory of random processes, where they form many important point processes such as Poisson point processes and Cox processes. Random measures can be defined as transition kernels or as random elements. Both definitions are equivalent. For the definitions, let $E$ be a separable complete metric space and let $\mathcal{E}$ be its Borel $\sigma$-algebra. (The most common example of a separable complete metric space is $\mathbb{R}^n$.) As a transition kernel, a random measure $\zeta$ is an (almost surely) locally finite transition kernel from an abstract probability space $(\Omega, \mathcal{A}, P)$ to $(E, \mathcal{E})$.
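As a concrete example of a random measure mentioned above, the following Python sketch (illustrative, not from the article) realizes a homogeneous Poisson point process on $[0, 1]$ as a random counting measure: each realization assigns to a set $B$ the number of sampled points falling in $B$.

```python
import numpy as np

def sample_poisson_random_measure(lam, rng):
    """Return a function (a, b) -> N([a, b)) for one realization of the process."""
    n_points = rng.poisson(lam)                    # total number of points on [0, 1]
    points = rng.uniform(0.0, 1.0, size=n_points)  # given the count, points are i.i.d. uniform
    def counting_measure(a, b):
        return int(np.sum((points >= a) & (points < b)))
    return counting_measure

rng = np.random.default_rng(1)
N = sample_poisson_random_measure(lam=10.0, rng=rng)
print(N(0.0, 0.5), N(0.5, 1.0), N(0.0, 1.0))       # counts on disjoint sets add up
```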
Ellsberg paradox
In decision theory, the Ellsberg paradox (or Ellsberg's paradox) is a paradox in which people's decisions are inconsistent with subjective expected utility theory. Daniel Ellsberg popularized the paradox in his 1961 paper, "Risk, Ambiguity, and the Savage Axioms". John Maynard Keynes published a version of the paradox in 1921. It is generally taken to be evidence of ambiguity aversion, in which a person tends to prefer choices with quantifiable risks over those with unknown, incalculable risks.