The generalized normal distribution or generalized Gaussian distribution (GGD) is either of two families of parametric continuous probability distributions on the real line. Both families add a shape parameter to the normal distribution. To distinguish the two families, they are referred to below as "symmetric" and "asymmetric"; however, this is not a standard nomenclature.
The symmetric generalized normal distribution, also known as the exponential power distribution or the generalized error distribution, is a parametric family of symmetric distributions. It includes all normal and Laplace distributions, and as limiting cases it includes all continuous uniform distributions on bounded intervals of the real line.
With shape parameter β, location μ, and scale α (density proportional to exp(−(|x−μ|/α)^β)), this family includes the normal distribution when β = 2 (with mean μ and variance α²/2) and it includes the Laplace distribution when β = 1. As β → ∞, the density converges pointwise to a uniform density on (μ − α, μ + α).
This family allows for tails that are either heavier than normal (when β < 2) or lighter than normal (when β > 2). It is a useful way to parametrize a continuum of symmetric, platykurtic densities spanning from the normal (β = 2) to the uniform density (β → ∞), and a continuum of symmetric, leptokurtic densities spanning from the Laplace (β = 1) to the normal density (β = 2).
The shape parameter β also controls the peakedness in addition to the tails.
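To make these special cases concrete, here is a minimal sketch (an illustration, not part of the cited text) using SciPy's gennorm, whose shape parameter plays the role of β and whose scale plays the role of α:

# Sketch: checking the special cases of the symmetric generalized normal family.
import numpy as np
from scipy.stats import gennorm, laplace, norm

x = np.linspace(-3, 3, 7)

# beta = 1: identical to a Laplace distribution
assert np.allclose(gennorm.pdf(x, beta=1), laplace.pdf(x))

# beta = 2: a normal distribution with mean 0 and variance 1/2 (alpha = 1)
assert np.allclose(gennorm.pdf(x, beta=2), norm.pdf(x, scale=1 / np.sqrt(2)))

# beta -> infinity: the density approaches the uniform height 1/(2*alpha) on (-alpha, alpha)
print(gennorm.pdf(0.5, beta=50))   # close to 0.5, inside the interval
print(gennorm.pdf(1.5, beta=50))   # essentially 0, outside the interval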
Parameter estimation via maximum likelihood and the method of moments has been studied. The estimates do not have a closed form and must be obtained numerically. Estimators that do not require numerical calculation have also been proposed.
The generalized normal log-likelihood function has infinitely many continuous derivatives (i.e. it belongs to the class C∞ of smooth functions) only if β is a positive, even integer. Otherwise, the function has ⌊β⌋ continuous derivatives. As a result, the standard results for consistency and asymptotic normality of maximum likelihood estimates of β only apply when β ≥ 2.
It is possible to fit the generalized normal distribution adopting an approximate maximum likelihood method.
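As an illustration of numerical fitting, the following sketch uses SciPy's generic optimizer-based maximum-likelihood fit (an assumed workflow, not the approximate maximum likelihood method referenced above); the parameter values are arbitrary:

# Sketch: numerical maximum-likelihood fit of shape, location and scale.
import numpy as np
from scipy.stats import gennorm

rng = np.random.default_rng(0)
data = gennorm.rvs(beta=1.5, loc=2.0, scale=3.0, size=5000, random_state=rng)

beta_hat, loc_hat, scale_hat = gennorm.fit(data)  # no closed form; solved numerically
print(beta_hat, loc_hat, scale_hat)               # should land near 1.5, 2.0, 3.0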
In probability and business statistics, the Bates distribution, named after Grace Bates, is a probability distribution of the mean of a number of statistically independent uniformly distributed random variables on the unit interval. This distribution is related to the uniform, the triangular, and the normal (Gaussian) distribution, and has applications in broadcast engineering for signal enhancement. The Bates distribution is sometimes confused with the Irwin–Hall distribution, which is the distribution of the sum (not the mean) of n independent random variables uniformly distributed from 0 to 1.
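A small simulation sketch (illustrative values, not from the cited text) makes the sum-versus-mean distinction explicit:

# Sketch: Bates(n) is the mean of n iid Uniform(0,1) draws, i.e. the
# Irwin-Hall sum divided by n (n = 12 is an arbitrary choice here).
import numpy as np

rng = np.random.default_rng(0)
n, reps = 12, 100_000

u = rng.uniform(0.0, 1.0, size=(reps, n))
irwin_hall = u.sum(axis=1)   # sum of n uniforms  -> Irwin-Hall(n)
bates = irwin_hall / n       # mean of n uniforms -> Bates(n)

print(bates.mean(), bates.var())            # roughly 0.5 and 1/(12*n)
print(irwin_hall.mean(), irwin_hall.var())  # roughly n/2 and n/12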
In probability theory, the stable count distribution is the conjugate prior of a one-sided stable distribution. This distribution was discovered by Stephen Lihn (Chinese: 藺鴻圖) in his 2017 study of daily distributions of the S&P 500 and the VIX. The stable distribution family is also sometimes referred to as the Lévy alpha-stable distribution, after Paul Lévy, the first mathematician to have studied it. Of the three parameters defining the distribution, the stability parameter α is most important.
In mathematics, the upper and lower incomplete gamma functions are types of special functions which arise as solutions to various mathematical problems such as certain integrals. Their respective names stem from their integral definitions, which are defined similarly to the gamma function but with different or "incomplete" integral limits. The gamma function is defined as an integral from zero to infinity. This contrasts with the lower incomplete gamma function, which is defined as an integral from zero to a variable upper limit; similarly, the upper incomplete gamma function is defined as an integral from a variable lower limit to infinity.
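For example, assuming SciPy's regularized conventions (gammainc for the lower function, gammaincc for the upper), the two pieces split the complete gamma function; the values of s and x below are arbitrary:

# Sketch: unregularized incomplete gamma functions recovered from SciPy's
# regularized P(s, x) and Q(s, x) by multiplying with Gamma(s).
import numpy as np
from scipy.special import gamma, gammainc, gammaincc
from scipy.integrate import quad

s, x = 2.5, 1.3

lower = gamma(s) * gammainc(s, x)    # integral of t^(s-1) e^(-t) from 0 to x
upper = gamma(s) * gammaincc(s, x)   # integral of t^(s-1) e^(-t) from x to infinity

# The two pieces sum to the complete gamma function and match direct quadrature.
assert np.isclose(lower + upper, gamma(s))
assert np.isclose(lower, quad(lambda t: t**(s - 1) * np.exp(-t), 0, x)[0])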
This course explores how to design reliable discriminative and generative neural networks, the ethics of data acquisition and model deployment, as well as modern multi-modal models.
We cover the theory and applications of sparse stochastic processes (SSP). SSP are solutions of differential equations driven by non-Gaussian innovations. They admit a parsimonious representation in a ...
Covers the definition of multivariate Gaussian distribution and its properties, including moment generating function and linear combinations of variables.
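A brief sketch of the linear-combination property mentioned in that description (with arbitrary illustrative values for the mean, covariance, and coefficient matrix): if X ~ N(μ, Σ), then AX is Gaussian with mean Aμ and covariance AΣAᵀ.

# Sketch: empirical check that a linear combination of a multivariate
# Gaussian vector has mean A @ mu and covariance A @ Sigma @ A.T.
import numpy as np

rng = np.random.default_rng(0)
mu = np.array([1.0, -2.0, 0.5])
Sigma = np.array([[2.0, 0.3, 0.0],
                  [0.3, 1.0, 0.4],
                  [0.0, 0.4, 1.5]])
A = np.array([[1.0, 2.0, -1.0],
              [0.5, 0.0, 1.0]])

X = rng.multivariate_normal(mu, Sigma, size=200_000)  # samples of X
Y = X @ A.T                                           # samples of A @ X

print(Y.mean(axis=0), A @ mu)         # empirical vs. theoretical mean
print(np.cov(Y.T), A @ Sigma @ A.T)   # empirical vs. theoretical covariance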
Explains how q = rw defines a Poisson process and its intensity.
Outliers in discrete choice response data may result from misclassification and misreporting of the response variable and from choice behaviour that is inconsistent with modelling assumptions (e.g. random utility maximisation). In the presence of outliers, ...
Despite the widespread empirical success of ResNet, the generalization properties of deep ResNet are rarely explored beyond the lazy training regime. In this work, we investigate scaled ResNet in the limit of infinitely deep and wide neural networks, of wh ...
Gels made of telechelic polymers connected by reversible cross-linkers are a versatile design platform for biocompatible viscoelastic materials. Their linear response to a step strain displays a fast, near-exponential relaxation when using low-valence cros ...