Summary
In probability theory and statistics, a shape parameter (also known as a form parameter) is a kind of numerical parameter of a parametric family of probability distributions that is neither a location parameter nor a scale parameter (nor a function of these, such as a rate parameter). Such a parameter must affect the shape of a distribution rather than simply shifting it (as a location parameter does) or stretching/shrinking it (as a scale parameter does). For example, "peakedness" refers to how round the main peak is.

Many estimators measure location or scale; however, estimators for shape parameters also exist. Most simply, they can be estimated in terms of the higher moments, using the method of moments, as in the skewness (3rd moment) or kurtosis (4th moment), if the higher moments are defined and finite. Estimators of shape often involve higher-order statistics (non-linear functions of the data), as in the higher moments, but linear estimators also exist, such as the L-moments. Maximum likelihood estimation can also be used (a short sketch of both approaches follows the lists below).

The following continuous probability distributions have a shape parameter:
Beta distribution
Burr distribution
Dagum distribution
Erlang distribution
ExGaussian distribution
Exponential power distribution
Fréchet distribution
Gamma distribution
Generalized extreme value distribution
Log-logistic distribution
Log-t distribution
Inverse-gamma distribution
Inverse Gaussian distribution
Pareto distribution
Pearson distribution
Skew normal distribution
Lognormal distribution
Student's t-distribution
Tukey lambda distribution
Weibull distribution

By contrast, the following continuous distributions do not have a shape parameter, so their shape is fixed and only their location or their scale or both can change. It follows that (where they exist) the skewness and kurtosis of these distributions are constants, as skewness and kurtosis are independent of location and scale parameters.
Exponential distribution
Cauchy distribution
Logistic distribution
Normal distribution
Raised cosine distribution
Uniform distribution
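As a minimal sketch of the estimation approaches above, the example below (assuming NumPy and SciPy are available) uses the gamma distribution, whose shape parameter k satisfies skewness = 2/sqrt(k): inverting the sample skewness gives a method-of-moments estimate, while scipy.stats.gamma.fit gives the maximum-likelihood estimate. The sample size, seed, and true parameter values are made-up illustration values, not taken from the text above.

```python
# Sketch: estimating the shape parameter k of a gamma distribution from data,
# two ways:
#   1. method of moments via the sample skewness (skewness = 2 / sqrt(k)),
#   2. maximum likelihood via scipy.stats.gamma.fit.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
true_shape = 2.5                      # illustrative "ground truth"
data = rng.gamma(shape=true_shape, scale=1.3, size=10_000)

# Method of moments: invert skewness = 2 / sqrt(k)  =>  k = (2 / skewness)^2
sample_skew = stats.skew(data)
k_moments = (2.0 / sample_skew) ** 2

# Maximum likelihood: fit returns (shape, loc, scale); fix loc=0 so only the
# shape and scale are estimated.
k_mle, loc, scale = stats.gamma.fit(data, floc=0)

print(f"method of moments: k ~ {k_moments:.2f}")
print(f"maximum likelihood: k ~ {k_mle:.2f}")
```

For a sample this large both estimates should land near the true value 2.5; the moment-based estimate is simpler but typically has higher variance than the maximum-likelihood estimate.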
Related courses (10)
MGT-581: Introduction to econometrics
The course provides an introduction to econometrics. The objective is to learn how to make valid (i.e., causal) inference from economic and social data. It explains the main estimators and present met
MATH-413: Statistics for data science
Statistics lies at the foundation of data science, providing a unifying theoretical and methodological backbone for the diverse tasks encountered in this emerging field. This course rigorously develops
FIN-417: Quantitative risk management
This course is an introduction to quantitative risk management that covers standard statistical methods, multivariate risk factor models, non-linear dependence structures (copula models), as well as p
Related lectures (33)
Exploration vs. Exploitation: Softmax Policy Quiz
Presents a quiz on the exploration vs. exploitation dilemma using the softmax policy.
Generalized Linear Models: GLMs for Non-Gaussian Data
Explores Generalized Linear Models for non-Gaussian data, covering interpretation of the natural link function, MLE asymptotic normality, deviance measures, residuals, and logistic regression.
Convergence of Series
Covers the convergence criteria of series, including alternating series and absolute convergence.
Related concepts (22)
Pearson distribution
The Pearson distribution is a family of continuous probability distributions. It was first published by Karl Pearson in 1895 and subsequently extended by him in 1901 and 1916 in a series of articles on biostatistics. The Pearson system was originally devised in an effort to model visibly skewed observations. It was well known at the time how to adjust a theoretical model to fit the first two cumulants or moments of observed data: Any probability distribution can be extended straightforwardly to form a location-scale family.
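The location-scale extension mentioned above can be made concrete with a small sketch, assuming SciPy is available: shifting a base density by mu and stretching it by sigma (illustrative names, not from the text) yields the density f((x - mu) / sigma) / sigma, which is exactly what SciPy's loc/scale arguments implement.

```python
# Sketch: turning a fixed-shape base density into a location-scale family
# f((x - mu) / sigma) / sigma, checked against SciPy's loc/scale arguments.
import numpy as np
from scipy import stats

mu, sigma = 2.0, 3.0                 # illustrative location and scale
x = np.linspace(-5, 20, 7)

# Manual location-scale transform of the standard exponential density ...
manual = stats.expon.pdf((x - mu) / sigma) / sigma
# ... matches SciPy's built-in loc/scale parametrisation.
builtin = stats.expon.pdf(x, loc=mu, scale=sigma)

assert np.allclose(manual, builtin)
```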
Poisson distribution
In probability theory and statistics, the Poisson distribution is a discrete probability distribution that expresses the probability of a given number of events occurring in a fixed interval of time or space if these events occur with a known constant mean rate and independently of the time since the last event. It is named after French mathematician Siméon Denis Poisson (/ˈpwɑːsɒn/; [pwasɔ̃]). The Poisson distribution can also be used for the number of events in other specified interval types such as distance, area, or volume.
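As a small illustration of "a known constant mean rate", the sketch below computes the Poisson probability of observing k events in one interval when the mean rate is lam; the values lam = 4.0 and k = 2 are made up for illustration, and the cross-check assumes SciPy is installed.

```python
# Sketch: probability of k events in an interval with mean rate lam,
# using the Poisson pmf  lam**k * exp(-lam) / k!
import math
from scipy.stats import poisson

lam, k = 4.0, 2
p_manual = lam**k * math.exp(-lam) / math.factorial(k)

# Same value from SciPy's pmf.
assert math.isclose(p_manual, poisson.pmf(k, lam))

print(f"P(K = {k}) = {p_manual:.4f}")  # about 0.1465
```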
Incomplete gamma function
In mathematics, the upper and lower incomplete gamma functions are types of special functions which arise as solutions to various mathematical problems such as certain integrals. Their respective names stem from their integral definitions, which are defined similarly to the gamma function but with different or "incomplete" integral limits. The gamma function is defined as an integral from zero to infinity. This contrasts with the lower incomplete gamma function, which is defined as an integral from zero to a variable upper limit.
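A short numerical sketch of these integral definitions, assuming SciPy is available: SciPy exposes the regularized incomplete gamma functions, so multiplying by the complete gamma function recovers the unregularized lower and upper versions, whose sum is the complete gamma function. The values s = 2.5 and x = 1.7 are illustrative, not from the text.

```python
# Sketch: the lower incomplete gamma function integrates t**(s-1) * exp(-t)
# from 0 to x, the upper one from x to infinity, and together they recover
# the complete gamma function. SciPy exposes the *regularized* versions
# (divided by Gamma(s)), so we multiply back by gamma(s).
import numpy as np
from scipy.special import gamma, gammainc, gammaincc
from scipy.integrate import quad

s, x = 2.5, 1.7

lower = gammainc(s, x) * gamma(s)    # lower incomplete gamma(s, x)
upper = gammaincc(s, x) * gamma(s)   # upper incomplete Gamma(s, x)

# Direct numerical integration of the defining integral for the lower part.
lower_quad, _ = quad(lambda t: t**(s - 1) * np.exp(-t), 0, x)

assert np.isclose(lower, lower_quad)
assert np.isclose(lower + upper, gamma(s))
```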