Summary
In statistics, the method of estimating equations is a way of specifying how the parameters of a statistical model should be estimated. It can be thought of as a generalisation of many classical methods (the method of moments, least squares, and maximum likelihood) as well as of more recent methods such as M-estimators. The basis of the method is to have, or to find, a set of simultaneous equations involving both the sample data and the unknown model parameters, which are solved to define the estimates of the parameters. Various components of the equations are defined in terms of the set of observed data on which the estimates are to be based. Important examples of estimating equations are the likelihood equations.

Consider the problem of estimating the rate parameter λ of the exponential distribution, which has the probability density function

f(x; λ) = λ e^(−λx), x ≥ 0.

Suppose that a sample of data is available from which either the sample mean, x̄, or the sample median, m, can be calculated. Since the population mean of this distribution is 1/λ and its population median is ln(2)/λ, an estimating equation based on the mean is

x̄ = 1/λ,

while the estimating equation based on the median is

m = ln(2)/λ.

Each of these equations is derived by equating a sample value (sample statistic) to the corresponding theoretical (population) value. In each case the sample statistic is a consistent estimator of the population value, and this provides an intuitive justification for this type of approach to estimation.
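The two estimating equations above can be solved in closed form, giving λ̂ = 1/x̄ and λ̂ = ln(2)/m. A minimal sketch in Python (using NumPy; the simulated data and variable names are illustrative, not part of the original text):

```python
import numpy as np

# Simulate an exponential sample with known rate λ = 2
# (NumPy parameterises by the scale 1/λ, not the rate).
rng = np.random.default_rng(0)
sample = rng.exponential(scale=1 / 2.0, size=10_000)

# Estimating equation based on the mean:   x̄ = 1/λ      →  λ̂ = 1/x̄
lambda_from_mean = 1.0 / sample.mean()

# Estimating equation based on the median: m = ln(2)/λ   →  λ̂ = ln(2)/m
lambda_from_median = np.log(2) / np.median(sample)

print(lambda_from_mean, lambda_from_median)
```

Both estimators are consistent, so with a large sample each solution lands near the true rate λ = 2; they differ in efficiency, which is one criterion for choosing among estimating equations.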
About this result
This page is automatically generated and may contain information that is not correct, complete, up-to-date, or relevant to your search query. The same applies to every other page on this website. Please make sure to verify the information with EPFL's official sources.
Related courses (11)
ENG-267: Estimation methods
Students handle observations affected by uncertainty in a rigorous manner. They master the main methods of measurement adjustment and parameter estimation. They apply …
MATH-233: Probability and statistics
The course provides an introduction to probability theory and to statistical methods for physicists.
Related lectures (40)
Extreme Statistics: Threshold Models
Covers the theory and applications of extreme statistics, focusing on threshold models for analyzing extremes of time series.
Extreme Value Theory: Point Processes
Covers the application of extreme value theory to point processes and the estimation of extreme events from equally-spaced time series.
Linear Regression: Regularization
Covers linear regression, regularization, and probabilistic models in generating labels.
Related publications (55)

OASIS: Optimisation-based Activity Scheduling with Integrated Simultaneous choice dimensions

Michel Bierlaire, Timothy Michael Hillel, Janody Pougala

Activity-based models offer the potential of a far deeper understanding of daily mobility behaviour than trip-based models. However, activity-based models used both in research and practice have often relied on applying sequential choice models between sub ...
2023

Axial and radial axonal diffusivities and radii from single encoding strongly diffusion-weighted MRI

Erick Jorge Canales Rodriguez, Marco Pizzolato, Tim Bjørn Dyrby

We enable the estimation of the per-axon axial diffusivity from single encoding, strongly diffusion-weighted, pulsed gradient spin echo data. Additionally, we improve the estimation of the per-axon radial diffusivity compared to estimates based on spherica ...
Elsevier, 2023

Filtered data and eigenfunction estimators for statistical inference of multiscale and interacting diffusion processes

Andrea Zanoni

We study the problem of learning unknown parameters of stochastic dynamical models from data. Often, these models are high dimensional and contain several scales and complex structures. One is then interested in obtaining a reduced, coarse-grained descript ...
EPFL, 2022
Related concepts (1)
Maximum likelihood estimation
In statistics, maximum likelihood estimation (MLE) is a method of estimating the parameters of an assumed probability distribution, given some observed data. This is achieved by maximizing a likelihood function so that, under the assumed statistical model, the observed data is most probable. The point in the parameter space that maximizes the likelihood function is called the maximum likelihood estimate. The logic of maximum likelihood is both intuitive and flexible, and as such the method has become a dominant means of statistical inference.
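For the exponential example above, the likelihood equation dℓ/dλ = n/λ − Σx = 0 recovers the mean-based estimating equation, so the MLE is λ̂ = 1/x̄. A sketch comparing a direct numerical maximization of the log-likelihood with this closed form (NumPy only; the grid search and simulated data are illustrative assumptions):

```python
import numpy as np

# Simulate an exponential sample with known rate λ = 3
rng = np.random.default_rng(1)
data = rng.exponential(scale=1 / 3.0, size=5_000)

# Log-likelihood of the exponential model: ℓ(λ) = n·ln λ − λ·Σx
def log_lik(lam):
    return len(data) * np.log(lam) - lam * data.sum()

# Crude numerical maximization over a grid of candidate rates
grid = np.linspace(0.1, 10.0, 10_000)
lambda_grid = grid[np.argmax(log_lik(grid))]

# Closed form from the likelihood equation: λ̂ = 1/x̄
lambda_closed = 1.0 / data.mean()
```

The grid maximizer agrees with the closed form up to the grid spacing, illustrating that maximum likelihood is itself an estimating-equation method: the estimate is the root of the score equation.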