
Lecture: Optimality and Asymptotics

Description

This lecture discusses the optimality of the Least Squares Estimator (LSE) in the Gaussian Linear Model, showing that it is the best unbiased estimator. It explores the Gauss-Markov Theorem, which states that the LSE is the best linear unbiased estimator. The concept of optimality is further examined under weaker assumptions, focusing on uncorrelatedness. The lecture also delves into the large sample distribution of the estimator, emphasizing its behavior as the sample size grows. Various theorems and proofs are presented to support the discussion, shedding light on the distribution properties of the estimator under different conditions.
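The LSE discussed above can be sketched numerically. The following is a minimal illustration (not from the lecture itself — the data, sample size, and coefficients are invented for the example) of computing the least squares estimator in a Gaussian linear model via the normal equations.

```python
import numpy as np

# Hypothetical illustration of the least squares estimator (LSE) in the
# Gaussian linear model y = X beta + eps, computed from the normal
# equations: beta_hat = (X'X)^{-1} X'y.
rng = np.random.default_rng(0)
n, p = 100, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta = np.array([1.0, 2.0, -0.5])
y = X @ beta + rng.normal(scale=0.3, size=n)

# Solve the normal equations rather than inverting X'X explicitly.
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
```

With moderate noise, `beta_hat` lands close to the true `beta`, consistent with the unbiasedness discussed in the lecture.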


In course

MATH-341: Linear models

Regression modelling is a fundamental tool of statistics, because it describes how the law of a random variable of interest may depend on other variables. This course aims to familiarize students with

Related concepts (56)

Estimator

In statistics, an estimator is a rule for calculating an estimate of a given quantity based on observed data: thus the rule (the estimator), the quantity of interest (the estimand) and its result (the estimate) are distinguished. For example, the sample mean is a commonly used estimator of the population mean. There are point and interval estimators. The point estimators yield single-valued results. This is in contrast to an interval estimator, where the result would be a range of plausible values.
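The point/interval distinction can be made concrete with a short sketch (the data and the normal-approximation 95% interval are assumptions for illustration, not content from the page):

```python
import numpy as np

# Sketch: the sample mean as a point estimator of the population mean,
# and a normal-approximation 95% confidence interval as the
# corresponding interval estimator (a range of plausible values).
rng = np.random.default_rng(1)
sample = rng.normal(loc=5.0, scale=2.0, size=500)

point_estimate = sample.mean()                   # single-valued result
se = sample.std(ddof=1) / np.sqrt(sample.size)   # standard error
interval_estimate = (point_estimate - 1.96 * se,
                     point_estimate + 1.96 * se)
```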

Generalized least squares

In statistics, generalized least squares (GLS) is a method used to estimate the unknown parameters in a linear regression model when there is a certain degree of correlation between the residuals in the regression model. Compared with ordinary least squares and weighted least squares, GLS can be more statistically efficient and help prevent misleading inferences. GLS was first described by Alexander Aitken in 1935. In standard linear regression models one observes data on n statistical units.
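A minimal sketch of the GLS estimator, assuming a known diagonal error covariance (the heteroskedastic setup here is invented for illustration): the estimate is equivalent to whitening the data and running ordinary least squares.

```python
import numpy as np

# Assumed setup: errors with known diagonal covariance Sigma = diag(w).
# GLS estimator: beta_gls = (X' Sigma^{-1} X)^{-1} X' Sigma^{-1} y.
rng = np.random.default_rng(2)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])
weights = rng.uniform(0.5, 3.0, size=n)          # per-observation variances
y = X @ np.array([1.0, 2.0]) + rng.normal(size=n) * np.sqrt(weights)

Sigma_inv = np.diag(1.0 / weights)
beta_gls = np.linalg.solve(X.T @ Sigma_inv @ X, X.T @ Sigma_inv @ y)

# Equivalent view: whiten by Sigma^{-1/2}, then run ordinary least squares.
W = np.diag(1.0 / np.sqrt(weights))
beta_ols_whitened = np.linalg.lstsq(W @ X, W @ y, rcond=None)[0]
```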

Invariant estimator

In statistics, the concept of being an invariant estimator is a criterion that can be used to compare the properties of different estimators for the same quantity. It is a way of formalising the idea that an estimator should have certain intuitively appealing qualities. Strictly speaking, "invariant" would mean that the estimates themselves are unchanged when both the measurements and the parameters are transformed in a compatible way, but the meaning has been extended to allow the estimates to change in appropriate ways with such transformations.

Generalized linear model

In statistics, a generalized linear model (GLM) is a flexible generalization of ordinary linear regression. The GLM generalizes linear regression by allowing the linear model to be related to the response variable via a link function and by allowing the magnitude of the variance of each measurement to be a function of its predicted value. Generalized linear models were formulated by John Nelder and Robert Wedderburn as a way of unifying various other statistical models, including linear regression, logistic regression and Poisson regression.
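The role of the link function can be sketched as follows (logistic regression with the logit link is used as an assumed example; the data are simulated, and no fitting is shown):

```python
import numpy as np

# GLM structure: the linear predictor eta = X @ beta is tied to the mean
# response mu through a link function g, i.e. g(mu) = eta. For logistic
# regression the link is the logit, so mu = 1 / (1 + exp(-eta)).
rng = np.random.default_rng(3)
X = np.column_stack([np.ones(50), rng.normal(size=50)])
beta = np.array([-0.5, 1.5])
eta = X @ beta                      # linear predictor
mu = 1.0 / (1.0 + np.exp(-eta))    # inverse link: success probabilities
y = rng.binomial(1, mu)            # Bernoulli responses with mean mu
```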

Bias of an estimator

In statistics, the bias of an estimator (or bias function) is the difference between this estimator's expected value and the true value of the parameter being estimated. An estimator or decision rule with zero bias is called unbiased. Bias is a property of an estimator, distinct from consistency: consistent estimators converge in probability to the true value of the parameter, but may be biased or unbiased; see bias versus consistency for more.
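A standard example of bias is the sample variance: dividing by n underestimates the true variance on average, while Bessel's correction (dividing by n − 1) removes the bias. The simulation below (parameters chosen for illustration) checks this empirically.

```python
import numpy as np

# The sample variance with divisor n is a biased estimator of sigma^2
# (its expectation is sigma^2 * (n-1)/n); the divisor n-1 is unbiased.
rng = np.random.default_rng(4)
sigma2 = 4.0
n = 10
reps = 20000
samples = rng.normal(scale=np.sqrt(sigma2), size=(reps, n))

biased_mean = samples.var(axis=1, ddof=0).mean()     # ~ sigma2 * (n-1)/n
unbiased_mean = samples.var(axis=1, ddof=1).mean()   # ~ sigma2
```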

Related lectures (184)

Elements of Statistics: Probability, Distributions, and Estimation (MICRO-428: Metrology)

Covers probability theory, distributions, and estimation in statistics, emphasizing accuracy, precision, and resolution of measurements.

Linear Regression: Ozone Data Analysis (MATH-131: Probability and statistics)

Explores linear regression analysis of ozone data using statistical models.

Probabilistic Models for Linear Regression (PHYS-467: Machine learning for physicists)

Covers the probabilistic model for linear regression and its applications in nuclear magnetic resonance and X-ray imaging.

Linear Regression: Mean-square-error Inference (EE-566: Adaptation and learning)

Covers the MSE problem in linear regression models, focusing on the optimal estimator and data fusion methods.

Linear Regression Basics (FIN-403: Econometrics)

Covers the basics of linear regression, including OLS, heteroskedasticity, autocorrelation, instrumental variables, Maximum Likelihood Estimation, time series analysis, and practical advice.