Lecture

# Modern Regression: Maximum Likelihood Estimation

Description

This lecture covers the maximum likelihood estimation (MLE) approach in modern regression, focusing on the log-likelihood function and the profile log-likelihood for estimating the parameters β and σ². It discusses the MLEs of β and σ², the Newton-Raphson and EM algorithms for optimization, and inference on the coefficients. The lecture also introduces the concept of quasi-likelihood and its application in model comparison using deviance. Additionally, it explores the REML estimation method and its advantages over traditional MLE, emphasizing the importance of correct model specification and efficient estimation in large samples.
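The quantities mentioned above have closed forms in the normal linear model. As a minimal sketch (not taken from the lecture slides; the data here are simulated for illustration), the MLE of β coincides with the least-squares estimate, the MLE of σ² divides the residual sum of squares by n, and the REML-style estimate divides by n − p:

```python
import numpy as np

# Simulated data for the normal linear model y = X @ beta + eps,
# eps ~ N(0, sigma2 * I). All names here are illustrative.
rng = np.random.default_rng(0)
n, p = 200, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + rng.normal(scale=0.7, size=n)

# MLE of beta equals the ordinary least-squares solution
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

rss = np.sum((y - X @ beta_hat) ** 2)
sigma2_mle = rss / n          # MLE of sigma^2: divides by n (biased downward)
sigma2_reml = rss / (n - p)   # REML-style estimate: divides by n - p (unbiased)
```

The n versus n − p divisor is the simplest instance of the MLE/REML contrast the lecture discusses: REML adjusts for the degrees of freedom consumed by estimating β.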


This page is automatically generated and may contain information that is not correct, complete, up-to-date, or relevant to your search query. The same applies to every other page on this website. Please make sure to verify the information with EPFL's official sources.

In course

MATH-408: Regression methods

General graduate course on regression methods

Related concepts (109)

Likelihood function

In statistical inference, the likelihood function quantifies the plausibility of parameter values characterizing a statistical model in light of observed data. Its most typical usage is to compare possible parameter values (under a fixed set of observations and a particular model): higher likelihood values are preferred because they correspond to parameter values under which the observed data are more probable.
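A minimal illustration of this comparison (an assumed toy example, not from the page): for a Bernoulli sample with 4 successes in 5 trials, the likelihood at p = 0.8 exceeds the likelihood at p = 0.5, because the data are more probable under p = 0.8.

```python
# Toy Bernoulli sample: 4 successes out of 5 trials
data = [1, 1, 0, 1, 1]

def likelihood(p, data):
    # Product of Bernoulli probabilities: p for a success, 1 - p for a failure
    out = 1.0
    for x in data:
        out *= p if x == 1 else (1.0 - p)
    return out

# 0.8^4 * 0.2 = 0.08192  versus  0.5^5 = 0.03125
lik_high = likelihood(0.8, data)
lik_low = likelihood(0.5, data)
```

Here p = 0.8 matches the observed success frequency, which is exactly why it maximizes the likelihood over all p in [0, 1].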

Estimation theory

Estimation theory is a branch of statistics that deals with estimating the values of parameters based on measured empirical data that has a random component. The parameters describe an underlying physical setting in such a way that their value affects the distribution of the measured data. An estimator attempts to approximate the unknown parameters using the measurements.

Linear algebra

Linear algebra is the branch of mathematics concerning linear equations such as a₁x₁ + ⋯ + aₙxₙ = b, linear maps such as (x₁, …, xₙ) ↦ a₁x₁ + ⋯ + aₙxₙ, and their representations in vector spaces and through matrices. Linear algebra is central to almost all areas of mathematics. For instance, linear algebra is fundamental in modern presentations of geometry, including for defining basic objects such as lines, planes and rotations. Also, functional analysis, a branch of mathematical analysis, may be viewed as the application of linear algebra to spaces of functions.

Linear function

In mathematics, the term linear function refers to two distinct but related notions. In calculus and related areas, a linear function is a function whose graph is a straight line, that is, a polynomial function of degree zero or one. To distinguish such a linear function from the other concept, the term affine function is often used. In linear algebra, mathematical analysis, and functional analysis, a linear function is a linear map.

Maximum likelihood estimation

In statistics, maximum likelihood estimation (MLE) is a method of estimating the parameters of an assumed probability distribution, given some observed data. This is achieved by maximizing a likelihood function so that, under the assumed statistical model, the observed data is most probable. The point in the parameter space that maximizes the likelihood function is called the maximum likelihood estimate. The logic of maximum likelihood is both intuitive and flexible, and as such the method has become a dominant means of statistical inference.
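When the maximizer has no closed form, the lecture's Newton-Raphson iteration applies directly to the log-likelihood. A hedged sketch (illustrative data; the Poisson model is chosen because its MLE is known exactly, so the iteration can be checked against the closed-form answer, the sample mean):

```python
# Newton-Raphson for the MLE of a Poisson rate lambda.
# Log-likelihood (up to a constant): l(lam) = s * log(lam) - n * lam,
# where s is the sum of the observations and n the sample size.
data = [2, 3, 1, 4, 2, 3]
n, s = len(data), sum(data)

def score(lam):
    # First derivative of the log-likelihood: s / lam - n
    return s / lam - n

def hessian(lam):
    # Second derivative: -s / lam^2 (always negative, so l is concave)
    return -s / lam ** 2

lam = 1.0  # starting value
for _ in range(25):
    lam -= score(lam) / hessian(lam)  # Newton-Raphson update
```

The iteration converges to s / n, the sample mean, which is the known closed-form Poisson MLE; in regression the same update is applied to the vector β using the score vector and Hessian matrix.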

Related lectures (12)

Probabilistic Models for Linear Regression

Covers the probabilistic model for linear regression and its applications in nuclear magnetic resonance and X-ray imaging.

Linear Regression: Basics and Estimation

Covers the basics of linear regression and how to solve estimation problems using least squares and matrix notation.

Linear Regression: Statistical Inference and Regularization

Covers the probabilistic model for linear regression and the importance of regularization techniques.

Generative Models: Self-Attention and Transformers

Covers generative models with a focus on self-attention and transformers, discussing sampling methods and empirical means.

Probabilistic Linear Regression

Explores probabilistic linear regression, covering joint and conditional probability, ridge regression, and overfitting mitigation.