
Lecture

# Linear Regression: Statistical Inference and Regularization

Description

This lecture covers the probabilistic model for linear regression, statistical inference, maximum likelihood and maximum a posteriori estimators, regularization techniques like LASSO, and the importance of variable selection in the context of machine learning.
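To make the connection between MAP estimation and regularization concrete, here is a minimal sketch on synthetic data (the data and the value of the regularization strength are illustrative assumptions, not from the lecture): under a Gaussian prior on the weights, the MAP estimator of a linear model is ridge regression, which has a closed-form solution.

```python
import numpy as np

# Hedged sketch: ridge regression as the MAP estimator under a Gaussian
# prior on the weights. Synthetic data; lam is an illustrative choice.
rng = np.random.default_rng(0)
n, d = 50, 3
X = rng.normal(size=(n, d))
w_true = np.array([1.5, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=n)

lam = 0.1  # regularization strength (noise variance / prior variance ratio)
# MAP / ridge solution: w = (X^T X + lam I)^{-1} X^T y
w_map = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)
print(w_map)
```

With a Laplace prior instead of a Gaussian one, the same MAP derivation yields the L1-penalized LASSO, which has no closed form but induces sparsity.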


In course

PHYS-467: Machine learning for physicists

Machine learning and data analysis are becoming increasingly central in the sciences, including physics. In this course, fundamental principles and methods of machine learning will be introduced and practiced.

Related concepts (159)

Logistic regression

In statistics, the logistic model (or logit model) is a statistical model that models the probability of an event taking place by having the log-odds for the event be a linear combination of one or more independent variables. In regression analysis, logistic regression (or logit regression) is estimating the parameters of a logistic model (the coefficients in the linear combination).
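A minimal numeric sketch of the definition above (the coefficients and inputs are made up for illustration): the logit model sets the log-odds of the event equal to a linear combination of the independent variables, and inverting the logit gives the event probability.

```python
import numpy as np

# Illustrative coefficients (assumptions, not from the source).
beta0 = -1.0                 # intercept
beta = np.array([2.0, 0.5])  # coefficients of the linear combination
x = np.array([1.0, 3.0])     # independent variables

log_odds = beta0 + beta @ x          # linear in the inputs
p = 1.0 / (1.0 + np.exp(-log_odds))  # logistic (inverse-logit) function
print(log_odds, p)
```

Applying the logit `log(p / (1 - p))` to the probability recovers the linear combination, which is exactly the sense in which the model is linear in the log-odds.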

Elastic net regularization

In statistics and, in particular, in the fitting of linear or logistic regression models, the elastic net is a regularized regression method that linearly combines the L1 and L2 penalties of the lasso and ridge methods. The elastic net overcomes limitations of the LASSO (least absolute shrinkage and selection operator) method, which uses a penalty function based on the L1 norm of the coefficients. Use of this penalty function has several limitations. For example, in the "large p, small n" case (high-dimensional data with few examples), the LASSO selects at most n variables before it saturates.
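The combined penalty described above can be sketched in a few lines. The parameterization below (an overall strength `lam` and a mixing weight `alpha`) is one common convention; texts and libraries differ on the exact form, so treat it as an assumption.

```python
import numpy as np

def elastic_net_penalty(beta, lam, alpha):
    """Elastic net penalty: a convex combination of L1 (lasso) and
    L2 (ridge) terms. alpha=1 recovers the pure lasso penalty,
    alpha=0 the pure ridge penalty; lam scales the overall strength."""
    l1 = np.sum(np.abs(beta))
    l2 = np.sum(beta ** 2)
    return lam * (alpha * l1 + (1 - alpha) / 2 * l2)

beta = np.array([1.0, -2.0, 0.0])
print(elastic_net_penalty(beta, lam=0.5, alpha=1.0))  # pure L1 term
print(elastic_net_penalty(beta, lam=0.5, alpha=0.0))  # pure L2 term
```

Because the L2 term keeps the objective strictly convex, the elastic net can select more than n variables in the "large p, small n" regime where the pure LASSO saturates.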

Robust regression

In robust statistics, robust regression seeks to overcome some limitations of traditional regression analysis. A regression analysis models the relationship between one or more independent variables and a dependent variable. Standard types of regression, such as ordinary least squares, have favourable properties if their underlying assumptions are true, but can give misleading results otherwise (i.e. are not robust to assumption violations).
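One standard way to make regression robust to assumption violations is to replace the squared loss with a loss that grows only linearly for large residuals. The Huber loss below is a common such choice (used here as an illustration; the snippet above does not commit to a particular loss).

```python
import numpy as np

def huber(r, delta=1.0):
    """Huber loss: quadratic for small residuals (|r| <= delta),
    linear beyond, so a single outlier cannot dominate the fit."""
    small = np.abs(r) <= delta
    return np.where(small, 0.5 * r ** 2, delta * (np.abs(r) - 0.5 * delta))

residuals = np.array([0.1, 0.5, 5.0])
print(huber(residuals))  # the outlier at 5.0 contributes linearly
```

Under the squared loss, the residual of 5.0 would contribute 12.5; under the Huber loss it contributes only 4.5, which is what limits the influence of outliers.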

Loss function

In mathematical optimization and decision theory, a loss function or cost function (sometimes also called an error function) is a function that maps an event or values of one or more variables onto a real number intuitively representing some "cost" associated with the event. An optimization problem seeks to minimize a loss function. An objective function is either a loss function or its opposite (in specific domains, variously called a reward function, a profit function, a utility function, a fitness function, etc.).
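A tiny worked illustration of the definition (the data are made up): the squared-error loss maps a prediction to a real-valued cost, and minimizing it over a constant prediction yields the sample mean.

```python
import numpy as np

# Illustrative data (an assumption, not from the source).
y = np.array([1.0, 2.0, 6.0])

def squared_loss(pred, y):
    """Mean squared error of a constant prediction `pred`."""
    return np.mean((y - pred) ** 2)

# Minimize the loss over a grid of candidate constant predictions.
grid = np.linspace(0, 10, 1001)
best = grid[np.argmin([squared_loss(p, y) for p in grid])]
print(best)  # the minimizer of the squared loss is the sample mean
```

Different losses encode different costs and hence different minimizers: the absolute loss on the same data would be minimized at the median, 2.0, rather than the mean.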

Instrumental variables estimation

In statistics, econometrics, epidemiology and related disciplines, the method of instrumental variables (IV) is used to estimate causal relationships when controlled experiments are not feasible or when a treatment is not successfully delivered to every unit in a randomized experiment. Intuitively, IVs are used when an explanatory variable of interest is correlated with the error term, in which case ordinary least squares and ANOVA give biased results.
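The bias described above, and how an instrument removes it, can be sketched with two-stage least squares (2SLS) on synthetic data. Everything below (the data-generating process, the true effect of 2.0) is an illustrative assumption.

```python
import numpy as np

# Synthetic setup: x is correlated with the error u (endogenous), so OLS
# is biased; z is an instrument (correlated with x, independent of u).
rng = np.random.default_rng(1)
n = 100_000
z = rng.normal(size=n)           # instrument
u = rng.normal(size=n)           # confounding error term
x = z + u + rng.normal(size=n)   # endogenous regressor
y = 2.0 * x + u                  # true causal effect is 2.0

# Stage 1: project x onto the instrument. Stage 2: regress y on fitted x.
x_hat = z * (z @ x) / (z @ z)
beta_2sls = (x_hat @ y) / (x_hat @ x_hat)
beta_ols = (x @ y) / (x @ x)

print(beta_2sls)  # near the true effect, 2.0
print(beta_ols)   # biased upward by the x-u correlation
```

In this setup the OLS slope converges to about 7/3 rather than 2, because cov(x, u) ≠ 0; the instrument isolates only the variation in x that is unrelated to u.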

Related lectures (1,000)

Probabilistic Models for Linear Regression (PHYS-467: Machine learning for physicists)

Covers the probabilistic model for linear regression and its applications in nuclear magnetic resonance and X-ray imaging.

Probabilistic Linear Regression (PHYS-467: Machine learning for physicists)

Explores probabilistic linear regression, covering joint and conditional probability, ridge regression, and overfitting mitigation.

Linear Regression: Basics and Estimation (PHYS-467: Machine learning for physicists)

Covers the basics of linear regression and how to solve estimation problems using least squares and matrix notation.

Linear Regression: Statistical Inference Perspective (PHYS-467: Machine learning for physicists)

Explores linear regression from a statistical inference perspective, covering probabilistic models, ground truth, labels, and maximum likelihood estimators.

Regression: Simple and Multiple Linear

Covers simple and multiple linear regression, including least squares estimation and model diagnostics.