
# Linear Regression: Basics and Estimation

## Description

This lecture covers the fundamentals of linear regression, focusing on the basics of unsupervised learning, dimensionality reduction, and the estimation and inference process. The instructor explains how to solve linear regression problems using least squares and matrix notation.
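The least-squares solution in matrix notation mentioned above can be sketched as follows (an illustrative example, not the lecture's actual code; the data here are synthetic):

```python
import numpy as np

# Illustrative sketch: ordinary least squares in matrix notation,
# beta_hat = (X^T X)^{-1} X^T y, on synthetic data.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(50), rng.normal(size=50)])  # intercept + one feature
true_beta = np.array([2.0, -3.0])
y = X @ true_beta + 0.1 * rng.normal(size=50)            # noisy observations

# Solve the normal equations X^T X beta = X^T y
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
```

With low noise and a well-conditioned design matrix, `beta_hat` recovers the true coefficients closely.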

## Official source

This page is automatically generated and may contain information that is not correct, complete, up-to-date, or relevant to your search query. The same applies to every other page on this website. Please make sure to verify the information with EPFL's official sources.

## In course

PHYS-467: Machine learning for physicists

Machine learning and data analysis are becoming increasingly central in the sciences, including physics. In this course, fundamental principles and methods of machine learning will be introduced and practiced…

## Related concepts (112)

### Residual sum of squares

In statistics, the residual sum of squares (RSS), also known as the sum of squared residuals (SSR) or the sum of squared estimate of errors (SSE), is the sum of the squares of residuals (deviations predicted from actual empirical values of data). It is a measure of the discrepancy between the data and an estimation model, such as a linear regression. A small RSS indicates a tight fit of the model to the data. It is used as an optimality criterion in parameter selection and model selection.
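As a minimal sketch of the definition above, the RSS is the squared discrepancy between observed values and a model's predictions (the numbers here are hypothetical):

```python
import numpy as np

# Hypothetical observed values and model predictions
y_actual = np.array([3.0, 5.0, 7.0, 9.0])
y_predicted = np.array([2.8, 5.1, 7.3, 8.9])

# RSS: sum of squared residuals (observed minus predicted)
residuals = y_actual - y_predicted
rss = np.sum(residuals ** 2)  # 0.04 + 0.01 + 0.09 + 0.01 = 0.15
```

A smaller RSS means the predictions track the data more tightly, which is why it serves as an optimality criterion.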

### Linear least squares

Linear least squares (LLS) is the least squares approximation of linear functions to data. It is a set of formulations for solving statistical problems that arise in linear regression, including variants for ordinary (unweighted), weighted, and generalized (correlated) residuals. Numerical methods for linear least squares include inverting the matrix of the normal equations and orthogonal decomposition methods. Of the three main formulations, ordinary least squares (OLS) is the most common estimator.
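The two numerical routes named above, normal equations and orthogonal decomposition, can be compared on the same synthetic problem (a sketch under assumed random data, not prescribed by the course):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3))
y = rng.normal(size=100)

# Route 1: normal equations, X^T X beta = X^T y
beta_normal = np.linalg.solve(X.T @ X, X.T @ y)

# Route 2: orthogonal (QR) decomposition, X = QR, then R beta = Q^T y
Q, R = np.linalg.qr(X)
beta_qr = np.linalg.solve(R, Q.T @ y)
```

Both routes give the same OLS solution; the QR route is preferred in practice because it avoids squaring the condition number of `X`.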

### Point estimation

In statistics, point estimation involves the use of sample data to calculate a single value (known as a point estimate since it identifies a point in some parameter space) which is to serve as a "best guess" or "best estimate" of an unknown population parameter (for example, the population mean). More formally, it is the application of a point estimator to the data to obtain a point estimate. Point estimation can be contrasted with interval estimation: such interval estimates are typically either confidence intervals, in the case of frequentist inference, or credible intervals, in the case of Bayesian inference.
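A standard instance of the idea above is the sample mean as a point estimator of an unknown population mean (an illustrative sketch with a simulated population, mu = 10 by assumption):

```python
import numpy as np

# Simulate draws from a population with unknown mean mu = 10
rng = np.random.default_rng(42)
sample = rng.normal(loc=10.0, scale=2.0, size=10_000)

# The sample mean is a single "best guess" (point estimate) for mu
point_estimate = sample.mean()
```

An interval estimate, by contrast, would report a range around this value (e.g. a confidence interval) rather than one number.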

### Partition of sums of squares

The partition of sums of squares is a concept that permeates much of inferential statistics and descriptive statistics. More properly, it is the partitioning of sums of squared deviations or errors. Mathematically, the sum of squared deviations is an unscaled, or unadjusted measure of dispersion (also called variability). When scaled for the number of degrees of freedom, it estimates the variance, or spread of the observations about their mean value.
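For an OLS fit that includes an intercept, the partition is exact: the total sum of squares splits into an explained part and a residual part. A sketch on synthetic data (assumed, not from the lecture):

```python
import numpy as np

rng = np.random.default_rng(7)
x = rng.normal(size=200)
y = 1.0 + 2.0 * x + rng.normal(size=200)

# Fit y = b0 + b1 * x by least squares
X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ beta

# Partition: total SS = explained SS + residual SS
ss_total = np.sum((y - y.mean()) ** 2)
ss_explained = np.sum((y_hat - y.mean()) ** 2)
ss_residual = np.sum((y - y_hat) ** 2)
```

Dividing each term by its degrees of freedom turns these unscaled dispersions into variance estimates, as the paragraph above notes.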

### Linear regression

In statistics, linear regression is a linear approach for modelling the relationship between a scalar response and one or more explanatory variables (also known as dependent and independent variables). The case of one explanatory variable is called simple linear regression; for more than one, the process is called multiple linear regression. This term is distinct from multivariate linear regression, where multiple correlated dependent variables are predicted, rather than a single scalar variable.
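The simple (one explanatory variable) case admits textbook closed-form estimates for slope and intercept; a minimal sketch with hypothetical data:

```python
import numpy as np

# Hypothetical data: one explanatory variable x, scalar response y
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 4.0, 6.2, 7.9, 10.1])

# Closed-form simple linear regression estimates:
# slope = cov(x, y) / var(x), intercept = mean(y) - slope * mean(x)
slope = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
intercept = y.mean() - slope * x.mean()
```

Multiple linear regression generalizes this by stacking the explanatory variables into a design matrix and solving the corresponding least-squares problem.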

## Related lectures (1,000)

### Generative Models: Self-Attention and Transformers (PHYS-467: Machine learning for physicists)

Covers generative models with a focus on self-attention and transformers, discussing sampling methods and empirical means.

### Linear Regression (PHYS-467: Machine learning for physicists)

Covers linear regression for estimating train speed using least squares and regularization.

### Linear Regression: Statistical Inference and Regularization (PHYS-467: Machine learning for physicists)

Covers the probabilistic model for linear regression and the importance of regularization techniques.

### Probabilistic Models for Linear Regression (PHYS-467: Machine learning for physicists)

Covers the probabilistic model for linear regression and its applications in nuclear magnetic resonance and X-ray imaging.

### Probabilistic Linear Regression (PHYS-467: Machine learning for physicists)

Explores probabilistic linear regression, covering joint and conditional probability, ridge regression, and overfitting mitigation.