
Lecture: Maximum Likelihood and Regularization

Description

This lecture covers the concepts of Maximum Likelihood and Regularization in machine learning. It starts by discussing the probabilistic interpretation of the least-squares problem, the Gaussian distribution, and independence. The instructor explains how to define cost functions using the log-likelihood, then covers the properties of Maximum Likelihood Estimation (MLE) and the benefits of using MLE in model design. Finally, the lecture explores the L2 and L1 regularization techniques, Ridge Regression and the Lasso, for preventing overfitting in linear models.
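The link between the Gaussian likelihood and least squares mentioned above can be sketched numerically (a minimal NumPy illustration with made-up data, not material from the lecture): under i.i.d. Gaussian noise, maximizing the likelihood is equivalent to minimizing the sum of squared residuals, so the MLE for the weights coincides with the ordinary least-squares solution.

```python
import numpy as np

# Illustrative data (assumed, not from the lecture): y = Xw + Gaussian noise
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
w_true = np.array([1.5, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=100)

# Maximizing the Gaussian log-likelihood in w reduces to minimizing
# ||y - Xw||^2, whose minimizer solves the normal equations:
w_mle = np.linalg.solve(X.T @ X, X.T @ y)

# lstsq minimizes the squared residuals directly; the two agree
w_ls, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.allclose(w_mle, w_ls))  # → True
```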



This page is automatically generated and may contain information that is not correct, complete, up-to-date, or relevant to your search query. The same applies to every other page on this website. Please make sure to verify the information with EPFL's official sources.

In course

CS-433: Machine learning

Machine learning methods are becoming increasingly central in many sciences and applications. In this course, fundamental principles and methods of machine learning will be introduced, analyzed and practically applied.

Instructors (2)

Related concepts (185)

Ridge regression

Ridge regression is a method of estimating the coefficients of multiple-regression models in scenarios where the independent variables are highly correlated. It has been used in many fields including econometrics, chemistry, and engineering. Also known as Tikhonov regularization, named for Andrey Tikhonov, it is a method of regularization of ill-posed problems. It is particularly useful to mitigate the problem of multicollinearity in linear regression, which commonly occurs in models with large numbers of parameters.
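As a hedged sketch of the closed form behind Tikhonov regularization (illustrative NumPy code with assumed data, not part of the article): the ridge estimate w = (XᵀX + λI)⁻¹Xᵀy stays stable even when the columns of X are nearly collinear, which is exactly the multicollinearity setting described above.

```python
import numpy as np

# Assumed synthetic data: two nearly collinear predictors
rng = np.random.default_rng(1)
n = 50
x1 = rng.normal(size=n)
x2 = x1 + 1e-6 * rng.normal(size=n)        # almost a copy of x1
X = np.column_stack([x1, x2])
y = X @ np.array([1.0, 1.0]) + 0.1 * rng.normal(size=n)

# Ridge closed form: the lam * I term makes X^T X + lam * I well
# conditioned even though X^T X itself is nearly singular
lam = 1.0
w_ridge = np.linalg.solve(X.T @ X + lam * np.eye(2), X.T @ y)
# The penalty keeps both coefficients small and stable (each near 1)
```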

Regularization (mathematics)

In mathematics, statistics, finance, and computer science, particularly in machine learning and inverse problems, regularization is a process that converts the answer to a problem into a simpler one. It is often used to obtain results for ill-posed problems or to prevent overfitting. Although regularization procedures can be divided up in many ways, the following delineation is particularly helpful: explicit regularization is regularization whenever one explicitly adds a term to the optimization problem.
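One way to see what an explicitly added penalty term does is the L1 (lasso) case, solved here by proximal gradient descent (ISTA). This is a self-contained sketch with synthetic data; the algorithm choice is mine, not the article's. The added λ‖w‖₁ term drives small coefficients to exactly zero.

```python
import numpy as np

# Proximal operator of t * |.|: shrinks toward zero, clips small values to 0
def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

# Assumed sparse ground truth: only 3 of 10 coefficients are nonzero
rng = np.random.default_rng(2)
X = rng.normal(size=(80, 10))
w_true = np.zeros(10)
w_true[:3] = [2.0, -1.5, 1.0]
y = X @ w_true + 0.05 * rng.normal(size=80)

# ISTA for 0.5 * ||Xw - y||^2 + lam * ||w||_1
lam = 0.1
step = 1.0 / np.linalg.norm(X, ord=2) ** 2  # 1/L, L = Lipschitz const of grad
w = np.zeros(10)
for _ in range(500):
    grad = X.T @ (X @ w - y)
    w = soft_threshold(w - step * grad, step * lam)
# The L1 penalty sets the irrelevant coefficients exactly to zero
```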

Linear regression

In statistics, linear regression is a linear approach for modelling the relationship between a scalar response and one or more explanatory variables (also known as dependent and independent variables). The case of one explanatory variable is called simple linear regression; for more than one, the process is called multiple linear regression. This term is distinct from multivariate linear regression, where multiple correlated dependent variables are predicted, rather than a single scalar variable.

Regularized least squares

Regularized least squares (RLS) is a family of methods for solving the least-squares problem while using regularization to further constrain the resulting solution. RLS is used for two main reasons. The first comes up when the number of variables in the linear system exceeds the number of observations. In such settings, the ordinary least-squares problem is ill-posed and is therefore impossible to fit because the associated optimization problem has infinitely many solutions.

Least squares

The method of least squares is a standard approach in regression analysis to approximate the solution of overdetermined systems (sets of equations in which there are more equations than unknowns) by minimizing the sum of the squares of the residuals (a residual being the difference between an observed value and the fitted value provided by a model) made in the results of each individual equation. The most important application is in data fitting.
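A tiny worked instance of an overdetermined system (illustrative numbers of my own): five equations in two unknowns, fitted by minimizing the squared residuals. At the minimum the residual vector is orthogonal to the columns of the design matrix, which is exactly the statement of the normal equations.

```python
import numpy as np

# Five observations, two unknowns (intercept and slope): overdetermined
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 1.9, 3.2, 3.9, 5.1])
A = np.column_stack([np.ones_like(x), x])   # design matrix [1, x]

# Minimize ||y - A coef||^2
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
residuals = y - A @ coef

# Normal equations A^T (y - A coef) = 0: residuals are orthogonal
# to every column of A at the least-squares solution
print(np.allclose(A.T @ residuals, 0.0, atol=1e-8))  # → True
```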

Related lectures (354)

Probabilistic Interpretation: Maximum Likelihood

Explores an alternative route to the least-squares problem from a probabilistic perspective, discussing Gaussian distribution, maximum likelihood, and regularization techniques.

Linear Regression and Logistic Regression

Covers linear and logistic regression for regression and classification tasks, focusing on loss functions and model training.

Probabilistic Models for Linear Regression

Covers the probabilistic model for linear regression and its applications in nuclear magnetic resonance and X-ray imaging.

Regularization in Machine Learning

Explores Ridge and Lasso Regression for regularization in machine learning models, emphasizing hyperparameter tuning and visualization of parameter coefficients.

Logistic Regression: Vegetation Prediction

Explores logistic regression for predicting vegetation proportions in the Amazon region through remote sensing data analysis.