This lecture covers Maximum Likelihood and Regularization in machine learning. It begins with the probabilistic interpretation of the least-squares problem, the Gaussian distribution, and the independence assumption, then shows how cost functions can be defined from the log-likelihood. The instructor introduces Ridge Regression (L2 regularization) and the Lasso (L1 regularization), discusses the properties of Maximum Likelihood Estimation (MLE) and its benefits in model design, and explains how these regularization techniques help prevent overfitting in linear models.
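As a rough sketch of the connection the summary refers to (the notation is assumed here, not taken from the lecture slides): under an i.i.d. Gaussian noise model, maximizing the likelihood is equivalent to minimizing the least-squares cost, and Ridge/Lasso simply add an L2/L1 penalty to that cost.

```latex
% Model: y_n = x_n^T w + eps_n, with eps_n ~ N(0, sigma^2) i.i.d.
\begin{aligned}
p(y \mid X, w) &= \prod_{n=1}^{N} \mathcal{N}\!\left(y_n \mid x_n^\top w,\, \sigma^2\right) \\
-\log p(y \mid X, w) &= \frac{1}{2\sigma^2} \sum_{n=1}^{N} \left(y_n - x_n^\top w\right)^2
  + \frac{N}{2}\log\!\left(2\pi\sigma^2\right) \\
\hat{w}_{\text{MLE}} &= \arg\min_{w}\; \frac{1}{2N} \sum_{n=1}^{N} \left(y_n - x_n^\top w\right)^2 \\
\hat{w}_{\text{Ridge}} &= \arg\min_{w}\; \frac{1}{2N} \sum_{n=1}^{N} \left(y_n - x_n^\top w\right)^2 + \lambda \|w\|_2^2 \\
\hat{w}_{\text{Lasso}} &= \arg\min_{w}\; \frac{1}{2N} \sum_{n=1}^{N} \left(y_n - x_n^\top w\right)^2 + \lambda \|w\|_1
\end{aligned}
```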
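For the regularization part, the following is a minimal, hypothetical NumPy sketch (not the lecture's code) contrasting ordinary least squares with the closed-form ridge solution on an overfitting-prone problem; the Lasso has no closed form and is normally fit with an iterative solver such as coordinate descent.

```python
# Sketch only: compare ordinary least squares with ridge regression
# on a problem with few samples and many features, where least squares overfits.
import numpy as np

rng = np.random.default_rng(0)

N, D = 20, 50                              # fewer samples than features
X = rng.normal(size=(N, D))
w_true = np.zeros(D)
w_true[:5] = rng.normal(size=5)            # only a few informative features
y = X @ w_true + 0.1 * rng.normal(size=N)

# Ordinary least squares (minimum-norm solution via lstsq).
w_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# Ridge regression: minimize ||y - Xw||^2 + lam * ||w||_2^2,
# closed form w = (X^T X + lam * I)^{-1} X^T y.
lam = 1.0
w_ridge = np.linalg.solve(X.T @ X + lam * np.eye(D), X.T @ y)

# Evaluate on fresh data drawn from the same model.
X_test = rng.normal(size=(200, D))
y_test = X_test @ w_true + 0.1 * rng.normal(size=200)

def test_mse(w):
    return float(np.mean((X_test @ w - y_test) ** 2))

print("test MSE, least squares:", test_mse(w_ols))
print("test MSE, ridge        :", test_mse(w_ridge))  # typically lower here
```

The L2 penalty shrinks all weights toward zero, which reduces variance on the unseen test data; an L1 penalty (Lasso) additionally drives many weights exactly to zero, producing a sparse model.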