This lecture covers L1 regularization, focusing on replacing the empirical risk with a regularized version. It explains how the L1 penalty leads to sparse solutions, in which only a few significant entries of the observation vector contribute to the estimate, and relates this to dimensionality reduction and the benefits of elastic-net regularization. Mathematical derivations and interpretations connecting L1 penalties to Laplacian priors and the associated optimization problems are also presented.
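As a reference point, a sketch of the standard least-squares forms of these regularizers is given below; the lecture's own notation is not reproduced here, so the symbols $X$, $y$, $w$, $\lambda$, $\sigma$ and $b$ are generic choices rather than the lecture's definitions.

\[
\hat{w}_{\mathrm{L1}} \in \arg\min_{w}\; \frac{1}{2n}\,\|y - Xw\|_2^2 + \lambda \|w\|_1,
\qquad
\hat{w}_{\mathrm{EN}} \in \arg\min_{w}\; \frac{1}{2n}\,\|y - Xw\|_2^2 + \lambda_1 \|w\|_1 + \lambda_2 \|w\|_2^2 .
\]

The Laplacian-prior interpretation follows the usual MAP argument: with a Gaussian likelihood and an independent Laplace prior $p(w_j) \propto \exp(-|w_j|/b)$ on each coefficient, the negative log-posterior is, up to additive constants,
\[
-\log p(w \mid y) \;=\; \frac{1}{2\sigma^2}\,\|y - Xw\|_2^2 + \frac{1}{b}\,\|w\|_1 ,
\]
so MAP estimation under a Laplacian prior coincides with L1-regularized least squares for a suitable choice of $\lambda$.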