Lecture

Other regularizations + the Lasso

Description

This lecture covers a diverse set of regularization approaches, including the L0 quasi-norm and the Lasso method. It discusses best subset selection, the trade-off between goodness of fit and the number of selected variables, and the NP-hardness of that combinatorial problem. The Lasso (Least Absolute Shrinkage and Selection Operator) is introduced as a convex but non-differentiable optimization problem for which efficient algorithms exist. The lecture also explores the extension to quasi-norms and the relationship between the constrained and regularized formulations of the Lasso regression problem.
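As a concrete illustration of the ideas above, the regularized Lasso minimizes 0.5*||Xw - y||^2 + lam*||w||_1, and one simple efficient algorithm for this convex but non-differentiable problem is proximal gradient descent (ISTA), whose proximal step is soft-thresholding. The sketch below is not from the lecture; it is a minimal NumPy implementation under the assumption of the standard squared-loss Lasso objective, with hypothetical function names.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1: shrink each entry toward 0 by t."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def lasso_ista(X, y, lam, n_iters=1000):
    """Sketch of ISTA for min_w 0.5*||Xw - y||^2 + lam*||w||_1.

    (Illustrative only; names and step-size choice are our assumptions,
    not the lecture's notation.)
    """
    n, d = X.shape
    w = np.zeros(d)
    # Step size 1/L, where L = ||X||_2^2 bounds the gradient's Lipschitz constant.
    L = np.linalg.norm(X, 2) ** 2
    for _ in range(n_iters):
        grad = X.T @ (X @ w - y)          # gradient of the smooth part
        w = soft_threshold(w - grad / L, lam / L)  # proximal (shrinkage) step
    return w

# Example usage: recover a sparse coefficient vector from noiseless data.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 10))
w_true = np.zeros(10)
w_true[:3] = [2.0, -3.0, 1.5]
y = X @ w_true
w_hat = lasso_ista(X, y, lam=0.1)
```

The soft-thresholding step is what sets many small coefficients exactly to zero, which is why the Lasso performs variable selection, unlike ridge regression.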
