This lecture covers penalization in ridge regression, where a small multiple of the identity matrix is added to the Gram matrix (the cross-product of the design matrix) to address multicollinearity. The standardization of the design matrix is discussed, along with the interpretation of the coefficients. The lecture then develops the ridge regression technique itself, in which adding a ridge parameter stabilizes the matrix inversion. The concept of shrinkage is explored, showing how pulling coefficients toward zero improves the stability of the fitted model. The lecture also examines the bias-variance trade-off in ridge regression, highlighting the importance of choosing the right amount of penalization. Mathematical proofs and theorems related to ridge regression and shrinkage are presented.
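The mechanics described above can be sketched in a few lines of NumPy. This is a minimal illustration, not material from the lecture: the data, the penalty value, and all variable names are assumptions made for the example. It shows the closed-form ridge estimate, the stabilizing effect of the ridge parameter on the inversion, and the resulting shrinkage of the coefficient vector.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 5

# Illustrative design matrix with two nearly identical columns,
# mimicking the multicollinearity problem the lecture addresses.
X = rng.normal(size=(n, p))
X[:, 1] = X[:, 0] + 0.01 * rng.normal(size=n)
beta_true = np.array([1.0, 1.0, 0.5, 0.0, 0.0])
y = X @ beta_true + 0.1 * rng.normal(size=n)

# Standardize columns so the penalty acts on every coefficient
# on the same scale, as discussed in the lecture.
X = (X - X.mean(axis=0)) / X.std(axis=0)

def ridge(X, y, lam):
    """Closed-form ridge estimate: (X'X + lam * I)^{-1} X'y.

    Adding lam * I to the Gram matrix X'X makes it well conditioned,
    which stabilizes the inversion even under multicollinearity.
    """
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

beta_ols = ridge(X, y, 0.0)     # ordinary least squares (no penalty)
beta_ridge = ridge(X, y, 10.0)  # penalized fit; lam = 10 chosen arbitrarily

# Shrinkage: the penalized coefficient vector has smaller norm.
print(np.linalg.norm(beta_ridge) < np.linalg.norm(beta_ols))
```

Increasing the penalty shrinks the coefficients further, trading additional bias for lower variance; choosing the penalty well is exactly the trade-off the lecture emphasizes.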