This lecture covers the concept of cross-validation, including k-fold cross-validation and leave-one-out methods. It explains how cross-validation helps in model selection and hyper-parameter tuning. The lecture also discusses overfitting with linear models, regularization techniques, and their application in linear regression and logistic regression. Additionally, it explores multi-output ridge regression, kernel ridge regression, and the incorporation of regularization in support vector machines. Practical examples and exercises are provided to reinforce the theoretical concepts.
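As a minimal illustration of the k-fold cross-validation procedure described above, the sketch below uses it to select the regularization strength for ridge regression. This is a self-contained NumPy example with made-up toy data; the function names, candidate values, and dataset are illustrative and not taken from the lecture.

```python
import numpy as np

def ridge_fit(X, y, lam):
    # Closed-form ridge solution: w = (X^T X + lam * I)^{-1} X^T y
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def kfold_cv_mse(X, y, lam, k=5, seed=0):
    # Average validation mean-squared error over k folds
    n = len(y)
    idx = np.random.default_rng(seed).permutation(n)
    folds = np.array_split(idx, k)
    errs = []
    for i in range(k):
        val = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        w = ridge_fit(X[train], y[train], lam)
        errs.append(np.mean((X[val] @ w - y[val]) ** 2))
    return float(np.mean(errs))

# Toy data: y = 3*x1 - 2*x2 + small Gaussian noise (illustrative only)
rng = np.random.default_rng(42)
X = rng.normal(size=(100, 2))
y = X @ np.array([3.0, -2.0]) + 0.1 * rng.normal(size=100)

# Pick the regularization strength with the lowest cross-validated error
lams = [1e-3, 1e-1, 1.0, 10.0]
best_lam = min(lams, key=lambda lam: kfold_cv_mse(X, y, lam))
```

Setting `k = n` (one sample per fold) recovers leave-one-out cross-validation as a special case of the same loop.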