This lecture examines the bias-variance tradeoff in machine learning and the role of hyperparameters in controlling the complexity of the model class. Using polynomial feature expansion in linear regression as a running example, it shows how model complexity affects the fit and how to choose the polynomial degree. The prediction error is decomposed into bias and variance components, and the lecture explains how balancing the two minimizes the true error: too little complexity leads to underfitting (high bias), too much to overfitting (high variance).
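The tradeoff described above can be illustrated with a small sketch (not taken from the lecture itself; the synthetic sine data and the specific degrees are illustrative assumptions). Fitting polynomials of increasing degree to a held-out split typically shows training error falling monotonically while test error first falls, then rises again once the model starts fitting noise:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative synthetic data: noisy sine (an assumption, not the lecture's dataset)
x = rng.uniform(-3, 3, 60)
y = np.sin(x) + rng.normal(0, 0.3, 60)
x_train, y_train = x[:40], y[:40]
x_test, y_test = x[40:], y[40:]

def poly_mse(degree):
    """Fit a polynomial of the given degree on the training split
    and return (train MSE, test MSE)."""
    coeffs = np.polyfit(x_train, y_train, degree)
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    return train_mse, test_mse

for d in (1, 3, 9, 15):
    tr, te = poly_mse(d)
    print(f"degree {d:2d}: train MSE {tr:.3f}, test MSE {te:.3f}")
```

Training MSE can only decrease as the degree grows (lower-degree polynomials are nested in higher-degree ones), so the gap between train and test error is what signals overfitting; in practice the degree is chosen by validation, as the lecture discusses.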
This video is available on Mediaspace for a restricted audience. If you have the necessary permissions, please log in to Mediaspace to access it.
Watch on Mediaspace