This lecture covers the concepts of Kullback-Leibler divergence, regularization, and Bayesian statistics. It explains how these techniques help combat overfitting in machine learning models, focusing on the Bayesian view, which assumes randomness in the data. Examples involving logistic regression and probability calculations are included.
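To make two of the named ideas concrete, the sketch below computes the Kullback-Leibler divergence between discrete distributions and an L2-regularized logistic-regression loss. This is a minimal illustration, not the lecture's own code; the function names and the example numbers are assumptions for demonstration.

```python
import math

def kl_divergence(p, q):
    """KL divergence D(p || q) between two discrete distributions.

    p and q are sequences of probabilities over the same outcomes.
    Terms with p_i == 0 contribute nothing, by the 0 * log 0 = 0 convention.
    """
    return sum(p_i * math.log(p_i / q_i) for p_i, q_i in zip(p, q) if p_i > 0)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def l2_logistic_loss(w, X, y, lam):
    """Negative log-likelihood of a logistic model plus an L2 penalty.

    The penalty term lam * ||w||^2 is one common regularizer used to
    combat overfitting; illustrative data and lam are made up here.
    """
    loss = 0.0
    for x_i, y_i in zip(X, y):
        p = sigmoid(sum(w_j * x_j for w_j, x_j in zip(w, x_i)))
        loss -= y_i * math.log(p) + (1 - y_i) * math.log(1 - p)
    return loss + lam * sum(w_j ** 2 for w_j in w)

# A distribution has zero divergence from itself; differing ones are positive.
print(kl_divergence([0.5, 0.5], [0.5, 0.5]))    # 0.0
print(kl_divergence([0.5, 0.5], [0.25, 0.75]))  # ≈ 0.1438

# With w = 0 every prediction is 0.5, so each example contributes log 2.
print(l2_logistic_loss([0.0, 0.0], [[1.0, 2.0], [3.0, 4.0]], [0, 1], 0.1))
```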