This lecture covers the concept of weak learners in boosting, where simple models are tailored to the problem at hand. It explains the AdaBoost algorithm, how sample weights are updated between rounds, and how the final classification is formed. The drawbacks of boosting, such as overfitting and training time, are discussed. Several simple weak learners are presented, including random projections and full-covariance Gaussians. The lecture also explores variants of boosting that aim to reduce overfitting and increase robustness to noise. Finally, it covers improvements to AdaBoost obtained by changing the error representation and the testing procedure, as well as cascades of weak classifiers, illustrated by the celebrated Viola-Jones detector built on Haar-like wavelets.
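To make the weight-update and final-vote steps concrete, here is a minimal sketch of AdaBoost with decision stumps as weak learners. The function names and the exhaustive stump search are illustrative choices, not the lecture's own code; labels are assumed to be in {-1, +1}.

```python
import math

def train_adaboost(X, y, n_rounds=10):
    """Minimal AdaBoost sketch. X: list of feature vectors, y: labels in {-1, +1}.
    Returns a list of weighted stumps (feature, threshold, polarity, alpha)."""
    n, d = len(X), len(X[0])
    w = [1.0 / n] * n                       # start with uniform sample weights
    ensemble = []
    for _ in range(n_rounds):
        best, best_err = None, float("inf")
        # Exhaustive search over stumps: (feature j, threshold t, polarity s)
        for j in range(d):
            for t in {x[j] for x in X}:
                for s in (1, -1):
                    preds = [s if x[j] <= t else -s for x in X]
                    err = sum(wi for wi, p, yi in zip(w, preds, y) if p != yi)
                    if err < best_err:
                        best_err, best = err, (j, t, s)
        eps = max(best_err, 1e-12)          # guard against a perfect stump
        alpha = 0.5 * math.log((1 - eps) / eps)
        j, t, s = best
        preds = [s if x[j] <= t else -s for x in X]
        # Weight update: misclassified points gain weight, correct ones lose it
        w = [wi * math.exp(-alpha * yi * p) for wi, yi, p in zip(w, y, preds)]
        z = sum(w)
        w = [wi / z for wi in w]
        ensemble.append((j, t, s, alpha))
    return ensemble

def predict_adaboost(ensemble, x):
    """Final classifier: sign of the alpha-weighted vote of the stumps."""
    score = sum(alpha * (s if x[j] <= t else -s) for j, t, s, alpha in ensemble)
    return 1 if score >= 0 else -1
```

On a toy 1D set such as `X = [[0.0], [1.0], [2.0], [3.0]]`, `y = [1, 1, -1, -1]`, a single stump already separates the classes, and the reweighting step leaves the weights uniform in later rounds.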