Lecture

Advanced Machine Learning: Boosting

Description

This lecture covers the concept of weak learners in boosting: simple models, each only slightly better than chance, tailored to the problem at hand. It explains the AdaBoost algorithm, how sample weights are updated after each round, and how the final classification is formed as a weighted vote of the weak learners. Drawbacks of boosting, such as overfitting and long training time, are discussed. Several simple weak learners are presented, including random projections and full-covariance Gaussians. The lecture also explores variants of boosting that aim to reduce overfitting and increase robustness to label noise. Finally, it covers improving AdaBoost by changing the error representation and the testing procedure, as well as cascades of weak classifiers, illustrated by the celebrated Viola-Jones detector based on Haar-like wavelets.
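As a rough illustration of the weight-update and weighted-vote steps described above, here is a minimal AdaBoost sketch using one-dimensional threshold stumps as weak learners. All names (`adaboost_train`, `adaboost_predict`, the toy data) are illustrative assumptions, not material from the lecture itself.

```python
import math

def adaboost_train(X, y, rounds=10):
    """Minimal AdaBoost with 1-D threshold stumps.

    X: list of floats; y: list of labels in {-1, +1}.
    Returns a list of (alpha, threshold, polarity) weak learners.
    """
    n = len(X)
    w = [1.0 / n] * n  # start with uniform sample weights
    learners = []
    for _ in range(rounds):
        # Pick the stump (threshold, polarity) with the lowest weighted error.
        best = None
        for thr in sorted(set(X)):
            for pol in (1, -1):
                err = sum(wi for xi, yi, wi in zip(X, y, w)
                          if pol * (1 if xi >= thr else -1) != yi)
                if best is None or err < best[0]:
                    best = (err, thr, pol)
        err, thr, pol = best
        err = max(err, 1e-10)                    # guard against division by zero
        alpha = 0.5 * math.log((1 - err) / err)  # vote weight of this learner
        learners.append((alpha, thr, pol))
        # Re-weight samples: misclassified points gain weight.
        w = [wi * math.exp(-alpha * yi * pol * (1 if xi >= thr else -1))
             for xi, yi, wi in zip(X, y, w)]
        z = sum(w)
        w = [wi / z for wi in w]  # normalize back to a distribution
    return learners

def adaboost_predict(learners, x):
    """Final classification: sign of the weighted vote of all weak learners."""
    s = sum(alpha * pol * (1 if x >= thr else -1)
            for alpha, thr, pol in learners)
    return 1 if s >= 0 else -1

# Toy data: positive class clustered on the right.
X = [0.1, 0.2, 0.3, 0.6, 0.7, 0.9]
y = [-1, -1, -1, 1, 1, 1]
clf = adaboost_train(X, y, rounds=5)
print([adaboost_predict(clf, x) for x in X])  # → [-1, -1, -1, 1, 1, 1]
```

In practice the weak learners can be anything slightly better than chance; the same weighting scheme applies whether the stumps above are replaced by random projections, Gaussians, or Haar-like features as in the lecture.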
