Lecture

Advanced Machine Learning: Boosting

Description

This lecture covers the concept of weak learners in boosting, where simple models are tailored to the problem at hand. It explains the AdaBoost algorithm: how sample weights are updated after each round, and how the final classification is made by a weighted vote of the weak learners. The drawbacks of boosting, such as overfitting and training time, are discussed. Various simple weak learners are presented, including random projections and full-covariance Gaussians. The lecture also explores variants of boosting that aim to reduce overfitting and increase robustness to noise. Finally, it delves into improving AdaBoost by changing the error representation and testing methods, as well as cascades of weak classifiers, with the celebrated Viola-Jones face detector based on Haar-like wavelets as an example.
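The weight-update and weighted-vote steps described above can be sketched as follows. This is a minimal illustrative implementation, not the lecture's own code: it assumes labels in {-1, +1}, uses decision stumps as the weak learners (the lecture mentions other choices such as random projections and full-covariance Gaussians), and the function names are hypothetical.

```python
import numpy as np

def train_adaboost(X, y, n_rounds=20):
    """Minimal AdaBoost sketch with decision stumps; y must be in {-1, +1}."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)      # sample weights, initially uniform
    learners = []                # each entry: (feature, threshold, polarity, alpha)
    for _ in range(n_rounds):
        best, best_err = None, np.inf
        # Pick the stump minimising the weighted training error.
        for j in range(d):
            for t in np.unique(X[:, j]):
                for s in (1, -1):
                    pred = s * np.where(X[:, j] <= t, 1, -1)
                    err = w[pred != y].sum()
                    if err < best_err:
                        best_err, best = err, (j, t, s)
        err = max(best_err, 1e-12)               # avoid division by zero
        alpha = 0.5 * np.log((1 - err) / err)    # weight of this weak learner
        j, t, s = best
        pred = s * np.where(X[:, j] <= t, 1, -1)
        w *= np.exp(-alpha * y * pred)           # up-weight misclassified samples
        w /= w.sum()                             # renormalise to a distribution
        learners.append((j, t, s, alpha))
    return learners

def predict_adaboost(learners, X):
    """Final classification: sign of the alpha-weighted vote of all stumps."""
    score = np.zeros(len(X))
    for j, t, s, alpha in learners:
        score += alpha * s * np.where(X[:, j] <= t, 1, -1)
    return np.sign(score)
```

Up-weighting the samples the current learner got wrong forces later rounds to focus on the hard cases, which is the mechanism behind both boosting's power and its sensitivity to label noise mentioned above.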

