Lecture

Decision Trees and Boosting

Description

This lecture covers decision trees as a flexible machine learning method, presenting them as a formalization of everyday decision-making strategies. It explains what decision trees are, how they are constructed, and how they are applied to classification and regression tasks. The lecture then introduces boosting as a technique for combining multiple predictors sequentially, focusing on the AdaBoost and Gradient Boosting algorithms. It discusses the advantages and limitations of decision trees, including their ability to handle mixed data types and their tendency to overfit. The presentation includes examples, algorithms, and practical implementations using Python libraries such as scikit-learn.
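
The lecture's own notebooks are not reproduced on this page. As a minimal illustrative sketch of the topics listed above, the snippet below fits a single decision tree, AdaBoost, and gradient boosting with scikit-learn on a built-in toy dataset; the dataset choice, hyperparameters, and models are assumptions for illustration, not taken from the lecture.

```python
# Illustrative sketch only: compare a single decision tree with two boosting
# ensembles (AdaBoost and gradient boosting) on a toy dataset.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import AdaBoostClassifier, GradientBoostingClassifier

# Toy binary classification data; the lecture's own examples may differ.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = {
    # A single tree: flexible and handles mixed feature types, but prone to overfitting.
    "decision tree": DecisionTreeClassifier(max_depth=3, random_state=0),
    # AdaBoost: sequentially reweights misclassified samples and combines weak trees.
    "AdaBoost": AdaBoostClassifier(n_estimators=100, random_state=0),
    # Gradient boosting: each new tree fits the loss gradient of the current ensemble.
    "gradient boosting": GradientBoostingClassifier(n_estimators=100, random_state=0),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    print(f"{name}: test accuracy = {model.score(X_test, y_test):.3f}")
```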

This video is available exclusively on Mediaspace for a restricted audience. Please log in to Mediaspace to access it if you have the necessary permissions.

Watch on Mediaspace