This lecture covers supervised learning with a focus on regression methods. It starts with linear regression, model fitting, and numerical issues, then moves on to model selection, performance evaluation, and regularization techniques such as ridge regression and the Lasso. The lecture also discusses subset selection, cross-validation, and the bias-variance trade-off. Finally, it explores regression trees, random forests, boosting, and their applications in predictive modeling.
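As an illustrative sketch only (not taken from the lecture), the snippet below shows how the regularization techniques mentioned above might be fit in practice with scikit-learn: ordinary least squares compared against ridge regression and the Lasso, with the regularization strength chosen by cross-validation on synthetic data.

```python
# Illustrative sketch (not from the lecture): comparing OLS, ridge, and Lasso
# with cross-validated regularization strength, using scikit-learn.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression, RidgeCV, LassoCV
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# Synthetic data: 200 samples, 30 features, only 5 truly informative.
X, y = make_regression(n_samples=200, n_features=30, n_informative=5,
                       noise=10.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = {
    "ols": LinearRegression(),
    "ridge": RidgeCV(alphas=np.logspace(-3, 3, 13)),  # L2 penalty, alpha chosen by CV
    "lasso": LassoCV(cv=5, random_state=0),           # L1 penalty, alpha chosen by 5-fold CV
}

for name, model in models.items():
    model.fit(X_train, y_train)
    mse = mean_squared_error(y_test, model.predict(X_test))
    print(f"{name}: test MSE = {mse:.1f}")
```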
This video is available exclusively on MediaSpace for a restricted audience. If you have the necessary permissions, please log in to MediaSpace to access it.
Watch on MediaSpace