Decision Trees and Boosting - Explores decision trees in machine learning, their flexibility, and impurity criteria, and introduces boosting methods such as AdaBoost.
AdaBoost: Decision Stumps - Explores AdaBoost with decision stumps, discussing error rules, stump selection, and the need for a bias term (a minimal AdaBoost-with-stumps sketch follows this list).
Ensemble Methods: Random Forest - Explores random forests as a powerful ensemble method for classification, discussing bagging, stacking, boosting, and sampling strategies (a bagging sketch in the spirit of a random forest also follows this list).
Boosting: AdaBoost Algorithm - Covers boosting with a focus on the AdaBoost algorithm, forward stagewise additive modeling, and gradient tree boosting.
Ensemble Methods: Random Forests - Covers ensemble methods such as random forests and Gaussian Naive Bayes, explaining how random forests improve prediction accuracy and how Gaussian Naive Bayes estimates class-conditional Gaussian distributions.
Nonlinear Supervised Learning - Explores the inductive bias of different nonlinear supervised learning methods and the challenges of hyper-parameter tuning.
Decision Trees and Boosting - Introduces decision trees as a method for machine learning and explains boosting techniques for combining predictors.
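
To make the boosting entries above concrete, here is a minimal AdaBoost sketch with decision stumps as the weak learners. It is an illustrative from-scratch example, not the code used in these lectures; the function names (fit_stump, adaboost, predict) and the {-1, +1} label convention are assumptions made for the sketch.

```python
# Minimal AdaBoost with decision stumps (illustrative sketch, assumes y in {-1, +1}).
import numpy as np

def fit_stump(X, y, w):
    """Pick the (feature, threshold, polarity) stump with the lowest weighted error."""
    n, d = X.shape
    best = (0, 0.0, 1, np.inf)  # feature index, threshold, polarity, weighted error
    for j in range(d):
        for thr in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
                err = np.sum(w[pred != y])
                if err < best[3]:
                    best = (j, thr, pol, err)
    return best

def adaboost(X, y, n_rounds=50):
    n = len(y)
    w = np.full(n, 1.0 / n)                 # start with uniform sample weights
    stumps, alphas = [], []
    for _ in range(n_rounds):
        j, thr, pol, err = fit_stump(X, y, w)
        err = max(err, 1e-12)               # guard against division by zero
        alpha = 0.5 * np.log((1 - err) / err)   # weight of this stump in the ensemble
        pred = np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
        w *= np.exp(-alpha * y * pred)      # up-weight misclassified points
        w /= w.sum()
        stumps.append((j, thr, pol))
        alphas.append(alpha)
    return stumps, alphas

def predict(X, stumps, alphas):
    """Sign of the alpha-weighted vote of all stumps."""
    score = np.zeros(len(X))
    for (j, thr, pol), alpha in zip(stumps, alphas):
        score += alpha * np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
    return np.sign(score)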
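Similarly, a minimal bagging sketch in the spirit of a random forest: each tree is fit on a bootstrap sample and restricted to a random feature subset at every split, and predictions are combined by majority vote. This is a sketch under assumed conventions (non-negative integer class labels, scikit-learn decision trees, untuned hyper-parameters), not a definitive implementation.

```python
# Bagged decision trees with per-split feature subsampling (random-forest-style sketch).
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def fit_forest(X, y, n_trees=100, seed=0):
    rng = np.random.default_rng(seed)
    trees = []
    for _ in range(n_trees):
        idx = rng.integers(0, len(X), size=len(X))      # bootstrap sample (with replacement)
        tree = DecisionTreeClassifier(
            max_features="sqrt",                        # random feature subset at each split
            random_state=int(rng.integers(1 << 31)),
        )
        tree.fit(X[idx], y[idx])
        trees.append(tree)
    return trees

def predict_forest(trees, X):
    # Majority vote over the individual tree predictions
    # (assumes class labels are non-negative integers).
    votes = np.stack([t.predict(X) for t in trees])
    return np.apply_along_axis(
        lambda col: np.bincount(col.astype(int)).argmax(), axis=0, arr=votes
    )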