Covers random forests and Gaussian Naive Bayes, explaining how ensemble methods such as random forests improve prediction accuracy and how Gaussian Naive Bayes estimates class-conditional Gaussian distributions.
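As a minimal sketch of how these two model families are used in practice, the snippet below fits a random forest and a Gaussian Naive Bayes classifier on the same data split and compares their test accuracy. It assumes scikit-learn and its bundled iris dataset, neither of which is named in the original text.

```python
# Sketch only: compares a random forest ensemble with Gaussian Naive Bayes.
# Assumes scikit-learn; the dataset and hyperparameters are illustrative.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

# Random forest: an ensemble of decision trees trained on bootstrap samples.
forest = RandomForestClassifier(n_estimators=100, random_state=0)
forest.fit(X_train, y_train)

# Gaussian Naive Bayes: models each feature with a per-class Gaussian.
gnb = GaussianNB()
gnb.fit(X_train, y_train)

print("random forest accuracy:", forest.score(X_test, y_test))
print("Gaussian NB accuracy:  ", gnb.score(X_test, y_test))
```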
Introduces decision trees for classification, covering entropy, split quality, Gini index, advantages, disadvantages, and the random forest classifier.
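To make the split-quality measures concrete, here is a small illustrative sketch that computes entropy, the Gini index, and the information gain of one candidate split using NumPy. The feature values, labels, and threshold are invented for the example and are not taken from the text.

```python
# Sketch only: entropy, Gini index, and information gain for one split.
import numpy as np

def entropy(y):
    # H(y) = -sum_k p_k * log2(p_k) over class proportions p_k
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def gini(y):
    # G(y) = 1 - sum_k p_k^2
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def information_gain(y, mask):
    # Entropy of the parent node minus the weighted entropy of the children.
    n = len(y)
    left, right = y[mask], y[~mask]
    weighted = (len(left) / n) * entropy(left) + (len(right) / n) * entropy(right)
    return entropy(y) - weighted

# Toy example: split binary-labelled samples on a feature threshold.
feature = np.array([2.0, 3.5, 1.0, 4.2, 3.8, 0.5])
labels = np.array([0, 1, 0, 1, 1, 0])
mask = feature < 3.0

print("entropy before split:", entropy(labels))
print("Gini index before split:", gini(labels))
print("information gain of split:", information_gain(labels, mask))
```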
Explores gradient descent methods for training artificial neural networks, covering supervised learning, single-layer networks, and modern gradient descent update rules.
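The sketch below illustrates the basic gradient descent update rule on a single-layer network with one sigmoid output unit, trained in a supervised fashion on toy data. The data, learning rate, and epoch count are assumptions made for illustration only.

```python
# Sketch only: gradient descent for a single-layer (one sigmoid unit) network.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy supervised data: 2-D inputs X and binary targets t (illustrative).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
t = (X[:, 0] + X[:, 1] > 0).astype(float)

w = np.zeros(2)   # weights
b = 0.0           # bias
lr = 0.1          # learning rate (step size of the update rule)

for epoch in range(200):
    y = sigmoid(X @ w + b)           # forward pass
    error = y - t                    # gradient of cross-entropy w.r.t. pre-activation
    grad_w = X.T @ error / len(t)    # gradient w.r.t. weights
    grad_b = error.mean()            # gradient w.r.t. bias
    w -= lr * grad_w                 # gradient descent update rule
    b -= lr * grad_b

accuracy = ((sigmoid(X @ w + b) > 0.5) == t).mean()
print("training accuracy:", accuracy)
```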