Explores optimization methods for training machine learning models, including gradient descent, subgradient methods, and adaptive techniques such as Adam.
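As a rough illustration of the kind of update rule covered here, the following is a minimal NumPy sketch of a single Adam step applied to a toy quadratic objective; the function name `adam_step`, the hyperparameter defaults, and the toy objective are illustrative assumptions, not code from the material itself.

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: moment estimates, bias correction, parameter step."""
    m = beta1 * m + (1 - beta1) * grad        # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2   # second-moment estimate
    m_hat = m / (1 - beta1 ** t)              # bias-corrected moments
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Toy example: minimize f(theta) = ||theta||^2, whose gradient is 2 * theta.
theta = np.array([3.0, -2.0])
m = np.zeros_like(theta)
v = np.zeros_like(theta)
for t in range(1, 2001):
    grad = 2 * theta
    theta, m, v = adam_step(theta, grad, m, v, t, lr=0.05)
print(theta)  # approaches [0, 0]
```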
Introduces kernel methods for classification and regression, covering the margin, support vector machines, the curse of dimensionality, and Gaussian process regression.
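A brief scikit-learn sketch of the two kernel methods named above, assuming a synthetic dataset; the RBF kernel choice, noise level, and toy data are assumptions made for illustration rather than settings prescribed by the material.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Gaussian process regression on noisy samples of sin(x) with an RBF kernel.
X = rng.uniform(0, 2 * np.pi, size=(30, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(30)
kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=0.1)
gpr = GaussianProcessRegressor(kernel=kernel).fit(X, y)
X_test = np.linspace(0, 2 * np.pi, 5).reshape(-1, 1)
mean, std = gpr.predict(X_test, return_std=True)  # posterior mean and uncertainty

# Kernel SVM: the margin is maximized in the feature space induced by the RBF kernel,
# so a boundary that is non-linear in the input space becomes linear there.
X_cls = rng.standard_normal((40, 2))
y_cls = (X_cls[:, 0] ** 2 + X_cls[:, 1] ** 2 > 1.0).astype(int)
clf = SVC(kernel="rbf", C=1.0).fit(X_cls, y_cls)
print(mean.round(2), clf.score(X_cls, y_cls))
```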
Explores logistic regression fundamentals, including cost functions, regularization, and decision boundaries, with practical examples using scikit-learn.
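Since scikit-learn is used here, a minimal example of fitting a regularized logistic regression and reading off its decision boundary follows; the synthetic dataset and the specific `C` value are assumptions for illustration, not the examples from the original material.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Two-class problem; C is the inverse regularization strength
# (smaller C means a stronger L2 penalty on the weights).
X, y = make_classification(n_samples=200, n_features=2, n_redundant=0,
                           n_informative=2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = LogisticRegression(penalty="l2", C=1.0).fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))

# The decision boundary is where the predicted probability equals 0.5,
# i.e. w . x + b = 0 for the learned coefficients.
w, b = clf.coef_[0], clf.intercept_[0]
print("boundary: %.2f*x1 + %.2f*x2 + %.2f = 0" % (w[0], w[1], b))
```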