Explores optimization methods for training machine learning models, including gradient descent, subgradient methods, and the Adam optimizer.
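A minimal sketch (assuming NumPy) contrasting plain gradient descent with Adam on a least-squares objective; the data, step sizes, and the helper name `loss_grad` are illustrative placeholders, not the original notebook's code.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=100)

def loss_grad(w):
    # Gradient of the mean squared error (1/n) * ||Xw - y||^2
    return 2.0 * X.T @ (X @ w - y) / len(y)

# Plain gradient descent: w <- w - lr * grad
w_gd = np.zeros(3)
lr = 0.1
for _ in range(500):
    w_gd -= lr * loss_grad(w_gd)

# Adam: per-coordinate step sizes from running moment estimates
w_adam = np.zeros(3)
m = np.zeros(3)   # first-moment (mean) estimate
v = np.zeros(3)   # second-moment (uncentered variance) estimate
beta1, beta2, eps = 0.9, 0.999, 1e-8
for t in range(1, 501):
    g = loss_grad(w_adam)
    m = beta1 * m + (1 - beta1) * g
    v = beta2 * v + (1 - beta2) * g**2
    m_hat = m / (1 - beta1**t)          # bias correction
    v_hat = v / (1 - beta2**t)
    w_adam -= lr * m_hat / (np.sqrt(v_hat) + eps)

print("gradient descent:", w_gd.round(3))
print("Adam:            ", w_adam.round(3))
```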
Explores Ridge and Lasso regression as regularization techniques for machine learning models, with an emphasis on hyperparameter tuning and visualizing the fitted coefficients.
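A minimal sketch (assuming scikit-learn) of tuning the regularization strength `alpha` for Ridge and Lasso and inspecting the resulting coefficients; the diabetes dataset and the alpha grid are stand-ins, and the notebook presumably plots the coefficients rather than printing them.

```python
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.linear_model import Ridge, Lasso
from sklearn.model_selection import GridSearchCV

X, y = load_diabetes(return_X_y=True)
param_grid = {"alpha": np.logspace(-3, 2, 20)}  # candidate regularization strengths

for name, model in [("Ridge", Ridge()), ("Lasso", Lasso(max_iter=10_000))]:
    # Cross-validated search over alpha (hyperparameter tuning)
    search = GridSearchCV(model, param_grid, cv=5, scoring="neg_mean_squared_error")
    search.fit(X, y)
    best = search.best_estimator_
    print(f"{name}: best alpha = {search.best_params_['alpha']:.4g}")
    # Lasso drives some coefficients exactly to zero; Ridge only shrinks them
    print("  coefficients:", np.round(best.coef_, 2))
```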