Discusses optimization techniques in machine learning, focusing on stochastic gradient descent and its application to constrained and non-convex problems.
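A minimal sketch of plain stochastic gradient descent on an unconstrained least-squares objective (the constrained and non-convex settings mentioned above would add projection steps or different objectives); the synthetic data, learning rate, and epoch count are illustrative assumptions, not taken from the text:

```python
import numpy as np

# Synthetic linear-regression data (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=200)

# Stochastic gradient descent on the mean-squared-error loss,
# one randomly ordered sample per update.
w = np.zeros(3)
learning_rate = 0.05
n_epochs = 20
for epoch in range(n_epochs):
    for i in rng.permutation(len(X)):
        grad = 2.0 * (X[i] @ w - y[i]) * X[i]  # gradient of (x_i.w - y_i)^2
        w -= learning_rate * grad

print(w)  # should land close to true_w
```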
Covers quantile regression, casting output prediction as a linear optimization problem, and discusses sensitivity to outliers, problem formulation, and practical implementation.
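A minimal sketch of the linear-programming formulation of quantile regression (pinball loss split into positive and negative residuals), solved here with scipy.optimize.linprog; the synthetic data and the choice tau = 0.5 are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import linprog

# Synthetic data (illustrative): y depends linearly on one feature.
rng = np.random.default_rng(0)
n, p = 100, 2
X = np.column_stack([np.ones(n), rng.uniform(0, 10, n)])  # intercept + feature
y = 3.0 + 0.8 * X[:, 1] + rng.normal(0, 1, n)
tau = 0.5  # median regression; any quantile in (0, 1) works

# LP formulation of the pinball loss:
#   minimize  tau * sum(u) + (1 - tau) * sum(v)
#   subject to X @ beta + u - v = y,  u >= 0, v >= 0, beta free
c = np.concatenate([np.zeros(p), tau * np.ones(n), (1 - tau) * np.ones(n)])
A_eq = np.hstack([X, np.eye(n), -np.eye(n)])
b_eq = y
bounds = [(None, None)] * p + [(0, None)] * (2 * n)

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
beta = res.x[:p]
print(beta)  # estimated intercept and slope for the tau-quantile
```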
Covers the Branch & Bound algorithm for efficiently exploring the set of feasible solutions and discusses LP relaxation, portfolio optimization, nonlinear programming, and other classes of optimization problems.
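A minimal sketch of depth-first Branch & Bound on a toy two-variable integer program, using the LP relaxation at each node (solved with scipy.optimize.linprog) to prune branches; the problem data and the helper name branch_and_bound are illustrative assumptions:

```python
import numpy as np
from scipy.optimize import linprog

# Toy integer program (illustrative): maximize 5*x1 + 4*x2
# subject to 6*x1 + 4*x2 <= 24, x1 + 2*x2 <= 6, x >= 0 and integer.
c = np.array([-5.0, -4.0])            # negated: linprog minimizes
A_ub = np.array([[6.0, 4.0], [1.0, 2.0]])
b_ub = np.array([24.0, 6.0])

def branch_and_bound(bounds, best=(-np.inf, None)):
    """Depth-first branch & bound using the LP relaxation as a bound."""
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    if not res.success:
        return best                      # infeasible node: prune
    value = -res.fun                     # relaxation objective (upper bound)
    if value <= best[0]:
        return best                      # bound cannot beat incumbent: prune
    x = res.x
    frac = np.argmax(np.abs(x - np.round(x)))
    if abs(x[frac] - round(x[frac])) < 1e-6:
        return (value, np.round(x))      # integral solution: new incumbent
    # Branch on the most fractional variable: floor and ceil subproblems.
    lo, hi = bounds[frac]
    down = list(bounds); down[frac] = (lo, np.floor(x[frac]))
    up = list(bounds);   up[frac] = (np.ceil(x[frac]), hi)
    best = branch_and_bound(down, best)
    best = branch_and_bound(up, best)
    return best

value, x = branch_and_bound([(0, None), (0, None)])
print(value, x)   # optimal value and integer solution of the toy problem
```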