Covers optimization basics, including metrics, norms, convexity, gradients, and logistic regression, with a focus on strong convexity and convergence rates.
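As a brief illustration of the convergence-rate material (a sketch under standard assumptions, not a result stated in the section itself): for an L-smooth, μ-strongly convex objective, gradient descent with step size 1/L contracts the suboptimality geometrically.

```latex
% Standard definitions assumed here:
%   strong convexity: f(y) \ge f(x) + \langle \nabla f(x), y-x \rangle + \tfrac{\mu}{2}\|y-x\|^2
%   smoothness:       f(y) \le f(x) + \langle \nabla f(x), y-x \rangle + \tfrac{L}{2}\|y-x\|^2
% Gradient descent: x_{t+1} = x_t - \tfrac{1}{L}\nabla f(x_t)
% Linear (geometric) convergence under these assumptions:
f(x_t) - f(x^\star) \;\le\; \Bigl(1 - \tfrac{\mu}{L}\Bigr)^{t}\,\bigl(f(x_0) - f(x^\star)\bigr)
```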
Discusses Stochastic Gradient Descent and its application in non-convex optimization, focusing on convergence rates and challenges in machine learning.
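A minimal SGD sketch, using the logistic loss from the previous item for concreteness (the function name, step-size schedule, and synthetic data are illustrative assumptions, not taken from the source):

```python
import numpy as np

def sgd_logistic_regression(X, y, n_epochs=20, lr0=0.1, seed=0):
    """Plain SGD on the logistic loss; labels y in {0, 1}."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    t = 0
    for _ in range(n_epochs):
        for i in rng.permutation(n):
            t += 1
            lr = lr0 / np.sqrt(t)                  # decaying step size, a common choice
            p = 1.0 / (1.0 + np.exp(-X[i] @ w))    # predicted probability
            grad = (p - y[i]) * X[i]               # stochastic gradient of the log loss
            w -= lr * grad
    return w

# Tiny synthetic example (assumed data, for illustration only)
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
w_true = np.array([1.5, -2.0, 0.5])
y = (rng.random(200) < 1.0 / (1.0 + np.exp(-X @ w_true))).astype(float)
print("estimated weights:", sgd_logistic_regression(X, y))
```

The same update loop applies to non-convex objectives (e.g. neural-network losses) by swapping in a different stochastic gradient, though the convergence guarantees then weaken to stationary-point results.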