Explores stochastic gradient descent with averaging, compares it with gradient descent, and discusses challenges in non-convex optimization and sparse recovery techniques.
Covers optimization basics, including metrics, norms, convexity, gradients, and logistic regression, with a focus on strong convexity and convergence rates.
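The comparison between gradient descent and averaged SGD mentioned above can be illustrated on a small strongly convex least-squares problem. This is a hedged sketch, not the notes' own example: the problem setup, step-size schedules, and iteration counts are all assumptions chosen to make the behavior visible.

```python
import numpy as np

# Toy strongly convex objective: f(x) = (1/2n) * ||A x - b||^2.
# Data, step sizes, and iteration counts are illustrative choices.
rng = np.random.default_rng(0)
n, d = 200, 5
A = rng.normal(size=(n, d))
x_star = rng.normal(size=d)
b = A @ x_star  # noiseless target, so x_star is the minimizer

def full_grad(x):
    """Exact gradient of f, using all n samples."""
    return A.T @ (A @ x - b) / n

# --- Gradient descent: full gradient, constant step size ---
x_gd = np.zeros(d)
for _ in range(500):
    x_gd -= 0.1 * full_grad(x_gd)

# --- SGD with averaging (Polyak-Ruppert): one random sample per
# step, O(1/t) decaying step size, report the running average ---
x = np.zeros(d)
x_avg = np.zeros(d)
for t in range(1, 20001):
    i = rng.integers(n)
    g = A[i] * (A[i] @ x - b[i])      # stochastic gradient estimate
    x -= g / (0.5 * (t + 20))         # decaying step size
    x_avg += (x - x_avg) / t          # running average of iterates

print(np.linalg.norm(x_gd - x_star))   # GD error
print(np.linalg.norm(x_avg - x_star))  # averaged-SGD error
```

The averaged iterate smooths out the noise of the individual SGD steps; on strongly convex problems this averaging is what recovers the optimal statistical convergence rate discussed in the notes.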