Provides a review of linear algebra concepts crucial for convex optimization, covering topics such as vector norms, eigenvalues, and positive semidefinite matrices.
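The concepts listed above can be sketched concretely; a minimal NumPy illustration (the matrix and vector here are hypothetical examples, not from the source) computes two standard vector norms and tests positive semidefiniteness via eigenvalues:

```python
import numpy as np

# Hypothetical example matrix and vector for illustration only.
A = np.array([[2.0, -1.0],
              [-1.0, 2.0]])
x = np.array([3.0, -4.0])

# Vector norms: Euclidean (l2) and max (l-infinity).
l2 = np.linalg.norm(x)             # sqrt(3^2 + 4^2) = 5.0
linf = np.linalg.norm(x, np.inf)   # max(|3|, |-4|) = 4.0

# A symmetric matrix is positive semidefinite iff all its
# eigenvalues are non-negative; eigvalsh is the symmetric/
# Hermitian eigenvalue routine.
eigvals = np.linalg.eigvalsh(A)
is_psd = bool(np.all(eigvals >= 0))
```

The eigenvalue test is the standard characterization of positive semidefiniteness for symmetric matrices; for the sample matrix above the eigenvalues are 1 and 3, so the check succeeds.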
Explores gradient descent methods for smooth convex and non-convex problems, covering iterative update schemes, convergence-rate guarantees, and the difficulties that arise in the non-convex setting.
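A minimal sketch of fixed-step gradient descent on a smooth convex problem (the quadratic objective and its data below are illustrative assumptions, not from the source): for f(x) = ½ xᵀAx − bᵀx with A positive definite, the step size 1/L, where L is the largest eigenvalue of A (the Lipschitz constant of the gradient), guarantees convergence to the minimizer A⁻¹b.

```python
import numpy as np

# Hypothetical smooth, strongly convex quadratic:
#   f(x) = 0.5 * x^T A x - b^T x,  gradient: A x - b.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, -1.0])

# Step size 1/L with L = largest eigenvalue of A, the
# Lipschitz constant of the gradient.
L = np.linalg.eigvalsh(A).max()

x = np.zeros(2)
for _ in range(500):
    grad = A @ x - b            # gradient of f at x
    x = x - (1.0 / L) * grad    # fixed-step gradient-descent update

x_star = np.linalg.solve(A, b)  # closed-form minimizer for comparison
```

Because this objective is strongly convex, the iterates converge linearly; for general smooth convex functions the same fixed-step scheme gives the slower O(1/k) rate in objective value.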