Explores gradient descent methods for smooth convex and non-convex problems, covering iterative strategies, convergence rates, and challenges in optimization.
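A minimal sketch of the iteration the summary refers to, fixed-step gradient descent x_{k+1} = x_k - α∇f(x_k); the quadratic objective, step size, and iteration count below are illustrative choices, not taken from the lecture:

```python
import numpy as np

def gradient_descent(grad, x0, step=0.1, iters=100):
    """Fixed-step gradient descent: x_{k+1} = x_k - step * grad(x_k)."""
    x = x0
    for _ in range(iters):
        x = x - step * grad(x)
    return x

# Illustrative example: f(x) = 0.5 * ||x - c||^2 is smooth and convex,
# with gradient x - c and unique minimizer c.
c = np.array([1.0, -2.0])
x_star = gradient_descent(lambda x: x - c, np.zeros(2))
```

On this strongly convex quadratic the error contracts by a constant factor per step; for non-convex problems the same iteration only guarantees convergence to a stationary point, which is one of the challenges the lecture covers.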
Explores geodesic convexity, the extension of convexity to optimization on manifolds, emphasizing that it preserves the key property that local minima are global minima.
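For reference (the definition is standard, though not spelled out in the summary), a function f on a Riemannian manifold is geodesically convex when it is convex along every geodesic segment γ:

```latex
f(\gamma(t)) \;\le\; (1-t)\, f(\gamma(0)) + t\, f(\gamma(1)),
\qquad t \in [0,1].
```

Exactly as in the Euclidean case, this inequality is what rules out spurious local minima: any local minimizer of a geodesically convex function is a global minimizer.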
Introduces Manopt, a toolbox for optimization on smooth manifolds with a Riemannian structure, covering cost functions, different types of manifolds, and optimization principles.
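Manopt itself is a MATLAB toolbox, so rather than guess at its API, here is a hand-rolled sketch of the optimization principle it automates: Riemannian gradient descent on the unit sphere, where each step projects the Euclidean gradient onto the tangent space and retracts back to the manifold. The matrix A, step size, and iteration count are illustrative assumptions:

```python
import numpy as np

def sphere_gradient_descent(egrad, x0, step=0.1, iters=500):
    """Riemannian gradient descent on the unit sphere.

    egrad: Euclidean gradient of the cost function.
    Each step projects egrad onto the tangent space at x, takes a
    gradient step, and retracts to the sphere by normalization.
    """
    x = x0 / np.linalg.norm(x0)
    for _ in range(iters):
        g = egrad(x)
        rgrad = g - (x @ g) * x        # project onto tangent space at x
        x = x - step * rgrad           # gradient step in the tangent space
        x = x / np.linalg.norm(x)      # retract back onto the sphere
    return x

# Illustrative cost: minimizing f(x) = -x^T A x over the unit sphere
# recovers a leading eigenvector of the symmetric matrix A.
A = np.diag([3.0, 1.0, 0.5])
x = sphere_gradient_descent(lambda x: -2.0 * A @ x, np.array([1.0, 1.0, 1.0]))
```

In Manopt the projection and retraction come packaged with each manifold object, so the user supplies only the cost function and picks a solver; the loop above is what such a solver does internally in its simplest form.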