This lecture covers Riemannian gradient descent, focusing on Taylor expansions, first-order optimality conditions, the algorithm template, line search, sufficient decrease, regularity conditions, and critical points. The instructor explains the pullback of functions through retractions, the definition of local minimizers, and the necessary conditions for a point to be a critical (stationary) point. The lecture concludes with a convergence theorem for the sequence produced by Riemannian gradient descent under specific assumptions.
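As a concrete illustration of the algorithm template, line search, and sufficient-decrease condition mentioned above, here is a minimal sketch of Riemannian gradient descent with Armijo backtracking on the unit sphere. The cost function, retraction choice, and parameter values (`alpha0`, `beta`, `c`, `tol`) are illustrative assumptions, not taken from the lecture itself.

```python
import numpy as np

def retraction(x, v):
    """Metric projection retraction on the sphere: R_x(v) = (x + v) / ||x + v||."""
    y = x + v
    return y / np.linalg.norm(y)

def riemannian_gradient(x, euclidean_grad):
    """Project the Euclidean gradient onto the tangent space at x on the sphere."""
    return euclidean_grad - np.dot(euclidean_grad, x) * x

def rgd(f, egrad, x0, alpha0=1.0, beta=0.5, c=1e-4, tol=1e-8, max_iter=500):
    """Riemannian gradient descent: x_{k+1} = R_{x_k}(-alpha_k * grad f(x_k))."""
    x = x0 / np.linalg.norm(x0)
    for _ in range(max_iter):
        g = riemannian_gradient(x, egrad(x))
        gnorm2 = np.dot(g, g)
        if np.sqrt(gnorm2) < tol:  # approximate critical point reached
            break
        alpha = alpha0
        # Backtrack until the Armijo sufficient-decrease condition holds:
        # f(R_x(-alpha * g)) <= f(x) - c * alpha * ||grad f(x)||^2
        while f(retraction(x, -alpha * g)) > f(x) - c * alpha * gnorm2:
            alpha *= beta
        x = retraction(x, -alpha * g)
    return x

# Example: minimize f(x) = x^T A x on the sphere; the minimizer is the
# eigenvector of A associated with its smallest eigenvalue.
A = np.diag([3.0, 2.0, 1.0])
f = lambda x: x @ A @ x
egrad = lambda x: 2 * A @ x
x_star = rgd(f, egrad, np.array([1.0, 1.0, 1.0]))
print(x_star)  # approximately [0, 0, 1] up to sign
```

In this sketch the stopping test on the Riemannian gradient norm plays the role of detecting an approximate critical point, and the backtracking loop enforces the sufficient-decrease requirement that the convergence theorem relies on.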