This lecture covers the choice of step size in optimization on manifolds, focusing on a Lipschitz-like assumption involving the objective function and the retraction. It discusses the backtracking line-search method and the sufficient-decrease condition, emphasizing that the relevant Lipschitz constant is typically unknown in practice. The instructor presents Armijo backtracking for gradient descent and gives an algorithm for selecting the step size. The lecture concludes with a theorem showing that, under the stated assumptions, the line search guarantees sufficient decrease at each iteration.
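To make the procedure concrete, here is a minimal sketch of Armijo backtracking for gradient descent with a retraction, as described above. The interface is an assumption: retract(x, v) is taken to return the retracted point, and the parameters t0, c, and tau are illustrative defaults, not values prescribed in the lecture. The loop shrinks the trial step t until f(R_x(-t grad f(x))) <= f(x) - c * t * ||grad f(x)||^2, so no Lipschitz constant needs to be known in advance.

```python
import numpy as np

def armijo_backtracking(f, grad_f, retract, x, t0=1.0, c=1e-4, tau=0.5, max_iters=50):
    """Armijo backtracking line search for gradient descent with a retraction.

    Shrinks the trial step t until the sufficient-decrease condition
        f(retract(x, -t * grad_f(x))) <= f(x) - c * t * ||grad_f(x)||^2
    holds. Parameter names and defaults are illustrative assumptions.
    """
    g = grad_f(x)                       # gradient at the current point
    fx = f(x)
    g_norm_sq = np.dot(g, g)            # squared gradient norm
    t = t0
    for _ in range(max_iters):
        candidate = retract(x, -t * g)  # step along -grad f via the retraction
        if f(candidate) <= fx - c * t * g_norm_sq:
            return t, candidate         # sufficient decrease achieved
        t *= tau                        # otherwise shrink the step size
    return t, retract(x, -t * g)        # fall back to the last trial step

# Example: with the trivial retraction R_x(v) = x + v on R^n, this reduces
# to classical Armijo backtracking for f(x) = ||x||^2.
t, x_next = armijo_backtracking(
    f=lambda x: np.dot(x, x),
    grad_f=lambda x: 2 * x,
    retract=lambda x, v: x + v,
    x=np.array([3.0, -4.0]),
)
```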