We consider minimizing a nonconvex, smooth function $f$ on a Riemannian manifold $\mathcal{M}$. We show that a perturbed version of the Riemannian gradient descent algorithm converges to a second-order stationary point (and hence is able to escape saddle points on the manifold). The rate of convergence depends as $1/\epsilon^2$ on the accuracy $\epsilon$, which matches a rate known only for unconstrained smooth minimization. The convergence rate depends polylogarithmically on the manifold dimension $d$, and hence is almost dimension-free. The rate also has a polynomial dependence on the parameters describing the curvature of the manifold and the smoothness of the function. While the unconstrained problem (Euclidean setting) is well studied, our result is the first to prove such a rate for nonconvex, manifold-constrained problems.
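To make the idea concrete, here is a minimal sketch (not the paper's exact algorithm or step-size schedule, and all constants are illustrative) of perturbed Riemannian gradient descent on the unit sphere, minimizing the Rayleigh quotient $f(x) = x^\top A x$. The non-extremal eigenvectors of $A$ are saddle points of $f$ on the sphere, and a small tangent-space perturbation lets the iterates escape them:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 10
A = np.diag(np.arange(1.0, d + 1))   # eigenvalues 1..d; global minimizer: e_1

def riem_grad(x):
    g = 2 * A @ x                    # Euclidean gradient of x^T A x
    return g - (x @ g) * x           # project onto the tangent space at x

def retract(x, v):
    y = x + v                        # metric-projection retraction
    return y / np.linalg.norm(y)     # back onto the sphere

x = np.zeros(d); x[-1] = 1.0         # start exactly at a saddle point (e_d)
eta, eps = 0.05, 1e-6                # illustrative step size and tolerance
for _ in range(2000):
    g = riem_grad(x)
    if np.linalg.norm(g) < eps:      # near a stationary point: perturb
        noise = rng.standard_normal(d)
        noise -= (x @ noise) * x     # keep the perturbation tangent
        g = g + 1e-3 * noise
    x = retract(x, -eta * g)

print(x @ A @ x)                     # close to 1.0, the smallest eigenvalue
```

Unperturbed gradient descent started at the saddle $e_d$ would stay there forever (the Riemannian gradient vanishes); the random tangent perturbation is what allows the iterates to slide off and converge to the minimizer.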