Stochastic Composite Least-Squares Regression with Convergence Rate O(1/n)
This article presents an overview of robot learning and adaptive control applications that can benefit from a joint use of Riemannian geometry and probabilistic representations. The roles of Riemannian manifolds, geodesics and parallel transport in robotic ...
We describe the first gradient methods on Riemannian manifolds to achieve accelerated rates in the non-convex case. Under Lipschitz assumptions on the Riemannian gradient and Hessian of the cost function, these methods find approximate first-order critical ...
Numerical continuation in the context of optimization can be used to mitigate convergence issues due to a poor initial guess. In this work, we extend this idea to Riemannian optimization problems, that is, the minimization of a target function on a Riemann ...
We present a strikingly simple proof that two rules are sufficient to automate gradient descent: 1) don’t increase the stepsize too fast and 2) don’t overstep the local curvature. No need for functional values, no line search, no information about the func ...
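The abstract states the two rules but not their formulas. A minimal Python sketch of one plausible instantiation is below; the concrete update formulas (the `sqrt(1 + theta)` growth cap and the `||x_k - x_{k-1}|| / (2 ||g_k - g_{k-1}||)` curvature estimate) and the quadratic test problem are illustrative assumptions, not taken from the abstract:

```python
import numpy as np

def f_grad(x):
    # Assumed quadratic test problem: f(x) = 0.5 * x^T A x, A = diag(1, 10).
    A = np.diag([1.0, 10.0])
    return A @ x

def adaptive_gd(x0, n_iters=1000):
    # Sketch of gradient descent whose stepsize obeys the two rules:
    #   1) don't increase the stepsize too fast: cap growth between iterations;
    #   2) don't overstep the local curvature: bound the stepsize by an
    #      estimate of the inverse local Lipschitz constant of the gradient,
    #      ||x_k - x_{k-1}|| / (2 ||g_k - g_{k-1}||).
    # No function values, no line search: only consecutive gradients are used.
    x_prev = np.asarray(x0, dtype=float)
    g_prev = f_grad(x_prev)
    lam = 1e-6                      # tiny initial stepsize; the rules grow it
    x = x_prev - lam * g_prev
    theta = np.inf                  # stepsize ratio lam_k / lam_{k-1}
    for _ in range(n_iters):
        g = f_grad(x)
        # Rule 2: inverse local-curvature estimate from finite differences.
        denom = 2.0 * np.linalg.norm(g - g_prev)
        curv_bound = np.linalg.norm(x - x_prev) / denom if denom > 0 else np.inf
        # Rule 1: limited growth relative to the previous stepsize.
        growth_bound = np.sqrt(1.0 + theta) * lam
        lam_new = min(growth_bound, curv_bound)
        theta = lam_new / lam
        lam = lam_new
        x_prev, g_prev = x, g
        x = x - lam * g
    return x
```

Note that the stepsize is driven entirely by observed iterates and gradients, consistent with the abstract's claim that no function values or line searches are needed.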
We consider minimizing a nonconvex, smooth function f on a Riemannian manifold M. We show that a perturbed version of the Riemannian gradient descent algorithm converges to a second-order stationary point (and hence is able to escape saddle points on the manif ...
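The snippet does not specify the perturbation scheme; a Euclidean sketch of the general idea (inject a small random perturbation whenever the gradient is nearly zero, so the iterate leaves strict saddle points) is below. The test function, thresholds, and perturbation radius are illustrative assumptions, and no Riemannian retraction is modeled:

```python
import numpy as np

def perturbed_gd(grad, x0, step=0.1, noise_r=1e-2, g_thresh=1e-3,
                 n_iters=300, seed=0):
    # Euclidean analogue of perturbed gradient descent: plain gradient
    # steps, plus a small random jump whenever the gradient is nearly
    # zero (a possible saddle point).
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        g = grad(x)
        if np.linalg.norm(g) < g_thresh:
            u = rng.normal(size=x.shape)
            x = x + noise_r * u / np.linalg.norm(u)  # jump off the saddle
        else:
            x = x - step * g
    return x

def grad_f(z):
    # f(x, y) = (x^2 - 1)^2 + y^2: strict saddle at the origin,
    # minima at (+1, 0) and (-1, 0).
    x, y = z
    return np.array([4.0 * x * (x * x - 1.0), 2.0 * y])
```

Started exactly at the saddle `(0, 0)`, plain gradient descent never moves, while the perturbed variant escapes toward one of the two minima.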
In this paper, we provide a simple pedagogical proof of the existence of covariant renormalizations in Euclidean perturbative quantum field theory on closed Riemannian manifolds, following the Epstein–Glaser philosophy. We rely on a local method that allow ...
Let M be a C²-smooth Riemannian manifold with boundary and N a complete C²-smooth Riemannian manifold. We show that each stationary p-harmonic mapping u: M → N, whose image lies in a compact subset of N, is locally C^{1,α} for some α ∈ ...
We propose an estimator for the mean of a random vector in R^d that can be computed in time O(n^{3.5} + n^2 d) for n i.i.d. samples and that has error bounds matching the sub-Gaussian case. The only assumptions we make about the data distribution are that it has ...
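The snippet does not describe the estimator itself. A classical, simpler relative is median-of-means, sketched below; note this is a swapped-in technique for illustration, not the abstract's estimator — coordinatewise median-of-means is weaker (its error bound carries extra dimension factors), but it is robust to heavy tails and runs in O(nd):

```python
import numpy as np

def median_of_means(X, n_blocks=10):
    # Median-of-means mean estimation: split the n samples into blocks,
    # average within each block, then take the coordinatewise median of
    # the block means. Outlier-heavy blocks are voted down by the median.
    X = np.asarray(X, dtype=float)
    blocks = np.array_split(X, n_blocks)
    means = np.stack([b.mean(axis=0) for b in blocks])
    return np.median(means, axis=0)
```

The number of blocks trades robustness against variance: more blocks tolerate more corrupted or heavy-tailed samples, at the cost of noisier block means.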
The purpose of this thesis is to provide an intrinsic proof of a Gauss-Bonnet-Chern formula for complete Riemannian manifolds with finitely many conical singularities and asymptotically conical ends. A geometric invariant is associated to the link of both ...