Lecture

Linear convergence with Polyak-Łojasiewicz: Mechanical proof

Description

This lecture covers linear convergence under the Polyak-Łojasiewicz condition on a Riemannian manifold, working through a mechanical proof. It states the assumptions required for linear convergence, namely a sufficient-decrease condition on the iterates and the Polyak-Łojasiewicz condition on the cost function.
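The mechanical proof alluded to in the title typically combines the two conditions in a few lines. A hedged sketch of the standard argument follows; the constants c and μ and the exact Riemannian setting are assumptions of this sketch, not quoted from the lecture:

```latex
% Sufficient decrease at each iterate (constant c > 0), e.g. from
% Riemannian gradient descent with a suitable step size:
\[
f(x_{k+1}) \;\le\; f(x_k) - c\,\|\mathrm{grad}\,f(x_k)\|^2 .
\]
% Polyak-Lojasiewicz condition (constant \mu > 0, minimum value f^*):
\[
\tfrac{1}{2}\,\|\mathrm{grad}\,f(x)\|^2 \;\ge\; \mu\,\bigl(f(x) - f^*\bigr).
\]
% Subtract f^* from the first inequality and substitute the second:
\[
f(x_{k+1}) - f^* \;\le\; (1 - 2c\mu)\,\bigl(f(x_k) - f^*\bigr),
\]
% so by induction
\[
f(x_k) - f^* \;\le\; (1 - 2c\mu)^k \,\bigl(f(x_0) - f^*\bigr),
\]
% which is linear convergence whenever 0 < 2c\mu < 1.
```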

In MOOC
Introduction to optimization on smooth manifolds: first order methods
Learn to optimize on smooth, nonlinear spaces: build your foundations (starting at "what is a manifold?") and confidently implement your first algorithm (Riemannian gradient descent).
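Riemannian gradient descent, the MOOC's first algorithm, can be sketched in a few lines. The example below is an illustration assumed for this page (not taken from the course materials): minimizing the quadratic f(x) = xᵀAx over the unit sphere, whose minimum is the smallest eigenvalue of A. It uses projection onto the tangent space for the Riemannian gradient and normalization as the retraction.

```python
import numpy as np

# Hypothetical example: Riemannian gradient descent on the unit sphere
# S^{n-1} for f(x) = x^T A x. The minimizer is an eigenvector of A
# associated with its smallest eigenvalue.
rng = np.random.default_rng(0)
n = 5
B = rng.standard_normal((n, n))
A = (B + B.T) / 2  # random symmetric matrix

def f(x):
    return x @ A @ x

def riemannian_grad(x):
    g = 2 * A @ x             # Euclidean gradient of f
    return g - (x @ g) * x    # project onto the tangent space at x

def retract(x, v):
    y = x + v
    return y / np.linalg.norm(y)  # metric-projection retraction

x = rng.standard_normal(n)
x /= np.linalg.norm(x)            # random starting point on the sphere
step = 0.5 / np.linalg.norm(A, 2)  # safe constant step (spectral norm)
for _ in range(2000):
    x = retract(x, -step * riemannian_grad(x))

# f(x) approaches the smallest eigenvalue of A
print(f(x), np.linalg.eigvalsh(A)[0])
```

The projection retraction used here is one of several valid choices on the sphere; the course discusses retractions in general.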
Related lectures (31)
RTR practical aspects + tCG
Explores practical aspects of Riemannian trust-region optimization and introduces the truncated conjugate gradient method.
Riemannian Gradient Descent: Convergence Theorem and Line Search Method
Covers the convergence theorem of Riemannian Gradient Descent and the line search method.
Trust Region Methods: Why, with an Example (MOOC: Introduction to optimization on smooth manifolds: first order methods)
Introduces trust region methods and presents an example of Max-Cut Burer-Monteiro rank 2 optimization.
Geodesically Convex Optimization
Covers geodesically convex optimization on Riemannian manifolds, exploring convexity properties and minimization relationships.
Computing the Newton Step: Matrix-Based Approaches (MOOC: Introduction to optimization on smooth manifolds: first order methods)
Explores matrix-based approaches for computing the Newton step on a Riemannian manifold.
