Lecture

Trust Region Methods: Why, with an Example

Description

This lecture introduces trust-region methods, tracing their development from Levenberg's early work to modern trust-region methods on Riemannian manifolds. The instructor motivates the transition from gradient descent to Newton's method and presents a worked example: rank-2 Burer-Monteiro optimization for the Max-Cut problem.
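
To make the "why" concrete, here is a minimal sketch (our illustration, not the lecture's code) of a classical Euclidean trust-region loop. It solves the subproblem with the Cauchy point, accepts or rejects steps via the ratio of actual to predicted decrease, and adapts the radius accordingly; the function name trust_region, the Rosenbrock test problem, and all parameter defaults are illustrative assumptions.

```python
import numpy as np

def trust_region(f, grad, hess, x, delta=1.0, delta_max=10.0,
                 eta=0.1, tol=1e-6, max_iter=20000):
    """Minimal trust-region loop using the Cauchy point as subproblem solver.

    At each iterate, the quadratic model m(s) = f(x) + g's + 0.5 s'Hs is
    minimized along -g inside the ball ||s|| <= delta; the ratio of actual
    to predicted decrease drives step acceptance and the radius update.
    """
    for _ in range(max_iter):
        g, H = grad(x), hess(x)
        gnorm = np.linalg.norm(g)
        if gnorm < tol:
            break
        gHg = g @ H @ g
        # Cauchy step: exact minimizer of the model along -g within the ball.
        tau = 1.0 if gHg <= 0 else min(1.0, gnorm**3 / (delta * gHg))
        s = -(tau * delta / gnorm) * g
        predicted = -(g @ s + 0.5 * s @ H @ s)  # model decrease, > 0 when g != 0
        rho = (f(x) - f(x + s)) / predicted     # agreement between f and its model
        if rho < 0.25:
            delta *= 0.25                       # poor model fit: shrink the region
        elif rho > 0.75 and abs(np.linalg.norm(s) - delta) < 1e-12:
            delta = min(2.0 * delta, delta_max) # good fit, boundary hit: grow it
        if rho > eta:
            x = x + s                           # accept the step
    return x

# Illustrative run on the Rosenbrock function (all choices here are ours).
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
hess = lambda x: np.array([[2 - 400 * (x[1] - 3 * x[0]**2), -400 * x[0]],
                           [-400 * x[0], 200.0]])
print(trust_region(f, grad, hess, np.array([-1.2, 1.0])))  # approaches [1, 1]
```

Because the Cauchy step only moves along the negative gradient, this variant behaves much like steepest descent and progresses slowly on ill-conditioned problems; practical trust-region methods replace it with richer subproblem solvers such as truncated conjugate gradients (tCG), covered in the related lecture below.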

In MOOC
Introduction to optimization on smooth manifolds: first order methods
Learn to optimize on smooth, nonlinear spaces: Join us to build your foundations (starting at "what is a manifold?") and confidently implement your first algorithm (Riemannian gradient descent).
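
As a taste of that first algorithm, the following is a minimal sketch of Riemannian gradient descent on the unit sphere (not course material): the cost f(x) = x'Ax, the fixed step size, and the metric-projection retraction are all assumptions made for illustration.

```python
import numpy as np

def riemannian_gd(A, x0, step=0.05, iters=2000):
    """Riemannian gradient descent on the unit sphere S^{n-1}.

    Minimizes f(x) = x'Ax; minimizers are unit eigenvectors of A
    associated with its smallest eigenvalue.
    """
    x = x0 / np.linalg.norm(x0)
    for _ in range(iters):
        egrad = 2.0 * A @ x                  # Euclidean gradient of f
        rgrad = egrad - (x @ egrad) * x      # project onto the tangent space at x
        x = x - step * rgrad                 # step in the tangent direction
        x = x / np.linalg.norm(x)            # retract back to the sphere
    return x

rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = (M + M.T) / 2                            # random symmetric test matrix
x = riemannian_gd(A, rng.standard_normal(5))
print(x @ A @ x, np.linalg.eigvalsh(A)[0])   # the two values should roughly agree
```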
Related lectures (33)
RTR practical aspects + tCG
Explores practical aspects of Riemannian trust-region optimization and introduces the truncated conjugate gradient method.
Riemannian Gradient Descent: Convergence Theorem and Line Search Method
Covers the convergence theorem of Riemannian Gradient Descent and the line search method.
Geodesic Convexity: Basic Facts and Definitions (MOOC: Introduction to optimization on smooth manifolds: first order methods)
Explores geodesic convexity, focusing on properties of convex functions on manifolds.
Computing the Newton Step: Matrix-Based Approaches (MOOC: Introduction to optimization on smooth manifolds: first order methods)
Explores matrix-based approaches for computing the Newton step on a Riemannian manifold.
Optimality of Convergence Rates: Accelerated Gradient Descent
Explores the optimality of convergence rates in convex optimization, focusing on accelerated gradient descent and adaptive methods.
