
# Choosing a Step Size

Description

This lecture covers how to choose a step size in optimization on manifolds, focusing on a Lipschitz-like assumption about the objective function and the retraction operator. It discusses the backtracking line-search method and the sufficient-decrease condition, emphasizing that the Lipschitz constant is rarely known in practice. The instructor presents Armijo backtracking for gradient descent and gives an algorithm for selecting the step size. The lecture concludes with a theorem showing that, under the stated assumptions, the chosen step guarantees sufficient decrease.

Official source

This page is automatically generated and may contain information that is not correct, complete, up-to-date, or relevant to your search query. The same applies to every other page on this website. Please make sure to verify the information with EPFL's official sources.

In course

MATH-512: Optimization on manifolds

We develop, analyze and implement numerical algorithms to solve optimization problems of the form: min f(x), where x is a point on a smooth manifold. To this end, we first study differential and Riemannian geometry.

In MOOC

Introduction to optimization on smooth manifolds: first order methods

Learn to optimize on smooth, nonlinear spaces: join us to build your foundations (starting at "what is a manifold?") and confidently implement your first algorithm (Riemannian gradient descent).

Related concepts (34)

Manifold

In mathematics, a manifold is a topological space that locally resembles Euclidean space near each point. More precisely, an n-dimensional manifold, or n-manifold for short, is a topological space with the property that each point has a neighborhood that is homeomorphic to an open subset of n-dimensional Euclidean space. One-dimensional manifolds include lines and circles, but not lemniscates. Two-dimensional manifolds are also called surfaces. Examples include the plane, the sphere, and the torus, and also the Klein bottle and real projective plane.

Multi-objective optimization

Multi-objective optimization or Pareto optimization (also known as multi-objective programming, vector optimization, multicriteria optimization, or multiattribute optimization) is an area of multiple-criteria decision making that is concerned with mathematical optimization problems involving more than one objective function to be optimized simultaneously. Multi-objective is a type of vector optimization that has been applied in many fields of science, including engineering, economics and logistics where optimal decisions need to be taken in the presence of trade-offs between two or more conflicting objectives.

Differentiable manifold

In mathematics, a differentiable manifold (also differential manifold) is a type of manifold that is locally similar enough to a vector space to allow one to apply calculus. Any manifold can be described by a collection of charts (atlas). One may then apply ideas from calculus while working within the individual charts, since each chart lies within a vector space to which the usual rules of calculus apply. If the charts are suitably compatible (namely, the transition from one chart to another is differentiable), then computations done in one chart are valid in any other differentiable chart.

Gradient descent

In mathematics, gradient descent (also often called steepest descent) is an iterative optimization algorithm for finding a local minimum of a differentiable function. The idea is to take repeated steps in the opposite direction of the gradient (or approximate gradient) of the function at the current point, because this is the direction of steepest descent. Conversely, stepping in the direction of the gradient will lead to a local maximum of that function; the procedure is then known as gradient ascent.
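The repeated-steps idea can be sketched in a few lines. This is a minimal Euclidean version with a fixed step size chosen by hand for the example; the function and step size are illustrative assumptions.

```python
import numpy as np

def gradient_descent(grad, x0, step=0.1, iters=100):
    """Take repeated steps opposite the gradient from x0."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x - step * grad(x)
    return x

# Minimize f(x, y) = (x - 1)^2 + 4*y^2, whose gradient is (2(x - 1), 8y).
x_min = gradient_descent(lambda p: np.array([2.0 * (p[0] - 1.0), 8.0 * p[1]]),
                         [5.0, 3.0])
# converges to approximately [1, 0], the unique minimizer
```

A fixed step only works when it is small enough for the function at hand; the backtracking line search discussed in this lecture removes the need to guess it.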

Eight queens puzzle

The eight queens puzzle is the problem of placing eight chess queens on an 8×8 chessboard so that no two queens threaten each other; thus, a solution requires that no two queens share the same row, column, or diagonal. There are 92 solutions. The problem was first posed in the mid-19th century. In the modern era, it is often used as an example problem for various computer programming techniques. The eight queens puzzle is a special case of the more general n queens problem of placing n non-attacking queens on an n×n chessboard.
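Since the excerpt notes the puzzle is a standard programming exercise, here is one compact recursive backtracking sketch. The function name and the representation (one column index per row) are illustrative choices, not a canonical solution.

```python
def solve_n_queens(n):
    """Enumerate placements of n non-attacking queens via backtracking,
    placing one queen per row."""
    solutions = []
    cols = []  # cols[r] = column of the queen in row r

    def place(row):
        if row == n:
            solutions.append(tuple(cols))
            return
        for c in range(n):
            # Safe if no earlier queen shares this column or a diagonal.
            if all(c != pc and abs(c - pc) != row - pr
                   for pr, pc in enumerate(cols)):
                cols.append(c)
                place(row + 1)
                cols.pop()

    place(0)
    return solutions
```

For n = 8 this enumerates the 92 solutions mentioned in the excerpt.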

Related lectures (59)

Computing the Newton Step: GD as a Matrix-Free Way
MATH-512: Optimization on manifolds | MOOC: Introduction to optimization on smooth manifolds: first order methods

Explores matrix-based and matrix-free approaches for computing the Newton step in optimization on manifolds.

Comparing Tangent Vectors: Three Reasons Why
MATH-512: Optimization on manifolds | MOOC: Introduction to optimization on smooth manifolds: first order methods

Explores the importance of comparing tangent vectors at different points using algorithms and finite differences.

Transporters: a proxy for parallel transport
MATH-512: Optimization on manifolds | MOOC: Introduction to optimization on smooth manifolds: first order methods

Explores transporters as a practical alternative to parallel transport, discussing minimal requirements, examples with matrices, pragmatic choices, and optimization algorithms.

Computing the Newton Step: From GD to CG
MATH-512: Optimization on manifolds | MOOC: Introduction to optimization on smooth manifolds: first order methods

Covers the transition from Gradient Descent to Conjugate Gradients, highlighting the efficiency of CG over GD in optimization on manifolds.

RTR practical aspects + tCG
MATH-512: Optimization on manifolds

Explores practical aspects of Riemannian trust-region optimization and introduces the truncated conjugate gradient method.