Riemannian Gradient Descent: Convergence Theorem and Line Search Method
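As a concrete illustration of the lecture's topic, below is a minimal sketch of Riemannian gradient descent with Armijo backtracking line search, applied to a Rayleigh-quotient cost on the unit sphere. The cost, the normalization retraction, and all parameter values are illustrative assumptions, not material taken from the lecture itself.

```python
import numpy as np

# Sketch: Riemannian gradient descent with Armijo backtracking on the
# unit sphere S^{n-1}, minimizing f(x) = x^T A x (a Rayleigh quotient).
# The minimizer is an eigenvector for the smallest eigenvalue of A.
# Parameter values (alpha0, beta, sigma) are illustrative assumptions.

rng = np.random.default_rng(0)
n = 5
A = rng.standard_normal((n, n))
A = (A + A.T) / 2  # symmetric cost matrix

def f(x):
    return x @ A @ x

def riemannian_grad(x):
    # Project the Euclidean gradient 2*A*x onto the tangent space at x.
    g = 2 * A @ x
    return g - (x @ g) * x

def retract(x, v):
    # Metric-projection retraction: step in the tangent direction,
    # then normalize back onto the sphere.
    y = x + v
    return y / np.linalg.norm(y)

x = rng.standard_normal(n)
x /= np.linalg.norm(x)

alpha0, beta, sigma = 1.0, 0.5, 1e-4
for _ in range(200):
    g = riemannian_grad(x)
    gnorm2 = g @ g
    if gnorm2 < 1e-12:
        break
    alpha = alpha0
    # Armijo backtracking: shrink alpha until sufficient decrease holds,
    # i.e. f(R_x(-alpha g)) <= f(x) - sigma * alpha * ||g||^2.
    while f(retract(x, -alpha * g)) > f(x) - sigma * alpha * gnorm2:
        alpha *= beta
    x = retract(x, -alpha * g)

print("f(x) =", f(x), " smallest eigenvalue =", np.linalg.eigvalsh(A)[0])
```

The Armijo sufficient-decrease condition enforced in the inner loop is the standard hypothesis under which convergence theorems for Riemannian gradient descent guarantee that the gradient norm tends to zero.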
Related lectures (32)
Riemannian metrics and gradients: Why and definition of Riemannian manifolds
Covers Riemannian metrics, gradients, vector fields, and inner products on manifolds.
Geodesically Convex Optimization
Covers geodesically convex optimization on Riemannian manifolds, exploring convexity properties and minimization relationships.
Optimization on Manifolds
Covers optimization on manifolds, focusing on smooth manifolds and functions, and the process of gradient descent.
Computing the Newton Step: Matrix-Based Approaches
Explores matrix-based approaches for computing the Newton step on a Riemannian manifold.
Hands on with Manopt: Optimization on Manifolds
Introduces Manopt, a toolbox for optimization on smooth manifolds with a Riemannian structure, covering cost functions, different types of manifolds, and optimization principles.
Riemannian distance, geodesically convex sets
Covers the structure of Riemannian manifolds, geodesic convexity, and the Riemannian distance function.
Riemannian Hessians: Definition and Example
Covers the definition and computation of Riemannian Hessians on manifolds.
Taylor expansions: second order
Explores Taylor expansions and retractions on Riemannian manifolds, emphasizing second-order approximations and covariant derivatives.
Riemannian Hessians: Connections and Symmetry
Covers connections on manifolds, symmetric connections, Lie brackets, and compatibility with the metric in Riemannian geometry.
Newton's method on Riemannian manifolds
Covers Newton's method on Riemannian manifolds, focusing on second-order optimality conditions and quadratic convergence.
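As a companion to the last entry above, here is a minimal sketch of one Riemannian Newton iteration on the unit sphere, again for a Rayleigh-quotient cost. The projected-Hessian formula used here is the standard one for a sphere embedded in Euclidean space; the cost and all parameter choices are illustrative assumptions, not the lecture's own code.

```python
import numpy as np

# Sketch: Riemannian Newton's method for f(x) = x^T A x on the unit
# sphere S^{n-1}. Critical points are eigenvectors of A, and Newton's
# method converges locally quadratically to nondegenerate ones
# (not necessarily the global minimizer).

rng = np.random.default_rng(1)
n = 5
A = rng.standard_normal((n, n))
A = (A + A.T) / 2

def newton_step(x):
    P = np.eye(n) - np.outer(x, x)   # projector onto the tangent space at x
    grad = P @ (2 * A @ x)           # Riemannian gradient
    # Riemannian Hessian on the embedded sphere:
    # Hess f(x)[v] = P(2 A v) - (x^T 2 A x) v for tangent v.
    H = P @ (2 * A) @ P - (2 * x @ A @ x) * P
    # Solve Hess[v] = -grad on the tangent space; lstsq returns the
    # minimum-norm solution, which handles the rank deficiency along
    # the normal direction (H x = 0).
    v, *_ = np.linalg.lstsq(H, -grad, rcond=None)
    v = P @ v                        # keep the step tangent
    y = x + v
    return y / np.linalg.norm(y)     # retract back onto the sphere

x = rng.standard_normal(n)
x /= np.linalg.norm(x)
for _ in range(10):
    x = newton_step(x)
print("Rayleigh quotient:", x @ A @ x)
print("Eigenvalues of A: ", np.linalg.eigvalsh(A))
```

After a few iterations the Rayleigh quotient should match one of the eigenvalues of A, reflecting quadratic local convergence to a critical point; a practical implementation would add safeguards (e.g. a trust region or line search) for global behavior.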