Riemannian Gradient Descent: Convergence Theorem and Line Search Method
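This page collects lectures related to Riemannian gradient descent. As a concrete anchor for the topic, here is a minimal sketch of the method the title refers to: gradient descent on an embedded submanifold with an Armijo backtracking line search and a retraction, illustrated on the unit sphere. The cost (a Rayleigh quotient), the function names, and the parameter values are illustrative assumptions, not material taken from the lecture itself.

```python
import numpy as np

def proj(x, v):
    # Orthogonal projector onto the tangent space of the unit sphere at x:
    # P_x(v) = (I - x x^T) v.
    return v - np.dot(x, v) * x

def retract(x, v):
    # Metric-projection retraction: move in the ambient space, renormalize.
    y = x + v
    return y / np.linalg.norm(y)

def rgd_armijo(f, egrad, x0, alpha0=1.0, beta=0.5, sigma=1e-4,
               tol=1e-8, max_iter=500):
    """Riemannian gradient descent on the unit sphere with Armijo backtracking."""
    x = x0 / np.linalg.norm(x0)
    for _ in range(max_iter):
        g = proj(x, egrad(x))          # Riemannian gradient at x
        gnorm2 = np.dot(g, g)
        if gnorm2 < tol ** 2:
            break
        alpha, fx = alpha0, f(x)
        # Armijo sufficient-decrease condition along the retracted direction -g.
        while f(retract(x, -alpha * g)) > fx - sigma * alpha * gnorm2:
            alpha *= beta
        x = retract(x, -alpha * g)
    return x

# Example: minimizing the Rayleigh quotient f(x) = x^T A x over the sphere
# converges to a unit eigenvector for the smallest eigenvalue of A.
A = np.diag([3.0, 1.0, 2.0])
x_min = rgd_armijo(lambda x: x @ A @ x, lambda x: 2.0 * A @ x,
                   np.array([1.0, 1.0, 1.0]))
```

Under standard assumptions (sufficient smoothness of the cost along the retraction), iterates of this scheme drive the Riemannian gradient norm to zero, which is roughly the kind of convergence statement the lecture's title announces.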
Related lectures (32)
Riemannian connections: What they are and why we care
Covers Riemannian connections, emphasizing their properties and significance in geometry.
Symmetry Property: Riemannian Connection in Geometry
Explores symmetries, the Riemannian connection, vector fields, and the Lie bracket in geometry.
Riemannian metrics and gradients: Computing gradients from extensions
Explores computing gradients on Riemannian manifolds from smooth extensions, emphasizing orthogonal projectors and retractions (see the first sketch after this list).
Differentiating Vector Fields: How Not to Do It
Discusses the challenges in differentiating vector fields on submanifolds and the importance of choosing the right method.
Differentiating vector fields: Why do it?
Explains why differentiating vector fields matters and how to do it correctly, emphasizing the need to go beyond first-order information.
Newton's method: Optimization on manifolds
Explores Newton's method for optimizing functions on manifolds using second-order information, and discusses its drawbacks and fixes (see the Newton sketch after this list).
Gradients on Riemannian submanifolds, local frames
Discusses gradients on Riemannian submanifolds and the construction of local frames.
All things Riemannian: metrics, (sub)manifolds and gradients
Covers the definition of retraction, open submanifolds, local defining functions, tangent spaces, and Riemannian metrics.
Choosing a Step Size
Explores choosing a step size in optimization on manifolds, including backtracking line-search and the Armijo method (illustrated in the sketch at the top of this page).
Riemannian metrics and gradients: Riemannian gradients
Explains Riemannian submanifolds, metrics, and gradient computation on manifolds.
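The entry "Riemannian metrics and gradients: Computing gradients from extensions" above describes a standard recipe for embedded submanifolds: extend the cost smoothly to the ambient space, take its Euclidean gradient, and orthogonally project onto the tangent space. Below is a minimal sketch of that recipe on the Stiefel manifold St(n, p) = {X in R^{n x p} : X^T X = I_p}; the cost f(X) = trace(X^T A X) and all helper names are illustrative assumptions.

```python
import numpy as np

def stiefel_projection(X, V):
    # Orthogonal projector onto the tangent space of St(n, p) at X
    # (w.r.t. the embedded metric): P_X(V) = V - X sym(X^T V),
    # where sym(M) = (M + M^T) / 2.
    XtV = X.T @ V
    return V - X @ (XtV + XtV.T) / 2

def riemannian_gradient(egrad, X):
    # On an embedded submanifold, the Riemannian gradient is the projection
    # of the Euclidean gradient of any smooth extension onto the tangent space.
    return stiefel_projection(X, egrad(X))

# Example: f(X) = trace(X^T A X) with symmetric A has Euclidean gradient 2 A X.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
A = (A + A.T) / 2                                  # symmetrize
X, _ = np.linalg.qr(rng.standard_normal((5, 2)))   # a point on St(5, 2)
G = riemannian_gradient(lambda X: 2 * A @ X, X)

# Sanity check: G is tangent at X, i.e. X^T G + G^T X = 0.
assert np.allclose(X.T @ G + G.T @ X, 0)
```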
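A second sketch, for the entry "Newton's method: Optimization on manifolds" above: the Riemannian Newton step solves Hess f(x)[v] = -grad f(x) in the tangent space and then retracts. The example uses the standard gradient and Hessian expressions for f(x) = x^T A x on the unit sphere; the function name and the plain normalization retraction are assumptions. Note that Newton's method converges to whichever critical point is nearby, saddle points included, which is among the drawbacks that lecture discusses.

```python
import numpy as np

def riemannian_newton_sphere(A, x0, n_steps=10):
    """Riemannian Newton's method for f(x) = x^T A x on the unit sphere.

    Standard expressions for this embedded submanifold:
      grad f(x)    = P_x (2 A x)
      Hess f(x)[v] = P_x (2 A v) - 2 (x^T A x) v   for tangent v,
    where P_x = I - x x^T projects onto the tangent space at x.
    """
    n = A.shape[0]
    x = x0 / np.linalg.norm(x0)
    for _ in range(n_steps):
        P = np.eye(n) - np.outer(x, x)
        grad = P @ (2 * A @ x)
        # Matrix of the Hessian viewed as an operator on the tangent space.
        H = P @ (2 * A) @ P - 2 * (x @ A @ x) * P
        # Solve the Newton system H v = -grad; lstsq handles the rank
        # deficiency of H along the normal direction x.
        v, *_ = np.linalg.lstsq(H, -grad, rcond=None)
        v = P @ v                      # enforce tangency
        y = x + v                      # retract: renormalize to the sphere
        x = y / np.linalg.norm(y)
    return x
```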