Lecture
Riemannian Gradient Descent: Convergence Theorem and Line Search Method
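The lecture title names Riemannian gradient descent with a line search. As a rough illustration only (not the lecture's own material), the following is a minimal sketch of the method on the unit sphere, using the metric-projection retraction (step then renormalize) and Armijo backtracking; the sphere, the retraction, the Armijo parameters, and the eigenvector test problem are all illustrative assumptions.

```python
import numpy as np

def rgd_sphere(f, grad_f, x0, alpha0=1.0, beta=0.5, sigma=1e-4,
               max_iter=200, tol=1e-8):
    """Riemannian gradient descent on the unit sphere S^{n-1}
    with Armijo backtracking line search (illustrative sketch).

    f      : objective restricted to the sphere
    grad_f : Euclidean gradient of f
    x0     : starting point (will be normalized)
    """
    x = x0 / np.linalg.norm(x0)
    for _ in range(max_iter):
        g = grad_f(x)
        # Riemannian gradient: project the Euclidean gradient onto T_x S^{n-1}
        rgrad = g - (x @ g) * x
        gnorm2 = rgrad @ rgrad
        if gnorm2 < tol**2:
            break
        # Backtracking (Armijo) line search along the retracted direction -rgrad
        t, fx = alpha0, f(x)
        while True:
            y = x - t * rgrad
            x_new = y / np.linalg.norm(y)   # retraction: renormalize onto the sphere
            if f(x_new) <= fx - sigma * t * gnorm2 or t < 1e-12:
                break
            t *= beta
        x = x_new
    return x

# Toy example: leading eigenvector of a symmetric matrix (minimize -x^T A x on the sphere)
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5)); A = (A + A.T) / 2
x_star = rgd_sphere(lambda x: -x @ A @ x, lambda x: -2 * A @ x,
                    rng.standard_normal(5))
```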
Related lectures (32)
Comparing Tangent Vectors: Three Reasons Why
Explores why tangent vectors at different points need to be compared, with motivation from algorithms and finite differences.
Computing the Newton Step: GD as a Matrix-Free Way
Explores matrix-based and matrix-free approaches for computing the Newton step in optimization on manifolds.
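The last summary mentions matrix-free computation of the Newton step. As a hedged sketch of that general idea (not the lecture's own derivation), one can solve H v = -grad using only Hessian-vector products by running gradient descent on the quadratic model m(v) = 0.5 v^T H v + grad^T v, whose minimizer is the Newton step when H is positive definite; the function names, tolerances, and the dense test Hessian below are illustrative assumptions.

```python
import numpy as np

def newton_step_matrix_free(grad, hess_vec, max_iter=200):
    """Approximate the Newton step v with H v = -grad, accessing H only
    through Hessian-vector products. Gradient descent on the quadratic model
    m(v) = 0.5 v^T H v + grad^T v, with exact line search per step."""
    v = np.zeros_like(grad)
    for _ in range(max_iter):
        r = hess_vec(v) + grad          # model gradient at v (also the residual of H v = -grad)
        if np.linalg.norm(r) < 1e-10:
            break
        Hr = hess_vec(r)
        denom = r @ Hr
        if denom <= 0:
            break                       # H not positive definite along r; stop
        v -= ((r @ r) / denom) * r      # exact line search step for a quadratic
    return v

# Toy check against a dense solve (the dense H is used only for verification)
rng = np.random.default_rng(1)
M = rng.standard_normal((6, 6)); H = M @ M.T + 6 * np.eye(6)   # SPD Hessian
g = rng.standard_normal(6)
v = newton_step_matrix_free(g, lambda w: H @ w)
assert np.allclose(v, np.linalg.solve(H, -g), atol=1e-6)
```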