Lecture
Linear Systems: Iterative Methods
Related lectures (29)
Convergence Criteria: Necessary Conditions
Explains necessary conditions for convergence in optimization problems.
Direct and Iterative Methods for Linear Equations
Explores direct and iterative methods for solving linear equations, emphasizing symmetric matrices and computational cost.
Preconditioned Richardson Method
Covers the Preconditioned Richardson Method for solving linear systems and the impact of preconditioning on convergence (a minimal sketch of the iteration appears after this list).
Newton's Method: Optimization Techniques
Explores optimization techniques such as gradient descent, line search, and Newton's method.
TR global convergence (end) + CG
Covers global convergence of the trust-region method and introduces the truncated conjugate gradient (tCG) method (see the conjugate gradient sketch after this list).
Optimization Methods in Machine Learning
Explores optimization methods in machine learning, emphasizing gradients, cost functions, and the computational effort required for efficient model training.
Iterative Solvers: Theory and Comparison
Explores solving linear systems iteratively and compares different solvers based on worst-case assumptions and convergence measures.
RTR practical aspects + tCG
Explores practical aspects of Riemannian trust-region optimization and introduces the truncated conjugate gradient method.
Descent methods and line search: Preconditioned steepest descent
Introduces preconditioning in optimization problems and explains steepest descent iteration.
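
The lecture on the Preconditioned Richardson Method above names a concrete iteration, x_{k+1} = x_k + alpha * P^{-1} (b - A x_k). As a rough illustration, here is a minimal NumPy sketch; it is not code from the lecture, and the test matrix, the Jacobi preconditioner P = diag(A), the step alpha = 1, and the tolerance are assumptions made for the example.

import numpy as np

def preconditioned_richardson(A, b, P, alpha=1.0, tol=1e-10, max_iter=500):
    """Stationary Richardson iteration: x <- x + alpha * P^{-1} (b - A x)."""
    x = np.zeros_like(b)
    for k in range(max_iter):
        r = b - A @ x                              # current residual
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            return x, k                            # relative residual small enough
        z = np.linalg.solve(P, r)                  # apply preconditioner: solve P z = r
        x = x + alpha * z
    return x, max_iter

# Small symmetric positive definite test system (made up for the example),
# with the Jacobi preconditioner P = diag(A).
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 4.0, 1.0],
              [0.0, 1.0, 4.0]])
b = np.array([1.0, 2.0, 3.0])
P = np.diag(np.diag(A))

x, iters = preconditioned_richardson(A, b, P)
print(f"converged in {iters} iterations, residual norm {np.linalg.norm(b - A @ x):.2e}")

The point of the preconditioner is that the iteration matrix I - alpha * P^{-1} A should have a small spectral radius; that is the sense in which preconditioning improves convergence in the lectures above.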
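Several of the lectures above introduce the (truncated) conjugate gradient method. Below is a minimal sketch of the plain conjugate gradient iteration for a symmetric positive definite system; it is an illustrative assumption rather than the lectures' own code, and it reuses the small made-up test system from the Richardson sketch.

import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """Plain (unpreconditioned) conjugate gradient for symmetric positive definite A."""
    n = b.shape[0]
    if max_iter is None:
        max_iter = 5 * n                           # a few extra iterations to absorb rounding
    x = np.zeros(n)
    r = b - A @ x                                  # initial residual
    p = r.copy()                                   # initial search direction
    rs_old = r @ r
    for k in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)                  # exact line search along p
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) <= tol * np.linalg.norm(b):
            return x, k + 1                        # relative residual small enough
        p = r + (rs_new / rs_old) * p              # next A-conjugate search direction
        rs_old = rs_new
    return x, max_iter

# Same small made-up test system as in the Richardson sketch.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 4.0, 1.0],
              [0.0, 1.0, 4.0]])
b = np.array([1.0, 2.0, 3.0])

x, iters = conjugate_gradient(A, b)
print(f"CG converged in {iters} iterations, residual norm {np.linalg.norm(b - A @ x):.2e}")

In exact arithmetic CG terminates in at most n iterations; in practice it is stopped by the relative residual test, and preconditioning plays the same role as in the Richardson iteration.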