Richardson Method: Preconditioned Iterative Solvers
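In the lecture's setting, the preconditioned Richardson method solves a linear system Ax = b with the fixed-point update x^(k+1) = x^(k) + alpha P^{-1}(b - A x^(k)), where P is the preconditioner and alpha a relaxation parameter; the iteration converges when the spectral radius of I - alpha P^{-1} A is strictly below 1. A minimal NumPy sketch under those assumptions, using a diagonal (Jacobi) preconditioner; the function name, step size, and tolerance are illustrative choices, not taken from the lecture:

```python
import numpy as np

def richardson(A, b, P, alpha=1.0, tol=1e-8, max_iter=1000):
    """Preconditioned Richardson iteration: x <- x + alpha * P^{-1} r."""
    x = np.zeros_like(b, dtype=float)
    r = b - A @ x
    for _ in range(max_iter):
        z = np.linalg.solve(P, r)       # preconditioning step: solve P z = r
        x = x + alpha * z               # relaxation with step size alpha
        r = b - A @ x                   # refresh the residual
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            break
    return x

# Example: small SPD system with a diagonal (Jacobi) preconditioner
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
P = np.diag(np.diag(A))                 # P = diag(A)
x = richardson(A, b, P)                 # converges: rho(I - P^{-1} A) < 1 here
```

With P = I this reduces to the stationary, unpreconditioned Richardson iteration; choosing P closer to A clusters the spectrum of P^{-1}A and speeds up convergence.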
Related lectures (29)
Conjugate Gradient Method: Iterative Optimization
Covers the conjugate gradient method, stopping criteria, and convergence properties in iterative optimization.
Linear Systems: Convergence and Methods
Explores linear systems and methods for solving them, focusing on convergence, CPU time, and memory requirements.
Numerical Analysis: Linear Systems
Covers the analysis of linear systems, focusing on methods such as Jacobi and Richardson for solving linear equations.
Jacobi and Gauss-Seidel methods
Explains the Jacobi and Gauss-Seidel methods for solving linear systems iteratively (a sketch of both follows this list).
Linear Systems: Iterative Methods
Explores linear systems and iterative methods such as gradient descent and conjugate gradient for solving them efficiently.
Iterative Methods for Linear Equations
Covers iterative methods for solving linear equations and analyzing convergence, including error control and positive definite matrices.
Iterative Methods: Linear Systems
Covers iterative methods for solving linear systems and discusses convergence criteria and spectral radius.
Gradient Descent
Covers the concept of gradient descent in scalar cases, focusing on finding the minimum of a function by iteratively moving in the direction of the negative gradient.
Convergence Analysis: Iterative Methods
Covers convergence analysis for iterative methods and the conditions under which they converge.
Optimization without Constraints: Gradient Method
Covers unconstrained optimization using the gradient method to find a function's minimum.
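Several of the related lectures above treat the Jacobi and Gauss-Seidel iterations. A minimal sketch of both, assuming A has a nonzero diagonal (both are guaranteed to converge for strictly diagonally dominant matrices); function names and tolerances are illustrative, not from the lectures:

```python
import numpy as np

def jacobi(A, b, tol=1e-8, max_iter=1000):
    """Jacobi: x <- D^{-1} (b - (A - D) x), all components updated from the old x."""
    D = np.diag(A)                      # diagonal entries of A
    R = A - np.diag(D)                  # off-diagonal part L + U
    x = np.zeros_like(b, dtype=float)
    for _ in range(max_iter):
        x_new = (b - R @ x) / D
        if np.linalg.norm(x_new - x) <= tol:
            return x_new
        x = x_new
    return x

def gauss_seidel(A, b, tol=1e-8, max_iter=1000):
    """Gauss-Seidel: sweep through components, reusing already-updated values."""
    n = len(b)
    x = np.zeros_like(b, dtype=float)
    for _ in range(max_iter):
        x_old = x.copy()
        for i in range(n):
            s = A[i, :i] @ x[:i] + A[i, i + 1:] @ x_old[i + 1:]
            x[i] = (b[i] - s) / A[i, i]
        if np.linalg.norm(x - x_old) <= tol:
            break
    return x
```

Note that Jacobi with the update written as x + D^{-1}(b - Ax) is exactly the Richardson iteration above with P = diag(A) and alpha = 1; Gauss-Seidel typically needs fewer sweeps because each component update uses the freshest values.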