Iterative Solvers: Theory and Comparison
Related lectures (30)
Conjugate Gradient Method: Iterative Optimization
Covers the conjugate gradient method, stopping criteria, and convergence properties in iterative optimization.
Preconditioned Richardson Method
Covers the Preconditioned Richardson Method for solving linear systems and the impact of preconditioning on convergence.
Convergence Analysis: Iterative Methods
Covers the convergence analysis of iterative methods and the conditions for convergence.
Construction of an Iterative Method
Covers the construction of an iterative method for linear systems, emphasizing matrix decomposition and convexity.
Eigenvalues and Eigenvectors
Explores eigenvalues, eigenvectors, and methods for solving linear systems with a focus on rounding errors and preconditioning matrices.
The Conjugate Gradients Method (CG)
Covers the Conjugate Gradient (CG) method for solving linear systems iteratively, including its convergence properties, and emphasizes the importance of linear independence among the conjugate directions; a minimal CG sketch follows this list.
Construction of an Iterative Method
Covers the construction of an iterative method for linear systems by decomposing a matrix A into P, T, and P_A.
Linear Systems: Direct Methods
Covers the formulation of linear systems, direct and iterative methods for solving them, and the cost of LU factorization.
Richardson Method: Preconditioned Iterative Solvers
Covers the Richardson method for solving linear systems with preconditioned iterative solvers and introduces the gradient method; a minimal preconditioned Richardson sketch follows this list.
Iterative Methods for Linear Equations
Explores iterative methods for linear equations, including Jacobi and Gauss-Seidel methods, convergence criteria, and the conjugate gradient method.
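The splitting-based solvers mentioned in the lectures above (Richardson, Jacobi, Gauss-Seidel) share the same update pattern: correct the current iterate with a preconditioned residual. The sketch below is illustrative only and not taken from any of these lectures; it assumes NumPy, and the function name, tolerance, and example matrix are assumptions. Choosing P = diag(A) recovers the Jacobi method as a special case.

```python
# Minimal sketch (illustrative, not the lectures' code) of the stationary,
# preconditioned Richardson iteration
#     x_{k+1} = x_k + P^{-1} (b - A x_k)
import numpy as np

def preconditioned_richardson(A, b, P, x0=None, tol=1e-8, max_iter=500):
    """Solve A x = b by the preconditioned Richardson iteration (hypothetical helper)."""
    x = np.zeros_like(b) if x0 is None else x0.copy()
    for k in range(max_iter):
        r = b - A @ x                                   # residual
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            return x, k                                 # relative residual small enough
        x = x + np.linalg.solve(P, r)                   # apply P^{-1} to the residual
    return x, max_iter

# Example (assumed data): Jacobi as a special case, P = diagonal of A
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 4.0, 1.0],
              [0.0, 1.0, 4.0]])
b = np.array([1.0, 2.0, 3.0])
x, iters = preconditioned_richardson(A, b, P=np.diag(np.diag(A)))
print(iters, np.linalg.norm(A @ x - b))
```

The iteration converges when the spectral radius of the iteration matrix I - P^{-1}A is below one, which is the kind of condition analyzed in the convergence-analysis lectures; a better preconditioner P shrinks that radius and reduces the iteration count.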
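The two Conjugate Gradient lectures above concern a Krylov method for symmetric positive definite systems. The following is a minimal sketch under those assumptions (NumPy, SPD matrix A, relative-residual stopping test); the function name and tolerance are illustrative choices, not the lectures' own code.

```python
# Minimal sketch (illustrative) of the Conjugate Gradient method for SPD A x = b.
import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-10, max_iter=None):
    n = b.shape[0]
    max_iter = n if max_iter is None else max_iter      # exact arithmetic: at most n steps
    x = np.zeros_like(b) if x0 is None else x0.copy()
    r = b - A @ x                                       # initial residual
    p = r.copy()                                        # first search direction
    rs_old = r @ r
    for k in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)                       # optimal step length along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) <= tol * np.linalg.norm(b):
            return x, k + 1                             # converged
        p = r + (rs_new / rs_old) * p                   # next A-conjugate direction
        rs_old = rs_new
    return x, max_iter
```

In exact arithmetic CG terminates in at most n iterations because the search directions are A-conjugate and therefore linearly independent; in floating-point practice it is used as an iterative method with a stopping criterion such as the relative residual test above.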