Richardson Method: Preconditioned Iterative Solvers
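The preconditioned Richardson iteration named in the title updates x(k+1) = x(k) + α P⁻¹(b − A x(k)) for some preconditioner P. A minimal sketch, assuming a symmetric positive definite matrix and using a Jacobi (diagonal) preconditioner purely for illustration — this is not the lecture's own code:

```python
import numpy as np

def richardson(A, b, alpha=1.0, tol=1e-10, max_iter=1000):
    """Preconditioned Richardson: x <- x + alpha * P^{-1} (b - A x).

    P is the diagonal of A (Jacobi preconditioner), an illustrative choice.
    """
    P_inv = 1.0 / np.diag(A)            # inverse of the diagonal preconditioner
    x = np.zeros_like(b, dtype=float)
    for _ in range(max_iter):
        r = b - A @ x                   # residual
        if np.linalg.norm(r) < tol:
            break
        x = x + alpha * (P_inv * r)     # preconditioned update
    return x

# Small SPD example (made up for the sketch)
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = richardson(A, b)
```

With α = 1 and this diagonal P the scheme reduces to the Jacobi method; the iteration converges when the spectral radius of I − αP⁻¹A is below 1, which is what a good preconditioner is chosen to ensure.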
Related lectures (29)
Conjugate Gradient Methods: Overview
Provides an overview of conjugate gradient methods, including preconditioning, nonlinear conjugate gradient, and singular value decomposition.
Energy Equilibrium and Newton CG Method
Covers continuum mechanics, linear elasticity, force balance, divergence, finite element discretization, energy minimization, and Newton's method.
Conjugate Gradient Method
Covers the Conjugate Gradient method for solving linear systems efficiently.
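A minimal sketch of the plain conjugate gradient iteration this lecture covers, for a symmetric positive definite system (illustrative; the test matrix is made up):

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=100):
    """Plain CG for SPD A: search directions are made A-conjugate."""
    x = np.zeros_like(b, dtype=float)
    r = b - A @ x                       # initial residual
    p = r.copy()                        # initial search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)       # exact line search along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p   # next A-conjugate direction
        rs_old = rs_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b)
```

In exact arithmetic CG terminates in at most n steps for an n×n system, which is why it solves this 2×2 example in two iterations.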
Regression & Linear Systems
Covers the principles of regression and linear systems, focusing on iterative methods.
Iterative Solvers: Theory and Comparison
Explores solving linear systems iteratively and compares different solvers based on worst-case assumptions and convergence measures.
Gradient Descent: Linear Regression
Covers the concept of gradient descent for linear regression, explaining the iterative process of updating parameters.
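The iterative parameter update this lecture describes can be sketched as gradient descent on the least-squares loss (the data, learning rate, and iteration count below are illustrative assumptions):

```python
import numpy as np

def gd_linear_regression(X, y, lr=0.1, n_iter=2000):
    """Gradient descent on the loss (1/2n) * ||X w - y||^2."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y) / n    # gradient of the least-squares loss
        w -= lr * grad                  # iterative parameter update
    return w

# Synthetic data with known weights, for illustration only
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
y = X @ np.array([2.0, -1.0])
w = gd_linear_regression(X, y)
```

The step size must be small relative to the largest eigenvalue of X.T @ X / n for the iteration to converge; here the features are roughly unit-variance, so lr = 0.1 is safely below that threshold.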
Iterative Methods for Linear Equations
Explores iterative methods for linear equations, including Jacobi and Gauss-Seidel methods, convergence criteria, and the conjugate gradient method.
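Minimal sketches of the Jacobi and Gauss-Seidel methods this lecture explores (illustrative; both converge here because the example matrix is strictly diagonally dominant):

```python
import numpy as np

def jacobi(A, b, n_iter=100):
    """Jacobi: every component is updated from the previous iterate."""
    D = np.diag(A)                      # diagonal entries
    R = A - np.diagflat(D)              # off-diagonal part
    x = np.zeros_like(b, dtype=float)
    for _ in range(n_iter):
        x = (b - R @ x) / D             # simultaneous component update
    return x

def gauss_seidel(A, b, n_iter=100):
    """Gauss-Seidel: updates reuse components already computed this sweep."""
    n = len(b)
    x = np.zeros_like(b, dtype=float)
    for _ in range(n_iter):
        for i in range(n):
            s = A[i] @ x - A[i, i] * x[i]   # sum over j != i
            x[i] = (b[i] - s) / A[i, i]
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
xj = jacobi(A, b)
xg = gauss_seidel(A, b)
```

The only structural difference is that Gauss-Seidel uses fresh values within a sweep, which typically halves the number of iterations Jacobi needs on the same system.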
Mathematics of Data: Computation Role
Explores the role of computation in the mathematics of data, focusing on iterative methods, optimization, estimators, and descent principles.
Quasi-Newton Methods
Introduces Quasi-Newton methods for optimization, explaining their advantages over traditional approaches like Gradient Descent and Newton's Method.