Descent Methods and Newton Method with Line Search
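As an illustration of the lecture's titular topic, here is a minimal sketch of a descent method with a backtracking (Armijo) line search, using either the steepest-descent direction or the Newton direction when second-order information is available. The test function, parameter values, and function names are assumptions for demonstration, not material from the lecture itself.

```python
# Illustrative sketch: descent method with backtracking (Armijo) line search.
# All names and the test function are assumptions, not the lecture's code.
import numpy as np

def backtracking(f, grad, x, d, alpha=1.0, rho=0.5, c=1e-4):
    """Shrink the step alpha until the Armijo sufficient-decrease condition holds."""
    while f(x + alpha * d) > f(x) + c * alpha * grad(x) @ d:
        alpha *= rho
    return alpha

def descent(f, grad, x0, hess=None, tol=1e-8, max_iter=100):
    """Minimize f: use the Newton direction -H^{-1} g when a Hessian is given,
    otherwise the steepest-descent direction -g."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        d = -np.linalg.solve(hess(x), g) if hess is not None else -g
        if g @ d >= 0:          # fall back if Newton step is not a descent direction
            d = -g
        x = x + backtracking(f, grad, x, d) * d
    return x

# Example: minimize the Rosenbrock function (true minimum at (1, 1)).
f    = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                           200 * (x[1] - x[0]**2)])
print(descent(f, grad, x0=[-1.2, 1.0], max_iter=5000))
```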
Related lectures (27)
Conjugate Gradient Method: Iterative Optimization
Covers the conjugate gradient method, stopping criteria, and convergence properties in iterative optimization.
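A minimal sketch of the conjugate gradient method for a symmetric positive definite system, with the relative-residual stopping criterion the description mentions; the matrix, tolerance, and names below are illustrative assumptions, not the lecture's own code.

```python
# Conjugate gradient sketch for an SPD system Ax = b.
import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-10, max_iter=None):
    n = len(b)
    x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float)
    r = b - A @ x                  # residual
    p = r.copy()                   # first search direction
    rs = r @ r
    for _ in range(max_iter or n):
        if np.sqrt(rs) <= tol * np.linalg.norm(b):   # stop on small relative residual
            break
        Ap = A @ p
        alpha = rs / (p @ Ap)      # exact line search along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        p = r + (rs_new / rs) * p  # A-conjugate update of the direction
        rs = rs_new
    return x

# In exact arithmetic CG terminates in at most n steps on an n x n SPD matrix.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(conjugate_gradient(A, b))    # ~ [0.0909, 0.6364]
```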
Convergence Analysis: Iterative Methods
Covers the convergence analysis of iterative methods and the conditions for convergence.
Newton Method: Convergence Analysis
Explores the Newton method for root finding and its convergence analysis, including the modified Newton method.
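A minimal sketch of Newton's method for root finding together with the modified variant, here taken to mean the multiplicity-scaled step that restores quadratic convergence at a root of known multiplicity m; the example function and names are illustrative assumptions.

```python
# Newton's method for root finding; m = 1 gives the classical iteration,
# m > 1 is the modified step for a root of known multiplicity m.
def newton(f, fprime, x0, m=1, tol=1e-12, max_iter=50):
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        x -= m * fx / fprime(x)
    return x

# f(x) = (x - 2)^2 has a double root at x = 2: classical Newton converges
# only linearly there, while the modified method (m = 2) lands in one step.
f, fp = lambda x: (x - 2)**2, lambda x: 2 * (x - 2)
print(newton(f, fp, x0=5.0))        # slow, linear convergence
print(newton(f, fp, x0=5.0, m=2))   # exact: jumps straight to 2.0
```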
Lipschitz continuous Hessian and Newton's method
Explores the convergence of Newton's method and the CG algorithm for solving linear equations.
Optimization Techniques: Gradient Method Overview
Discusses the gradient method for optimization, focusing on its application in machine learning and the conditions for convergence.
Optimization Methods in Machine Learning
Explores optimization methods in machine learning, emphasizing gradients, costs, and computational efforts for efficient model training.
Coordinate Descent: Optimization Strategies
Explores coordinate descent strategies, emphasizing the simplicity of one-coordinate-at-a-time updates and discussing the implications of different update schemes.
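A minimal sketch of cyclic coordinate descent on the quadratic f(x) = 0.5 x^T A x - b^T x, where each one-coordinate update can be solved exactly (and, for this objective, reproduces a Gauss-Seidel sweep); the matrix, sweep order, and names are illustrative assumptions.

```python
# Cyclic coordinate descent on f(x) = 0.5 x^T A x - b^T x.
import numpy as np

def coordinate_descent(A, b, x0=None, sweeps=100, tol=1e-10):
    n = len(b)
    x = np.zeros(n) if x0 is None else np.asarray(x0, dtype=float)
    for _ in range(sweeps):
        x_old = x.copy()
        for i in range(n):   # minimize f exactly in coordinate i, others fixed
            x[i] = (b[i] - A[i] @ x + A[i, i] * x[i]) / A[i, i]
        if np.linalg.norm(x - x_old) < tol:
            break
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(coordinate_descent(A, b))   # same minimizer as solving Ax = b
```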
Richardson Method: Preconditioned Iterative Solvers
Covers the preconditioned Richardson method for solving linear systems iteratively and introduces the gradient method.
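A minimal sketch of the stationary preconditioned Richardson iteration x_{k+1} = x_k + alpha * P^{-1}(b - A x_k), here with a diagonal (Jacobi) preconditioner; the matrix, damping parameter, and names are illustrative assumptions.

```python
# Preconditioned Richardson iteration with a diagonal (Jacobi) preconditioner.
import numpy as np

def richardson(A, b, alpha=1.0, tol=1e-10, max_iter=1000):
    P_inv = 1.0 / np.diag(A)          # diagonal preconditioner, applied entrywise
    x = np.zeros_like(b)
    for _ in range(max_iter):
        r = b - A @ x                 # residual
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            break
        x += alpha * P_inv * r        # preconditioned correction step
    return x

# Converges when the spectral radius of I - alpha * P^{-1} A is below 1.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(richardson(A, b, alpha=0.9))
```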
Optimization without Constraints: Gradient Method
Covers unconstrained optimization using the gradient method to find a function's minimum.
Iterative Methods for Linear Equations
Introduces iterative methods for solving linear equations and discusses the gradient method for minimizing errors.
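A minimal sketch of the gradient (steepest-descent) method applied to a symmetric positive definite linear system, viewed as minimizing phi(x) = 0.5 x^T A x - b^T x so that the residual r = b - Ax is minus the gradient; the matrix and names are illustrative assumptions.

```python
# Gradient (steepest-descent) method for an SPD system Ax = b.
import numpy as np

def gradient_method(A, b, tol=1e-10, max_iter=10000):
    x = np.zeros_like(b)
    for _ in range(max_iter):
        r = b - A @ x                     # r = -grad phi(x)
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            break
        alpha = (r @ r) / (r @ (A @ r))   # exact line search along r
        x += alpha * r
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(gradient_method(A, b))              # ~ [0.0909, 0.6364]
```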