Descent Methods and Newton Method with Line Search
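The combination named in the title can be sketched in a few lines. The following is an illustrative sketch only (the objective function, parameter values, and fallback rule are assumptions, not taken from the lecture): a scalar Newton step whose length is chosen by backtracking until the Armijo sufficient-decrease condition holds, with a steepest-descent fallback when the curvature is not positive.

```python
def newton_line_search(f, grad, hess, x0, tol=1e-10, max_iter=50):
    """Scalar Newton's method with backtracking (Armijo) line search.

    Illustrative sketch: parameter choices (alpha, beta) are common
    textbook defaults, not values from the lecture.
    """
    x = x0
    for _ in range(max_iter):
        g = grad(x)
        if abs(g) < tol:
            break
        h = hess(x)
        # Newton direction; fall back to steepest descent when the
        # curvature is not positive, so d is always a descent direction.
        d = -g / h if h > 0 else -g
        # Backtracking: shrink the step until the Armijo
        # sufficient-decrease condition f(x + t d) <= f(x) + alpha t g d holds.
        t, alpha, beta = 1.0, 1e-4, 0.5
        while f(x + t * d) > f(x) + alpha * t * g * d:
            t *= beta
        x += t * d
    return x

# Hypothetical example: minimise f(x) = x^4 - 3x^3 + 2,
# whose local minimiser is x = 9/4.
f = lambda x: x**4 - 3 * x**3 + 2
grad = lambda x: 4 * x**3 - 9 * x**2
hess = lambda x: 12 * x**2 - 18 * x
x_star = newton_line_search(f, grad, hess, x0=3.0)
```

Far from the minimiser the line search guards against overshooting; close to it the full Newton step (t = 1) is accepted and convergence becomes quadratic.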
Related lectures (27)
Linear Systems: Iterative Methods
Explores linear systems and iterative methods like gradient descent and conjugate gradient for efficient solutions.
Iterative Methods for Linear Equations
Covers iterative methods for solving linear equations and analyzing convergence, including error control and positive definite matrices.
Gradient Descent
Covers the concept of gradient descent in scalar cases, focusing on finding the minimum of a function by iteratively moving in the direction of the negative gradient.
Quasi-Newton Methods
Introduces Quasi-Newton methods for optimization, explaining their advantages over traditional approaches like Gradient Descent and Newton's Method.
Newton's Method: Optimization & Indefiniteness
Covers Newton's Method for optimization and discusses the caveats of indefiniteness in optimization problems.
Coordinate Descent: Efficient Optimization Techniques
Covers coordinate descent, a method for optimizing functions by updating one coordinate at a time.
Newton's Method: Optimization Techniques
Explores optimization techniques like gradient descent, line search, and Newton's method for efficient problem-solving.
Newton Method: Convergence and Quadratic Case
Covers the Newton method and its convergence properties near the optimal point.
Linear Systems: Convergence and Methods
Explores linear systems, convergence, and solving methods with a focus on CPU time and memory requirements.
Optimality of Convergence Rates: Accelerated Gradient Descent
Explores the optimality of convergence rates in convex optimization, focusing on accelerated gradient descent and adaptive methods.
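The accelerated method mentioned in the last entry can be sketched compactly. The following is a minimal illustration of a Nesterov-style accelerated gradient iteration, assuming only L-smoothness and convexity; the objective and the deliberately pessimistic smoothness estimate are hypothetical choices, not taken from the lecture.

```python
def accelerated_gradient(grad, x0, L, n_iter=500):
    """Nesterov-style accelerated gradient descent for an L-smooth convex
    function, which attains the optimal O(1/k^2) rate for first-order methods.

    Illustrative sketch using the standard momentum schedule
    t_{k+1} = (1 + sqrt(1 + 4 t_k^2)) / 2.
    """
    x = y = x0
    t = 1.0
    for _ in range(n_iter):
        x_new = y - grad(y) / L  # gradient step from the extrapolated point
        t_new = (1.0 + (1.0 + 4.0 * t * t) ** 0.5) / 2.0
        # Momentum extrapolation: look ahead along the previous displacement.
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x

# Hypothetical example: minimise f(x) = 0.5 * x^2 (true smoothness L = 1)
# with the pessimistic estimate L = 10, so progress takes many iterations.
x_star = accelerated_gradient(lambda x: x, x0=5.0, L=10.0)
```

Compared with plain gradient descent, the only change is the extrapolated point y from which each gradient step is taken; this momentum term is what improves the worst-case rate from O(1/k) to O(1/k^2).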