Quasi-Newton Methods
Related lectures (26)
Gradient Descent: Optimization
Explains gradient descent for optimization and how to find the direction toward the solution by minimizing distances (a minimal gradient-descent sketch follows this list).
Optimality of Convergence Rates: Accelerated Gradient Descent
Explores the optimality of convergence rates in convex optimization, focusing on accelerated gradient descent and adaptive methods.
Root Finding Methods: Secant and Newton's Methods
Covers numerical methods for root finding, focusing on the secant and Newton's methods (see the secant sketch after this list).
Convex Optimization: Gradient Algorithms
Covers convex optimization problems and gradient-based algorithms to find the global minimum.
Optimality Conditions: Unconstrained
Covers Fermat's theorem, necessary optimality conditions, convexity, and eigenvalue curvature in optimization.
Newton's Method: Optimization & Indefiniteness
Covers Newton's method for optimization and discusses the caveats of indefiniteness in optimization problems (see the safeguarded Newton sketch after this list).
Nonlinear Optimization
Covers line search, Newton's method, BFGS, and conjugate gradient in nonlinear optimization (a BFGS sketch appears after this list).
Gradient Descent: Principles and Applications
Covers gradient descent, its principles, applications, and convergence rates in optimization for machine learning.
Root Finding Methods: Bisection and Secant Techniques
Covers root-finding methods, focusing on the bisection and secant techniques, their implementations, and comparisons of their convergence rates.
Gradient Descent Methods: Theory and Computation
Explores gradient descent methods for smooth convex and non-convex problems, covering iterative strategies, convergence rates, and challenges in optimization.
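Several of the lectures above center on gradient descent, so a minimal sketch may help fix ideas. This is an illustrative implementation, not code from any of the lectures: the quadratic objective, the step size 1/L (with L the largest eigenvalue of A, valid for an L-smooth objective), and the tolerance are all assumptions chosen for the example.

```python
# Minimal gradient-descent sketch (illustrative; not from the lectures above).
import numpy as np

def gradient_descent(grad, x0, step, tol=1e-8, max_iter=1000):
    x = x0.astype(float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:    # first-order optimality check
            break
        x = x - step * g               # move against the gradient
    return x

# Example: f(x) = 0.5 x^T A x - b^T x with A symmetric positive definite,
# so grad f(x) = A x - b and the smoothness constant L is A's top eigenvalue.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
L = np.linalg.eigvalsh(A).max()
x_star = gradient_descent(lambda x: A @ x - b, np.zeros(2), step=1.0 / L)
print(x_star, np.linalg.solve(A, b))   # the two should agree
```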
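For the root-finding lectures, here is a secant-method sketch: it replaces Newton's derivative with the slope of the chord through the last two iterates. The test function and starting points are illustrative assumptions.

```python
# Minimal secant-method sketch for root finding (illustrative test problem).
def secant(f, x0, x1, tol=1e-12, max_iter=50):
    f0, f1 = f(x0), f(x1)
    for _ in range(max_iter):
        if abs(f1) < tol:
            break
        if f1 == f0:                   # flat chord: no secant step possible
            break
        # Intersect the chord through (x0, f0), (x1, f1) with y = 0.
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)
        x0, f0, x1, f1 = x1, f1, x2, f(x2)
    return x1

print(secant(lambda x: x**2 - 2.0, 1.0, 2.0))  # ~1.41421356, i.e. sqrt(2)
```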
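For Newton's method with indefiniteness, a sketch with two common safeguards: shift an indefinite Hessian until it is positive definite, and backtrack on the step length. The nonconvex test function, the eigenvalue-shift rule, and the Armijo constant are illustrative assumptions, not details from the lecture.

```python
# Minimal safeguarded Newton sketch for minimization (illustrative).
import numpy as np

def newton_min(f, grad, hess, x0, tol=1e-10, max_iter=100):
    x = x0.astype(float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        H = hess(x)
        # Indefiniteness safeguard: shift eigenvalues up until H is PD.
        lam_min = np.linalg.eigvalsh(H).min()
        if lam_min <= 0.0:
            H = H + (1.0 - lam_min) * np.eye(x.size)
        p = np.linalg.solve(H, -g)                        # modified Newton direction
        t = 1.0
        while f(x + t * p) > f(x) + 1e-4 * t * (g @ p):   # Armijo backtracking
            t *= 0.5
        x = x + t * p
    return x

# Nonconvex test: f(x, y) = x^4 - 2 x^2 + y^2, with minima at (+/-1, 0);
# at the start point the Hessian is indefinite, so the shift kicks in.
f = lambda x: x[0]**4 - 2 * x[0]**2 + x[1]**2
grad = lambda x: np.array([4 * x[0]**3 - 4 * x[0], 2 * x[1]])
hess = lambda x: np.array([[12 * x[0]**2 - 4, 0.0], [0.0, 2.0]])
print(newton_min(f, grad, hess, np.array([0.5, 1.0])))    # ~ (1, 0)
```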
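Finally, since this page's topic is quasi-Newton methods, a BFGS sketch: the inverse-Hessian approximation is updated from gradient differences, so no second derivatives are needed. The toy objective, line-search constants, and curvature threshold are illustrative assumptions, not the formulation used in any lecture above.

```python
# Minimal BFGS quasi-Newton sketch (illustrative objective and constants).
import numpy as np

def f(x):
    return (x[0] - 1.0) ** 2 + 10.0 * (x[1] + 2.0) ** 2   # toy quadratic

def grad_f(x):
    return np.array([2.0 * (x[0] - 1.0), 20.0 * (x[1] + 2.0)])

def bfgs(x, tol=1e-8, max_iter=100):
    n = x.size
    H = np.eye(n)                       # inverse-Hessian approximation
    g = grad_f(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                      # quasi-Newton search direction
        t = 1.0
        while f(x + t * p) > f(x) + 1e-4 * t * (g @ p):   # Armijo backtracking
            t *= 0.5
        x_new = x + t * p
        g_new = grad_f(x_new)
        s, y = x_new - x, g_new - g     # step and gradient change
        sy = s @ y
        if sy > 1e-12:                  # curvature condition keeps H positive definite
            rho = 1.0 / sy
            I = np.eye(n)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)  # standard BFGS inverse-Hessian update
        x, g = x_new, g_new
    return x

print(bfgs(np.zeros(2)))                # converges to (1, -2)
```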