Gradient Descent: Proximal Operator and Step-Size Strategies
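The title pairs the proximal operator with step-size strategies. As a minimal, illustrative sketch (not the lecture's own material), the following applies proximal gradient descent, x ← prox_{t·g}(x − t∇f(x)), to an assumed L1-regularized least-squares problem with a fixed 1/L step size; every name, matrix, and constant below is an assumption for illustration.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(grad_f, prox, x0, step, n_iters=200):
    """Proximal gradient descent: x <- prox(x - step * grad_f(x), step)."""
    x = x0
    for _ in range(n_iters):
        x = prox(x - step * grad_f(x), step)
    return x

# Illustrative LASSO-type problem: min_x 0.5*||Ax - b||^2 + lam*||x||_1
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
lam = 0.1
grad_f = lambda x: A.T @ (A @ x - b)
step = 1.0 / np.linalg.norm(A, 2) ** 2  # fixed 1/L step, L = ||A||_2^2
x_hat = proximal_gradient(grad_f,
                          lambda v, t: soft_threshold(v, lam * t),
                          np.zeros(5), step)
```

With a convex smooth term whose gradient is L-Lipschitz, the fixed 1/L step guarantees convergence; backtracking line search and accelerated variants are common refinements of this step-size choice.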
Related lectures (29)
Richardson Method: Preconditioned Iterative Solvers
Covers the Richardson method as a preconditioned iterative solver for linear systems and introduces the gradient method (a sketch of the iteration follows this list).
Optimization Methods in Machine Learning
Explores optimization methods in machine learning, emphasizing gradients, costs, and computational efforts for efficient model training.
Linear Systems: Convergence and Methods
Explores linear systems, convergence, and solution methods, with a focus on CPU time and memory requirements.
Gradient Descent
Covers the concept of gradient descent in the scalar case, focusing on finding the minimum of a function by iteratively moving in the direction of the negative gradient (see the scalar sketch after this list).
Proximal Gradient Descent: Optimization Techniques in Machine Learning
Discusses proximal gradient descent and its applications in optimizing machine learning algorithms.
Optimization Methods
Covers unconstrained optimization methods, including gradient methods and line search in the quadratic case.
Convergence Analysis: Stochastic Gradient Algorithms
Explores the convergence analysis of stochastic gradient algorithms under various operational modes and step-size sequences (see the step-size sketch after this list).
Optimality of Convergence Rates: Accelerated Gradient Descent
Explores the optimality of convergence rates in convex optimization, focusing on accelerated gradient descent and adaptive methods.
Linear Systems: Iterative Methods
Explores linear systems and iterative methods such as gradient descent and the conjugate gradient method for efficient solutions.
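For the Richardson entry above, a minimal sketch of the preconditioned iteration x_{k+1} = x_k + α·P⁻¹(b − A·x_k); the matrix, right-hand side, Jacobi preconditioner, and α are illustrative assumptions, not the lecture's example.

```python
import numpy as np

# Preconditioned Richardson iteration: x <- x + alpha * P^{-1} (b - A x)
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0])
P = np.diag(np.diag(A))   # Jacobi (diagonal) preconditioner
alpha = 1.0               # with P = diag(A) and alpha = 1 this is the Jacobi method

x = np.zeros(2)
for _ in range(50):
    r = b - A @ x                       # residual
    x += alpha * np.linalg.solve(P, r)  # preconditioned correction
```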
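For the Gradient Descent entry, a scalar sketch of moving against the derivative; the objective f(x) = (x − 3)² and the step size are made up for illustration.

```python
def gradient_descent(df, x0, step=0.1, n_iters=100):
    """Iteratively move in the direction of the negative gradient."""
    x = x0
    for _ in range(n_iters):
        x -= step * df(x)
    return x

# f(x) = (x - 3)^2, so f'(x) = 2 * (x - 3); the iterates approach x = 3.
x_min = gradient_descent(lambda x: 2.0 * (x - 3.0), x0=0.0)
print(x_min)
```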
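For the stochastic gradient entry, a sketch of SGD on a least-squares objective with the diminishing step-size sequence a_k = a0 / (k + 1), one standard choice in convergence analyses; the data and constants are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((100, 3))
b = rng.standard_normal(100)

x = np.zeros(3)
a0 = 0.5
for k in range(1000):
    i = rng.integers(len(b))          # sample one observation
    g = (A[i] @ x - b[i]) * A[i]      # stochastic gradient of 0.5 * (A[i] @ x - b[i])^2
    x -= (a0 / (k + 1)) * g           # diminishing step size a_k = a0 / (k + 1)
```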