Coordinate Descent: Efficient Optimization Techniques
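As an illustrative sketch (not drawn from the lecture material itself), coordinate descent minimizes an objective by updating one coordinate at a time while holding the others fixed. The example below applies exact coordinate-wise minimization to a convex quadratic f(x) = 0.5·xᵀAx − bᵀx; the matrix A, vector b, and iteration count are assumptions chosen purely for illustration.

```python
import numpy as np

def coordinate_descent_quadratic(A, b, num_iters=100):
    """Minimize f(x) = 0.5 * x^T A x - b^T x by exact coordinate-wise updates.

    A is assumed symmetric positive definite with nonzero diagonal entries.
    """
    n = len(b)
    x = np.zeros(n)
    for _ in range(num_iters):
        for i in range(n):
            # Partial derivative w.r.t. x_i is (A x)_i - b_i. Setting it to zero
            # with all other coordinates fixed gives the exact update for x_i.
            residual = b[i] - A[i] @ x + A[i, i] * x[i]
            x[i] = residual / A[i, i]
    return x

# Usage on a small, assumed problem instance.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_est = coordinate_descent_quadratic(A, b)
print(x_est)                   # coordinate descent estimate
print(np.linalg.solve(A, b))   # exact minimizer, for comparison
```

Because each update solves a one-dimensional subproblem exactly, a single sweep costs only O(n) per coordinate for this quadratic, which is the efficiency argument usually made for coordinate methods on large-scale problems.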
Related lectures (24)
Optimality of Convergence Rates: Accelerated Gradient Descent
Explores the optimality of convergence rates in convex optimization, focusing on accelerated gradient descent and adaptive methods.
Newton's Method: Optimization & Indefiniteness
Covers Newton's Method for optimization and discusses the caveats of indefiniteness in optimization problems.
Optimization Techniques: Gradient Method Overview
Discusses the gradient method for optimization, focusing on its application in machine learning and the conditions for convergence.
Stochastic Optimization: Algorithms and Methods
Explores stochastic optimization algorithms and methods for convex problems with smooth and nonsmooth risks.