Gradient Descent: Lipschitz Continuity
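The fact this lecture is built on: if the gradient of f is L-Lipschitz, meaning ‖∇f(x) − ∇f(y)‖ ≤ L‖x − y‖ for all x and y, then gradient descent with the constant step size 1/L decreases f at every iteration. A minimal Python sketch, where the quadratic objective, its Lipschitz constant, and the starting point are all chosen purely for illustration:

import numpy as np

A = np.array([[3.0, 1.0], [1.0, 2.0]])   # symmetric positive definite
f = lambda x: 0.5 * x @ A @ x
grad = lambda x: A @ x
L = np.linalg.eigvalsh(A).max()          # Lipschitz constant of grad f

x = np.array([4.0, -3.0])
for _ in range(100):
    x = x - (1.0 / L) * grad(x)          # the 1/L step guarantees descent

print(f(x))                              # approaches the minimum value 0

For a quadratic 0.5·xᵀAx the gradient is Ax, whose Lipschitz constant is the largest eigenvalue of A; that is what eigvalsh computes above.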
Related lectures (26)
Optimization Techniques: Gradient Descent and Convex Functions
Provides an overview of optimization techniques, focusing on gradient descent and properties of convex functions in machine learning.
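For reference, the two standard inequalities a lecture like this combines (textbook facts, not taken from the lecture page): the first-order characterization of convexity and the quadratic upper bound that an L-Lipschitz gradient provides:

\[
f(y) \ge f(x) + \nabla f(x)^\top (y - x),
\qquad
f(y) \le f(x) + \nabla f(x)^\top (y - x) + \frac{L}{2}\,\lVert y - x \rVert^2 .
\]

Together they trap f between a tangent plane and a parabola, which is exactly what the 1/L step-size analysis exploits.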
Optimization Methods
Covers optimization methods without constraints, including gradient and line search in the quadratic case.
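In the quadratic case f(x) = 0.5·xᵀAx − bᵀx with A symmetric positive definite, the line search along the steepest-descent direction has a closed form. A sketch, with A and b invented for illustration:

import numpy as np

# Steepest descent with exact line search on f(x) = 0.5 x^T A x - b^T x
A = np.array([[4.0, 1.0], [1.0, 3.0]])   # symmetric positive definite
b = np.array([1.0, 2.0])

x = np.zeros(2)
for _ in range(50):
    r = b - A @ x                  # residual = negative gradient
    alpha = (r @ r) / (r @ A @ r)  # exact minimizer of f along r
    x = x + alpha * r

print(x, np.linalg.solve(A, b))    # the iterate matches A^{-1} b

The step alpha = rᵀr / rᵀAr falls out of setting the derivative of f(x + αr) with respect to α to zero.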
Gradient Descent: Principles and Applications
Covers gradient descent, its principles, applications, and convergence rates in optimization for machine learning.
Proximal and Subgradient Descent: Optimization Techniques
Discusses proximal and subgradient descent methods for optimization in machine learning.
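The concrete example behind both methods is the ℓ1 norm: its proximal operator is elementwise soft-thresholding, and sign(x) is a valid subgradient (any value in [−1, 1] is allowed at zero). A minimal sketch assuming nothing beyond NumPy:

import numpy as np

def prox_l1(v, t):
    # Proximal operator of t * ||.||_1: elementwise soft-thresholding
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def subgrad_l1(x):
    # One valid subgradient of ||.||_1 (chooses 0 at the kink)
    return np.sign(x)

v = np.array([2.5, -0.3, 0.0, 1.1])
print(prox_l1(v, 1.0))   # [ 1.5 -0.   0.   0.1]: small entries are zeroed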
Faster Gradient Descent: Projected Optimization Techniques
Covers faster gradient descent methods and projected gradient descent for constrained optimization in machine learning.
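Projected gradient descent adds a single step to the plain method: after each gradient step, project back onto the constraint set. A sketch with a Euclidean-ball constraint, where the objective, step size, and radius are illustrative:

import numpy as np

def project_ball(x, radius=1.0):
    # Euclidean projection onto { x : ||x|| <= radius }
    n = np.linalg.norm(x)
    return x if n <= radius else x * (radius / n)

# min 0.5 x^T A x - b^T x  subject to  ||x|| <= 1
A = np.array([[2.0, 0.0], [0.0, 1.0]])
b = np.array([3.0, 0.0])
grad = lambda x: A @ x - b

x = np.zeros(2)
for _ in range(100):
    x = project_ball(x - 0.4 * grad(x))   # gradient step, then projection

print(x)   # settles at the constrained minimizer (1, 0) on the boundary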
Gradient Descent Convergence
Explains how gradient descent converges to a function's minimum at a rate of O(1/k).
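The rate in question is the standard bound for convex f with an L-Lipschitz gradient, run with step size 1/L from a start x_0:

\[
f(x_k) - f(x^\ast) \le \frac{L \,\lVert x_0 - x^\ast \rVert^2}{2k},
\]

so the suboptimality gap shrinks like 1/k: halving it costs twice as many iterations.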
Quasi-Newton Optimization
Covers gradient and line search methods in quasi-Newton optimization, with an emphasis on the Wolfe conditions and positive definiteness.
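A sketch of the two (weak) Wolfe conditions as a step-acceptance test; c1 and c2 are the conventional default constants, and the objective and test point are invented for illustration:

import numpy as np

def wolfe_conditions(f, grad, x, p, alpha, c1=1e-4, c2=0.9):
    g = grad(x)
    armijo = f(x + alpha * p) <= f(x) + c1 * alpha * (g @ p)   # sufficient decrease
    curvature = grad(x + alpha * p) @ p >= c2 * (g @ p)        # curvature condition
    return armijo and curvature

f = lambda x: x @ x                       # illustrative objective
grad = lambda x: 2 * x
x = np.array([1.0, 1.0])
p = -grad(x)                              # a descent direction
print(wolfe_conditions(f, grad, x, p, 0.25))   # True for this step length

The curvature condition is what ties in positive definiteness: it guarantees sᵀy > 0 for the update pair, which keeps a BFGS Hessian approximation positive definite.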
Newton Method: Convergence and Quadratic Case
Covers the Newton method and its convergence properties near the optimal point.
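A sketch of the Newton iteration itself, solving against the Hessian rather than forming its inverse; the objective is invented and chosen to have its minimizer at the origin:

import numpy as np

f = lambda x: np.log(np.exp(x[0]) + np.exp(-x[0])) + x[1] ** 2

def grad(x):
    return np.array([np.tanh(x[0]), 2 * x[1]])

def hess(x):
    return np.array([[1 - np.tanh(x[0]) ** 2, 0.0], [0.0, 2.0]])

x = np.array([1.0, 1.0])
for _ in range(6):
    x = x - np.linalg.solve(hess(x), grad(x))   # Newton step

print(x)   # a handful of steps suffices: quadratic convergence near (0, 0)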
Stochastic Gradient Descent: Non-convex Optimization Techniques
Discusses Stochastic Gradient Descent and its application in non-convex optimization, focusing on convergence rates and challenges in machine learning.
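A minimal SGD loop, shown on a synthetic least-squares problem for concreteness; in the non-convex setting the loop is unchanged, only the guarantee weakens from reaching the minimum to approaching stationary points:

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))            # synthetic data, for illustration
w_true = rng.normal(size=5)
y = X @ w_true + 0.1 * rng.normal(size=1000)

w = np.zeros(5)
lr = 0.05                                 # constant step size
for step in range(2000):
    i = rng.integers(len(y))              # sample one example uniformly
    g = (X[i] @ w - y[i]) * X[i]          # stochastic gradient of 0.5 (x_i^T w - y_i)^2
    w -= lr * g

print(np.linalg.norm(w - w_true))         # small: w hovers near w_true, up to noise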
Proximal Gradient Descent: Optimization Techniques in Machine Learning
Discusses proximal gradient descent and its applications in optimizing machine learning algorithms.
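The flagship ML application of proximal gradient descent is the lasso, where the smooth part is a least-squares loss and the prox step is soft-thresholding; this instance is known as ISTA. A sketch on synthetic data, with the regularization weight picked for illustration:

import numpy as np

def prox_l1(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

# ISTA for the lasso:  min_w 0.5 ||X w - y||^2 + lam ||w||_1
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 10))
w_sparse = np.zeros(10)
w_sparse[:3] = [2.0, -1.0, 0.5]           # ground truth: only 3 active weights
y = X @ w_sparse + 0.01 * rng.normal(size=200)

lam = 5.0
L = np.linalg.norm(X, ord=2) ** 2         # Lipschitz constant of the smooth gradient
w = np.zeros(10)
for _ in range(500):
    g = X.T @ (X @ w - y)                 # gradient of the least-squares part
    w = prox_l1(w - g / L, lam / L)       # gradient step, then prox step

print(np.round(w, 2))                     # sparse: the inactive weights land at exactly 0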