Optimality of Convergence Rates: Accelerated Gradient Descent
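The page carries no lecture notes, but the topic, accelerated gradient descent in the style of Nesterov, can be illustrated with a minimal sketch. The step size `lr`, the iteration count, and the quadratic test function are illustrative assumptions, not taken from the lecture.

```python
import numpy as np

def accelerated_gradient_descent(grad, x0, lr, n_iters):
    """Nesterov-style accelerated gradient method (constant-step sketch)."""
    x = y = np.asarray(x0, dtype=float)
    t = 1.0
    for _ in range(n_iters):
        x_next = y - lr * grad(y)                        # gradient step at the look-ahead point
        t_next = (1 + np.sqrt(1 + 4 * t**2)) / 2
        y = x_next + ((t - 1) / t_next) * (x_next - x)   # momentum extrapolation
        x, t = x_next, t_next
    return x

# Illustrative example: minimize f(x) = ||x||^2 / 2, whose gradient is x.
x_star = accelerated_gradient_descent(lambda x: x, x0=[5.0, -3.0], lr=0.5, n_iters=50)
```

On smooth convex problems this scheme attains the O(1/k²) rate that the lecture title refers to as optimal for first-order methods.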
Related lectures (27)
Optimal Transport: Gradient Flows in ℝᵈ
Explores optimal transport and gradient flows in ℝᵈ, emphasizing convergence and the role of the Lipschitz and Picard–Lindelöf theorems.
Gradient Descent: Linear Regression
Covers the concept of gradient descent for linear regression, explaining the iterative process of updating parameters.
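The iterative parameter-update process that this lecture covers can be sketched in a few lines. The learning rate, iteration count, and synthetic data below are illustrative assumptions.

```python
import numpy as np

def gd_linear_regression(X, y, lr=0.1, n_iters=500):
    """Fit w minimizing the least-squares loss (1/2n)||Xw - y||^2 by gradient descent."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iters):
        grad = X.T @ (X @ w - y) / n   # gradient of the least-squares loss
        w -= lr * grad                 # iterative parameter update
    return w

# Illustrative example: recover w = [2, -1] from noiseless synthetic data.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
w_true = np.array([2.0, -1.0])
y = X @ w_true
w_hat = gd_linear_regression(X, y)
```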
Optimization Techniques: Gradient Descent and Convex Functions
Provides an overview of optimization techniques, focusing on gradient descent and properties of convex functions in machine learning.
Optimization without Constraints: Gradient Method
Covers unconstrained optimization, using the gradient method to find a function's minimum.
Convex Optimization: Gradient Algorithms
Covers convex optimization problems and gradient-based algorithms to find the global minimum.
Newton's Method: Optimization & Indefiniteness
Covers Newton's Method for optimization and discusses the caveats of indefiniteness in optimization problems.
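Newton's method, together with the indefiniteness caveat this lecture mentions, can be illustrated with a minimal sketch. The test function and iteration count are assumptions chosen for demonstration.

```python
import numpy as np

def newton_minimize(grad, hess, x0, n_iters=20):
    """Newton's method for unconstrained minimization (sketch).
    Assumes the Hessian stays positive definite along the iterates;
    if it is indefinite, the Newton step need not be a descent direction."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        x = x - np.linalg.solve(hess(x), grad(x))  # Newton step
    return x

# Illustrative example: minimize f(x) = x1^4 + x2^2, minimized at the origin.
grad = lambda x: np.array([4 * x[0]**3, 2 * x[1]])
hess = lambda x: np.array([[12 * x[0]**2, 0.0], [0.0, 2.0]])
x_star = newton_minimize(grad, hess, [1.0, 1.0])
```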
Information Theory: Channel Capacity and Convex Functions
Explores channel capacity and convex functions in information theory, emphasizing the importance of convexity.