Lecture: Composite Convex Minimization
Related lectures (24)
Gradient Descent Methods: Theory and Computation
Explores gradient descent methods for smooth convex and non-convex problems, covering iterative strategies, convergence rates, and challenges in optimization.
Optimality of Convergence Rates: Accelerated Gradient Descent
Explores the optimality of convergence rates in convex optimization, focusing on accelerated gradient descent and adaptive methods.
Optimal Transport: Gradient Flows in ℝᵈ
Explores optimal transport and gradient flows in ℝᵈ, emphasizing convergence and the role of Lipschitz continuity and the Picard-Lindelöf theorem.
Proximal Operators and Constrained Optimization
Introduces proximal operators, gradient methods, and constrained optimization, exploring their convergence and practical applications.
Proximal Gradient Descent: Optimization Techniques in Machine Learning
Discusses proximal gradient descent and its applications in optimizing machine learning algorithms; a minimal sketch of the iteration appears after this list.
Differentiable Functions and Lagrange Multipliers
Covers differentiable functions, extreme points, and the Lagrange multiplier method for optimization.
Proximal and Subgradient Descent: Optimization Techniques
Discusses proximal and subgradient descent methods for optimization in machine learning.
Implicit Curves: Analysis & Regular Points
Covers implicit curves, regular and critical points, convexity, concavity, and inflection points.
Extrema of Functions
Discusses local extrema, concavity, convexity, and inflection points of functions.
Convergence Criteria: Necessary Conditions
Explains necessary conditions for convergence in optimization problems.
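
Several of the related lectures above (proximal operators, proximal gradient descent) touch on the composite problem min_x f(x) + g(x), where f is smooth and g is nonsmooth but has an easy proximal map. As a rough, self-contained illustration rather than material from any of the listed lectures, the NumPy sketch below applies the proximal gradient iteration x_{k+1} = prox_{γg}(x_k − γ∇f(x_k)) to a Lasso instance; the function names, step-size choice, and toy data are assumptions made for this example.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def proximal_gradient(A, b, lam, step=None, n_iters=500):
    """Minimize F(x) = 0.5*||Ax - b||^2 + lam*||x||_1 by proximal gradient (ISTA).

    The smooth part f(x) = 0.5*||Ax - b||^2 has an L-Lipschitz gradient with
    L = ||A||_2^2, so a fixed step size 1/L is a safe default.
    """
    if step is None:
        L = np.linalg.norm(A, 2) ** 2          # spectral norm squared = Lipschitz constant of grad f
        step = 1.0 / L
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        grad = A.T @ (A @ x - b)               # gradient of the smooth term
        x = soft_threshold(x - step * grad, step * lam)  # prox step on the nonsmooth term
    return x

# Toy usage: recover a sparse vector from random linear measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100))
x_true = np.zeros(100)
x_true[:5] = 1.0
b = A @ x_true
x_hat = proximal_gradient(A, b, lam=0.1)
```

With g = λ||x||_1 the proximal map reduces to coordinate-wise soft-thresholding, which is why each iteration stays as cheap as a gradient step even though the objective is nonsmooth.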