Lipschitz Gradient Theorem
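For context, the result usually referred to by this name is the descent lemma; the following is a sketch of the standard formulation (the lecture's exact statement may differ): if $f$ is differentiable and $\nabla f$ is $L$-Lipschitz, then for all $x, y$,

```latex
% Descent lemma for an L-Lipschitz gradient
f(y) \le f(x) + \langle \nabla f(x),\, y - x \rangle + \frac{L}{2}\,\|y - x\|^2
```

This quadratic upper bound is what justifies the fixed step size $1/L$ in the gradient-descent lectures listed below.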
Related lectures (28)
Page 1 of 3
Calculus of Variations: Gradient Young Theorem
Covers the Gradient Young Theorem in the calculus of variations, discussing proofs and applications.
Optimization Methods: Convergence and Trade-offs
Covers optimization methods, convergence guarantees, trade-offs, and variance reduction techniques in numerical optimization.
Extreme Values and Constraints
Explores extreme values, constraints, Riemann's integral interpretation, and volume calculations of parallelepipeds in mathematics.
The Cauchy Theorem for Sequences
Explores the Cauchy theorem for sequences, highlighting the importance of convergence and boundedness.
Non-Convex Optimization: Techniques and Applications
Covers non-convex optimization techniques and their applications in machine learning.
Deep Learning Building Blocks
Covers tensors, loss functions, autograd, and convolutional layers in deep learning.
Local Extrema of Functions
Discusses local extrema of functions in two variables around the point (0,0).
Implicit Functions Theorem
Covers the Implicit Functions Theorem, explaining how equations can define functions implicitly.
Gradient Descent Methods: Theory and Computation
Explores gradient descent methods for smooth convex and non-convex problems, covering iterative strategies, convergence rates, and challenges in optimization.
Primal-dual Optimization: Extra-Gradient Method
Explores the Extra-Gradient method for Primal-dual optimization, covering nonconvex-concave problems, convergence rates, and practical performance.
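Several of the lectures above concern gradient descent under a Lipschitz-gradient assumption. As a minimal sketch (not taken from any listed lecture), the classic fixed step size 1/L can be illustrated on a hypothetical quadratic objective f(x, y) = x² + 3y², whose gradient (2x, 6y) is Lipschitz with constant L = 6:

```python
# Illustrative sketch: plain gradient descent with the 1/L step size
# that the descent lemma justifies for L-smooth objectives.
# Objective and names here are assumptions, not from the lectures.

def gradient_descent(grad, L, x0, steps=200):
    """Run gradient descent with fixed step size 1/L."""
    x = list(x0)
    for _ in range(steps):
        g = grad(x)
        x = [xi - gi / L for xi, gi in zip(x, g)]
    return x

# f(x, y) = x^2 + 3*y^2, gradient (2x, 6y), Lipschitz constant L = 6
grad = lambda x: [2.0 * x[0], 6.0 * x[1]]
sol = gradient_descent(grad, L=6.0, x0=[4.0, -1.0])
# sol approaches the unique minimizer (0, 0)
```

With the 1/L step, each iteration contracts the error without any line search, which is the simplest convergence guarantee covered in courses of this kind.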