Proximal Gradient Descent: Optimization Techniques in Machine Learning
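
As a rough illustration of the algorithm the lecture title refers to, the sketch below applies proximal gradient descent to a Lasso-type objective, minimizing 0.5*||Ax - b||^2 + lam*||x||_1 by alternating a gradient step on the smooth least-squares term with the proximal operator of the l1 term (soft-thresholding). This is a minimal sketch, not the lecture's own material; the function names, the choice of objective, and the step-size rule (step <= 1/L with L the squared spectral norm of A) are illustrative assumptions.

import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding), applied elementwise.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient_lasso(A, b, lam, step, n_iters=500):
    # Minimize 0.5*||Ax - b||^2 + lam*||x||_1 via proximal gradient descent.
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        grad = A.T @ (A @ x - b)                          # gradient of the smooth part
        x = soft_threshold(x - step * grad, step * lam)   # prox step on the l1 part
    return x

# Illustrative usage with random data (hypothetical example, not from the lecture).
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
b = rng.standard_normal(50)
L = np.linalg.norm(A, 2) ** 2   # Lipschitz constant of the least-squares gradient
x_hat = proximal_gradient_lasso(A, b, lam=0.1, step=1.0 / L)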

Related lectures (24)
Convex Functions
Covers the properties and operations of convex functions.
Cones of convex sets
Explores optimization on convex sets, including KKT points and tangent cones.
Convex Optimization
Introduces convex optimization, focusing on the importance of convexity in algorithms and optimization problems.
Optimal Transport: Rockafellar Theorem
Explores the Rockafellar Theorem in optimal transport, focusing on c-cyclical monotonicity and convex functions.
Weak and Strong Duality
Covers weak and strong duality in optimization problems, focusing on Lagrange multipliers and KKT conditions.
Meromorphic Functions & Differentials
Explores meromorphic functions, poles, residues, orders, divisors, and the Riemann-Roch theorem.
Optimization Problems: Path Finding and Portfolio Allocation
Covers optimization problems in path finding and portfolio allocation.
Harmonic Forms and Riemann Surfaces
Explores harmonic forms on Riemann surfaces, covering uniqueness of solutions and the Riemann bilinear identity.
Convex Optimization: Gradient Flow
Explores convex optimization, emphasizing the importance of minimizing functions within a convex set and the significance of continuous processes in studying convergence rates.
Convex Optimization
Introduces the fundamentals of convex optimization, emphasizing the significance of convex functions in simplifying the minimization process.
