Lecture

Composite Convex Minimization

Related lectures (47)
From Stochastic Gradient Descent to Non-Smooth Optimization
Covers stochastic optimization, sparsity, and non-smooth minimization via subgradient descent.
Convex Functions
Covers the properties and operations of convex functions.
Introduction to Quantum Chaos
Introduces quantum chaos, covering classical chaos, sensitivity to initial conditions, ergodicity, and Lyapunov exponents.
Algorithms for Composite Optimization
Explores algorithms for composite optimization, including proximal operators and gradient methods, with examples and theoretical bounds.
Double Descent Curves: Overparametrization
Explores double descent curves and overparametrization in machine learning models, highlighting the risks and benefits.
Taylor's Formula: Developments and Extrema
Covers Taylor's formula, developments, and extrema of functions, discussing convexity and concavity.
Exploration Bias
Explores regularization, learning algorithms, and subgaussian assumptions in machine learning.
Taylor Polynomials: Correction Exam 2018
Covers the correction of an exam from 2018, focusing on Taylor polynomials and Jacobian matrices.
Derivatives and Convexity
Explores derivatives, local extrema, and convexity in functions, including Taylor's formula and function compositions.
Primal-Dual Optimization: Theory and Computation
Explores primal-dual optimization, conjugation of functions, strong duality, and quadratic penalty methods in data mathematics.
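Several of the lectures above concern proximal operators and gradient methods for composite objectives of the form f(x) + g(x), with f smooth and g nonsmooth. As a minimal illustration (not material from any specific lecture), here is a sketch of proximal gradient descent (ISTA) applied to the lasso, assuming the standard soft-thresholding formula for the proximal operator of the l1 norm:

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def ista(A, b, lam, steps=500):
    # Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by proximal gradient:
    # a gradient step on the smooth part, then a prox step on the
    # nonsmooth part, with step size 1/L.
    L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        grad = A.T @ (A @ x - b)
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Small synthetic example: recover a sparse vector from noiseless data.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[:3] = [2.0, -1.5, 1.0]
b = A @ x_true
x_hat = ista(A, b, lam=0.1)
```

The gradient step handles only the smooth least-squares term, while the nonsmooth l1 penalty enters solely through its proximal operator; this split is what makes the method suitable for composite convex minimization.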
