Lecture

Algorithms for Composite Optimization

Description

This lecture covers algorithms for composite convex minimization, focusing on proximal operators, proximal-gradient methods, and linear minimization oracles. It motivates the algorithm design through quadratic majorizers and geometric illustrations, examines proximal-point operators and tractable prox-operators, and derives theoretical convergence bounds for proximal-gradient and fast (accelerated) proximal-gradient schemes, alongside examples of their practical performance.
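As an illustration of the ideas above, here is a minimal sketch (not the lecture's own code) of a proximal-gradient method and its accelerated variant applied to the lasso problem, where the l1-norm has a tractable prox-operator (soft-thresholding). The function names and the specific problem instance are illustrative choices, not taken from the lecture.

```python
import numpy as np

def soft_threshold(v, t):
    # Prox of t * ||.||_1: shrink each coordinate toward zero by t.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(A, b, lam, steps=3000):
    # Minimize F(x) = 0.5*||Ax - b||^2 + lam*||x||_1 by alternating a
    # gradient step on the smooth part with the l1 prox (ISTA).
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        grad = A.T @ (A @ x - b)
        x = soft_threshold(x - grad / L, lam / L)
    return x

def fista(A, b, lam, steps=3000):
    # Fast proximal-gradient scheme: adds a momentum (extrapolation) step,
    # improving the objective-error rate from O(1/k) to O(1/k^2).
    L = np.linalg.norm(A, 2) ** 2
    x = y = np.zeros(A.shape[1])
    t = 1.0
    for _ in range(steps):
        grad = A.T @ (A @ y - b)
        x_next = soft_threshold(y - grad / L, lam / L)
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)
        x, t = x_next, t_next
    return x
```

Both routines rely only on the gradient of the smooth term and the prox of the nonsmooth term, which is exactly the structural assumption behind composite optimization; the accelerated variant trades one extra vector update per iteration for a better convergence rate.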

Related lectures
Primal-dual Optimization: Extra-Gradient Method
Explores the Extra-Gradient method for Primal-dual optimization, covering nonconvex-concave problems, convergence rates, and practical performance.
Primal-dual Optimization: Fundamentals
Explores primal-dual optimization, minimax problems, and gradient descent-ascent methods for optimization algorithms.
Optimality of Convergence Rates: Accelerated Gradient Descent
Explores the optimality of convergence rates in convex optimization, focusing on accelerated gradient descent and adaptive methods.
Optimization Methods: Convergence and Trade-offs
Covers optimization methods, convergence guarantees, trade-offs, and variance reduction techniques in numerical optimization.
Proximal Operators: Optimization Methods
Explores proximal operators, subgradient methods, and composite minimization in optimization.
