Lecture

Optimization Methods: Convergence and Trade-offs

Description

This lecture covers first-order optimization methods, in particular the Conditional Gradient Method (CGM, also known as Frank-Wolfe) and the Proximal Gradient method, focusing on convergence guarantees and trade-offs. It discusses CGM for strongly convex objectives, its convergence guarantees and faster rates under additional assumptions, and examples such as the nuclear-norm ball and phase retrieval. The lecture also compares Proximal Gradient with Frank-Wolfe, presents a basic constrained non-convex problem, and explores phase retrieval and matrix completion. It examines the role of convexity in optimization, phase retrieval recast as a convex matrix completion problem, and the challenges of estimation and prediction. The lecture concludes with a discussion of time-data trade-offs, the statistical dimension, and variance reduction techniques such as the Stochastic Variance Reduced Gradient (SVRG) method.
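To make the CGM step concrete, here is a minimal Python sketch of Frank-Wolfe over the nuclear-norm ball, run on a toy matrix-completion objective; the function names and the toy instance are illustrative assumptions, not code from the lecture.

```python
import numpy as np

def cgm_nuclear_ball(grad, X0, tau, iters=200):
    """Conditional Gradient (Frank-Wolfe) for min f(X) s.t. ||X||_* <= tau.

    grad: callable returning the gradient of f at X.
    The linear minimization oracle over the nuclear-norm ball is a rank-one
    matrix built from the top singular pair of the gradient, so the iterate
    after k steps is a sum of at most k rank-one terms.
    """
    X = X0
    for k in range(iters):
        G = grad(X)
        # LMO: argmin_{||S||_* <= tau} <G, S> = -tau * u1 v1^T
        # (only the top singular pair is needed; a full SVD is used here for simplicity)
        U, s, Vt = np.linalg.svd(G, full_matrices=False)
        S = -tau * np.outer(U[:, 0], Vt[0, :])
        gamma = 2.0 / (k + 2.0)          # standard step size, gives O(1/k) rate
        X = X + gamma * (S - X)
    return X

# Toy matrix-completion instance (an assumption for illustration):
# observe a random subset of entries of a rank-5 matrix M.
rng = np.random.default_rng(0)
M = rng.standard_normal((30, 5)) @ rng.standard_normal((5, 30))
mask = rng.random(M.shape) < 0.5
grad = lambda X: mask * (X - M)   # gradient of 0.5 * ||P_Omega(X - M)||_F^2
X_hat = cgm_nuclear_ball(grad, np.zeros_like(M), tau=np.linalg.norm(M, 'nuc'))
```

One trade-off behind the Proximal Gradient vs. Frank-Wolfe comparison is already visible here: the Frank-Wolfe oracle needs only the top singular pair of the gradient, whereas projecting onto the nuclear-norm ball, as a proximal step would, requires thresholding the full spectrum.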
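For the Proximal Gradient side of that comparison, below is a sketch of ISTA on an l1-regularized least-squares problem, where the prox step reduces to soft-thresholding. The lasso setup is a standard stand-in chosen for illustration (the lecture's own examples are the nuclear-norm ball and phase retrieval), so all names and parameters here are assumptions.

```python
import numpy as np

def ista(A, b, lam, eta, iters=500):
    """Proximal gradient (ISTA) for min 0.5*||Aw - b||^2 + lam*||w||_1.

    Alternates a gradient step on the smooth part with the prox of the
    l1 penalty (soft-thresholding); unlike Frank-Wolfe, the iterate is
    kept feasible/regularized by the prox operator rather than by an LMO.
    """
    w = np.zeros(A.shape[1])
    for _ in range(iters):
        g = A.T @ (A @ w - b)                 # gradient of the smooth part
        z = w - eta * g
        w = np.sign(z) * np.maximum(np.abs(z) - eta * lam, 0.0)  # prox step
    return w

# Toy sparse-recovery instance (an assumption for illustration).
rng = np.random.default_rng(2)
A = rng.standard_normal((100, 50))
w_true = np.zeros(50)
w_true[:5] = 1.0
b = A @ w_true
w_hat = ista(A, b, lam=0.1, eta=1.0 / np.linalg.norm(A, 2) ** 2)  # eta = 1/L
```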
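Finally, a minimal sketch of SVRG as summarized in the description: each epoch anchors the stochastic gradients to a full gradient computed at a snapshot, which removes the variance floor of plain SGD near the optimum. The toy least-squares instance and the step size are assumptions.

```python
import numpy as np

def svrg(grad_i, n, w0, eta, epochs=20, m=None):
    """Stochastic Variance Reduced Gradient for min (1/n) * sum_i f_i(w).

    grad_i(w, i): gradient of the i-th component at w.
    Each epoch stores a snapshot and its full gradient mu, then runs m
    inner steps whose stochastic gradients are corrected by the snapshot:
    v = grad_i(w) - grad_i(w_snap) + mu, an unbiased, variance-reduced estimate.
    """
    m = m or 2 * n
    w = w0.copy()
    rng = np.random.default_rng(0)
    for _ in range(epochs):
        w_snap = w.copy()
        mu = np.mean([grad_i(w_snap, i) for i in range(n)], axis=0)  # full gradient
        for _ in range(m):
            i = rng.integers(n)
            v = grad_i(w, i) - grad_i(w_snap, i) + mu
            w = w - eta * v
    return w

# Toy least-squares instance (an assumption): f_i(w) = 0.5 * (a_i^T w - b_i)^2.
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 10))
b = A @ rng.standard_normal(10)
grad_i = lambda w, i: (A[i] @ w - b[i]) * A[i]
w_hat = svrg(grad_i, n=200, w0=np.zeros(10), eta=0.01)
```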
