Lecture

Primal-dual Optimization: Extra-Gradient Method

Description

This lecture covers Primal-Dual Optimization II, focusing on the extra-gradient method alongside simultaneous gradient descent-ascent (SimGDA) and alternating gradient descent-ascent (AltGDA). It examines nonconvex-concave problems, convergence rates, and practical performance. The lecture also discusses, in an epilogue, practical implications and the complexity of constrained min-max optimization. Several algorithms are explored together with their convergence properties, including the proximal point method, extra-gradient, and optimistic gradient descent-ascent. The lecture concludes with a discussion of the convergence of algorithms for smooth convex-concave minimax optimization and the challenges of nonsmooth, nonconvex optimization.
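To illustrate the extra-gradient idea mentioned above, here is a minimal sketch on the bilinear toy problem f(x, y) = x * y, whose saddle point is (0, 0). The function, step size, and iteration count are illustrative assumptions, not taken from the lecture; the point is only that the look-ahead (extrapolation) step lets the iterates converge where plain simultaneous GDA would spiral away.

```python
# Hedged sketch of the extra-gradient (EG) method on the bilinear
# min-max problem f(x, y) = x * y (saddle point at the origin).
# The step size eta and iteration count are illustrative choices.

def extra_gradient(x, y, eta=0.1, steps=2000):
    for _ in range(steps):
        # Extrapolation (look-ahead) step using gradients at (x, y):
        # grad_x f(x, y) = y, grad_y f(x, y) = x.
        x_half = x - eta * y
        y_half = y + eta * x
        # Update step uses gradients evaluated at the look-ahead point.
        x, y = x - eta * y_half, y + eta * x_half
    return x, y

x_final, y_final = extra_gradient(1.0, 1.0)
```

On this bilinear problem each EG step contracts the squared distance to the saddle point by a fixed factor, so (x_final, y_final) ends up near (0, 0), whereas simultaneous GDA on the same problem moves strictly away from it.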

About this result
This page is automatically generated and may contain information that is not correct, complete, up-to-date, or relevant to your search query. The same applies to every other page on this website. Please make sure to verify the information with EPFL's official sources.