This lecture covers Primal-dual optimization II, focusing on the extra-gradient method alongside simultaneous and alternating gradient descent-ascent (SimGDA and AltGDA). It delves into nonconvex-concave problems, convergence rates, and practical performance. The lecture also discusses, in an epilogue, practical implications and the complexity of constrained min-max optimization. Several algorithms, including Proximal Point, Extra-gradient, and Optimistic Gradient Descent Ascent, are explored along with their convergence properties. The lecture concludes with a discussion of convergence guarantees for smooth convex-concave minimax optimization and the challenges posed by nonsmooth, nonconvex problems.
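As a concrete illustration (not taken from the lecture material), the following minimal NumPy sketch contrasts simultaneous GDA with the extra-gradient method on the bilinear toy problem f(x, y) = x·y, whose unique saddle point is the origin: SimGDA spirals away from the saddle point, while the extra-gradient look-ahead step contracts toward it. The step size and iteration count are arbitrary choices for demonstration.

```python
import numpy as np

# Toy bilinear saddle-point problem: min_x max_y f(x, y) = x * y.
# Gradients: df/dx = y, df/dy = x; the unique saddle point is (0, 0).
def grad(x, y):
    return y, x

def sim_gda(x, y, step=0.1, iters=100):
    """Simultaneous GDA: both players update from the same iterate."""
    for _ in range(iters):
        gx, gy = grad(x, y)
        x, y = x - step * gx, y + step * gy
    return x, y

def extra_gradient(x, y, step=0.1, iters=100):
    """Extra-gradient: a look-ahead (extrapolation) step, then an update
    using the gradient evaluated at the look-ahead point."""
    for _ in range(iters):
        gx, gy = grad(x, y)
        x_half, y_half = x - step * gx, y + step * gy      # extrapolation
        gx, gy = grad(x_half, y_half)
        x, y = x - step * gx, y + step * gy                # corrected step
    return x, y

if __name__ == "__main__":
    x0, y0 = 1.0, 1.0
    print("SimGDA distance to saddle:  ", np.hypot(*sim_gda(x0, y0)))         # grows
    print("Extra-gradient distance:    ", np.hypot(*extra_gradient(x0, y0)))  # shrinks
```

On this problem each SimGDA step multiplies the distance to the saddle point by sqrt(1 + step²), whereas the extra-gradient step multiplies it by sqrt(1 - step² + step⁴) < 1 for step sizes in (0, 1), which is the intuition behind its convergence on smooth convex-concave problems.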