This lecture covers primal-dual optimization methods, focusing on Lagrangian gradient techniques. It delves into the mathematics of optimization for data science, including convex formulations, ε-accurate solutions, and various primal-dual methods. The instructor explains the quadratic penalty and Lagrangian formulations, augmented dual problems, and the linearized augmented Lagrangian method. Examples such as blind image deconvolution, basis pursuit, and neural networks illustrate the concepts. The lecture concludes with convergence guarantees, the augmented Lagrangian CGM, and applications such as k-means clustering and scalable semidefinite programming.
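As a minimal sketch of the augmented Lagrangian idea mentioned above (not the lecture's exact algorithm), consider the linearly constrained problem min_x (1/2)||x||^2 subject to Ax = b. The method alternates a primal minimization of the augmented Lagrangian L_β(x, λ) = (1/2)||x||^2 + λᵀ(Ax − b) + (β/2)||Ax − b||² with a dual ascent step on the multipliers; the data A, b and the penalty parameter β below are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch: augmented Lagrangian (method of multipliers) for
#   min_x (1/2)||x||^2   subject to   Ax = b.
# A, b, and beta are assumed toy values, not from the lecture.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 5))   # underdetermined linear constraints
b = rng.standard_normal(3)
beta = 10.0                       # penalty parameter (assumed)

x = np.zeros(5)
lam = np.zeros(3)
I = np.eye(5)

for _ in range(100):
    # Primal step: minimize L_beta over x. For this quadratic objective the
    # stationarity condition (I + beta A^T A) x = A^T (beta b - lam) is exact.
    x = np.linalg.solve(I + beta * A.T @ A, A.T @ (beta * b - lam))
    # Dual step: gradient ascent on the multipliers with step size beta.
    lam = lam + beta * (A @ x - b)

print(np.linalg.norm(A @ x - b))  # feasibility residual shrinks toward 0
```

For more general objectives the primal step has no closed form and is itself solved approximately, which is where linearized variants of the augmented Lagrangian method come in.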