This lecture covers the standard form of semidefinite programming (SDP), including trace-constrained SDPs and why they are of broad interest. It also explores SDP relaxations of non-convex problems, such as the maximum-weight cut of a graph, minimum sum-of-squares clustering, and estimating the Lipschitz constant of a neural network. Finally, it discusses optimization strategies for these formulations, namely the conditional gradient method (CGM) with a quadratic penalty and the augmented Lagrangian CGM, together with their convergence guarantees.
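As a reference for the formulations mentioned above, here is a minimal sketch in generic notation (the symbols C, A_i, b, alpha and the graph Laplacian L are illustrative and not taken from the lecture material): a standard-form SDP with a trace constraint, followed by the classical SDP relaxation of maximum-weight cut.

% Standard-form SDP with a trace constraint
\min_{X \in \mathbb{S}^n_+} \; \langle C, X \rangle
\quad \text{subject to} \quad
\langle A_i, X \rangle = b_i, \; i = 1, \dots, m, \qquad \operatorname{tr}(X) \le \alpha.

% SDP relaxation of maximum-weight cut (L is the weighted graph Laplacian)
\max_{X \in \mathbb{S}^n_+} \; \tfrac{1}{4} \langle L, X \rangle
\quad \text{subject to} \quad
X_{ii} = 1, \; i = 1, \dots, n.

The cut problem itself maximizes \tfrac{1}{4} x^\top L x over x \in \{\pm 1\}^n; writing X = x x^\top and dropping the rank-one requirement yields the convex relaxation above, where the constraint X_{ii} = 1 already fixes \operatorname{tr}(X) = n.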
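To make the algorithmic side concrete, below is a minimal Python sketch of a conditional gradient (Frank-Wolfe) loop with a quadratic penalty on the equality constraints of a trace-constrained SDP. The function name cgm_quadratic_penalty, the penalty schedule beta0 * sqrt(k + 1), and the step size 2 / (k + 2) are illustrative assumptions, not the lecture's reference implementation.

import numpy as np
from scipy.sparse.linalg import eigsh

def cgm_quadratic_penalty(C, A_list, b, alpha, iters=500, beta0=1.0):
    """Hypothetical sketch for
        min <C, X>  s.t.  <A_i, X> = b_i,  tr(X) <= alpha,  X PSD,
    handling the equality constraints with a growing quadratic penalty.
    C and all A_i are assumed symmetric."""
    n = C.shape[0]
    X = np.zeros((n, n))
    for k in range(iters):
        beta = beta0 * np.sqrt(k + 1)  # increasing penalty weight (assumed schedule)
        residual = np.array([np.tensordot(A, X) for A in A_list]) - b
        # Gradient of <C, X> + (beta/2) * ||A(X) - b||^2
        G = C + beta * sum(r * A for r, A in zip(residual, A_list))
        # Linear minimization oracle over {X PSD, tr(X) <= alpha}:
        # the minimizer is alpha * v v^T for the smallest eigenvector of G,
        # or 0 if the smallest eigenvalue is nonnegative.
        w, v = eigsh(G, k=1, which='SA')
        S = alpha * np.outer(v[:, 0], v[:, 0]) if w[0] < 0 else np.zeros((n, n))
        gamma = 2.0 / (k + 2)  # standard CGM step size
        X = (1 - gamma) * X + gamma * S
    return X

The appeal of this scheme is that each iteration only needs one extreme eigenvector rather than a projection onto the PSD cone, which is what lets CGM-type methods scale to large SDPs; the augmented Lagrangian variant additionally maintains dual variables for the equality constraints.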