This lecture covers the Conditional Gradient Method (CGM), also known as Frank-Wolfe, and Proximal Gradient methods, focusing on their convergence guarantees and trade-offs. It presents CGM for strongly convex objectives, its basic and faster convergence rates, and examples on the nuclear-norm ball and phase retrieval. It then compares Proximal Gradient against Frank-Wolfe, formulates a basic constrained non-convex problem, and develops phase retrieval and matrix completion as running examples. The lecture examines the role of convexity in optimization, recasting phase retrieval as a convex matrix-completion problem, and discusses the distinct challenges of estimation and prediction. It concludes with time-data trade-offs, the statistical dimension, and variance-reduction techniques such as the Stochastic Variance Reduced Gradient (SVRG) method.
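The summary only names the methods, so here is a minimal sketch of the Frank-Wolfe update over a nuclear-norm ball on a toy matrix-completion loss. The function `frank_wolfe_nuclear`, the problem sizes, and the smooth least-squares objective are illustrative assumptions, not details drawn from the lecture; the step size γ_k = 2/(k+2) is the standard choice behind the classical O(1/k) rate.

```python
# A minimal sketch of Frank-Wolfe / CGM over a nuclear-norm ball,
# assuming a smooth least-squares loss on observed entries
# (a toy matrix-completion setup; names and sizes are illustrative).
import numpy as np

def frank_wolfe_nuclear(observed_mask, observations, tau, n_iters=200):
    """Minimize 0.5 * ||P_Omega(X) - b||^2 subject to ||X||_* <= tau."""
    X = np.zeros(observed_mask.shape)
    for k in range(n_iters):
        # Gradient of the smooth loss: residual on observed entries only.
        grad = np.where(observed_mask, X - observations, 0.0)
        # Linear minimization oracle over the nuclear-norm ball:
        # argmin_{||S||_* <= tau} <grad, S> = -tau * u1 v1^T,
        # where (u1, v1) is the top singular vector pair of the gradient.
        U, s, Vt = np.linalg.svd(grad, full_matrices=False)
        S = -tau * np.outer(U[:, 0], Vt[0, :])
        # Standard step size gamma_k = 2 / (k + 2) gives the O(1/k) rate.
        gamma = 2.0 / (k + 2.0)
        X = (1 - gamma) * X + gamma * S
    return X

# Toy usage: recover a rank-1 matrix from ~50% of its entries.
rng = np.random.default_rng(0)
truth = np.outer(rng.standard_normal(20), rng.standard_normal(15))
mask = rng.random(truth.shape) < 0.5
X_hat = frank_wolfe_nuclear(mask, np.where(mask, truth, 0.0),
                            tau=np.linalg.norm(truth, 'nuc'))
```

This also illustrates the Proximal Gradient vs. Frank-Wolfe trade-off mentioned in the summary: the linear minimization oracle needs only the top singular pair of the gradient, whereas the projection (proximal) step onto the nuclear-norm ball requires a full SVD per iteration.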
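For the variance-reduction topic, the following is a minimal sketch of SVRG on a finite-sum least-squares problem. The data, step size, and epoch counts are illustrative assumptions, not values from the lecture.

```python
# A minimal sketch of SVRG for f(w) = (1/n) * sum_i 0.5 * (a_i^T w - b_i)^2;
# data and hyperparameters below are illustrative, not from the lecture.
import numpy as np

def svrg(A, b, step=0.1, n_epochs=20, inner_iters=None):
    n, d = A.shape
    inner = inner_iters or n
    w = np.zeros(d)
    for _ in range(n_epochs):
        # Snapshot point and its full gradient (computed once per epoch).
        w_snap = w.copy()
        full_grad = A.T @ (A @ w_snap - b) / n
        for _ in range(inner):
            i = np.random.randint(n)
            # Variance-reduced stochastic gradient:
            # g_i(w) - g_i(w_snap) + full_grad is unbiased for the full
            # gradient, with variance shrinking as w approaches w_snap.
            g_w = A[i] * (A[i] @ w - b[i])
            g_snap = A[i] * (A[i] @ w_snap - b[i])
            w = w - step * (g_w - g_snap + full_grad)
    return w

# Toy usage on a random well-conditioned least-squares instance.
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 10)) / np.sqrt(200)
b = A @ rng.standard_normal(10)
w_hat = svrg(A, b, step=0.5, n_epochs=30)
```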