This lecture covers the optimality of convergence rates of accelerated and stochastic gradient descent methods for non-convex optimization problems. It motivates the need for non-convex optimization through examples such as phase retrieval for Fourier ptychography and image classification with neural networks, and analyzes the convergence rates of gradient descent algorithms. The lecture also discusses the geometric interpretation of stationarity and the formulation of binary classification problems, and presents various optimization formulations, including ridge regression. It concludes with the performance of optimization algorithms, the convergence of stochastic gradient descent, and popular variants of stochastic gradient descent.
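As a concrete illustration of the stochastic gradient descent method discussed in the lecture, the sketch below runs SGD on a least-squares problem (a convex stand-in; the lecture's setting also covers non-convex objectives such as phase retrieval). The problem sizes, step size, and iteration count are illustrative assumptions, not taken from the lecture.

```python
import numpy as np

# Minimal SGD sketch on least squares: min_x 0.5 * ||A x - b||^2 / n.
# Each step samples one row a_i and uses the stochastic gradient
# (a_i^T x - b_i) a_i, an unbiased estimate of the full gradient.
rng = np.random.default_rng(0)
n, d = 200, 5
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true                      # noiseless targets (interpolation regime)

x = np.zeros(d)
step = 0.01                         # constant step size (assumed, not from the lecture)
for _ in range(5000):
    i = rng.integers(n)             # sample one data point uniformly
    grad = (A[i] @ x - b[i]) * A[i] # stochastic gradient at x
    x -= step * grad

print(float(np.linalg.norm(x - x_true)))  # distance to the true solution
```

Because the targets are noiseless, the stochastic gradient vanishes at the solution and SGD with a constant step converges; with noisy data one would use a diminishing step size instead, which connects to the convergence-rate discussion in the lecture.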