This lecture covers the analysis of unconstrained optimization problems using gradient descent and accelerated gradient descent. The instructor presents convergence plots and discusses how smoothness and strong convexity affect the behavior of these algorithms. Several scenarios are explored, including cases where the methods may or may not adapt to the underlying strong convexity of the problem. The lecture also examines how different step sizes affect convergence rates, and why correctly identifying the problem's characteristics matters for choosing them. In addition, the scaling of the methods with problem dimension is studied, highlighting the performance differences that emerge as dimensionality increases. Through detailed analysis and comparisons, students gain insight into the behavior of these optimization algorithms across different problem settings.
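The lecture's own plots are not reproduced here, but the comparison it describes is easy to simulate. The following is a minimal sketch, not the instructor's code: all names, the dimension, and the constants mu and L are illustrative choices. It runs gradient descent with the standard step size 1/L and Nesterov's accelerated variant on an L-smooth, mu-strongly convex quadratic, where the textbook rates are linear with contraction factors (1 - mu/L) and (1 - sqrt(mu/L)) per step, respectively.

```python
# Illustrative sketch (not the lecture's code): gradient descent vs.
# Nesterov's accelerated gradient descent on the strongly convex quadratic
# f(x) = 0.5 * x^T A x, where A has eigenvalues in [mu, L].
import numpy as np

rng = np.random.default_rng(0)

d = 100                      # problem dimension (hypothetical choice)
mu, L = 1.0, 100.0           # strong-convexity and smoothness constants
eigs = np.linspace(mu, L, d) # diagonal A with spectrum in [mu, L]

def f(x):
    return 0.5 * np.sum(eigs * x**2)

def grad(x):
    return eigs * x          # gradient of 0.5 * x^T diag(eigs) x

x0 = rng.standard_normal(d)
T = 300

# Gradient descent with the standard step size 1/L.
x = x0.copy()
for _ in range(T):
    x = x - (1.0 / L) * grad(x)
gd_final = f(x)

# Nesterov acceleration for mu-strongly convex, L-smooth functions:
# constant momentum (sqrt(L) - sqrt(mu)) / (sqrt(L) + sqrt(mu)).
beta = (np.sqrt(L) - np.sqrt(mu)) / (np.sqrt(L) + np.sqrt(mu))
x, x_prev = x0.copy(), x0.copy()
for _ in range(T):
    y = x + beta * (x - x_prev)      # extrapolation (momentum) step
    x_prev = x
    x = y - (1.0 / L) * grad(y)      # gradient step at the look-ahead point
agd_final = f(x)

# The minimum value is 0 at x = 0, so f(x_T) is the suboptimality gap.
print(f"f(x_T): GD = {gd_final:.3e}, AGD = {agd_final:.3e}")
```

With these constants the condition number is L/mu = 100, so the accelerated method needs roughly sqrt(L/mu) = 10 times fewer iterations than plain gradient descent to reach a given accuracy, which matches the gap between the two rates discussed in the lecture.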