This lecture covers the role of computation in optimization, focusing on gradient descent for both convex and nonconvex problems. Starting from the mathematics of data, the instructor introduces the maximum-likelihood estimator and M-estimators. The lecture then turns to unconstrained minimization, contrasting approximate with exact optimality, and presents basic iterative strategies for optimization algorithms. Descent methods, including the gradient descent algorithm, are discussed in detail, together with the challenges that arise in iterative optimization. The concepts of stationarity, local minima, and global optimality are explored, emphasizing why convexity matters: for convex problems, every stationary point is a global minimum.
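The gradient descent iteration discussed in the lecture can be sketched in a few lines. This is a minimal illustration on a convex least-squares objective f(x) = ½‖Ax − b‖²; the matrix A, vector b, step size, and iteration count are illustrative choices, not taken from the lecture.

```python
import numpy as np

def gradient_descent(grad, x0, step=0.1, iters=100):
    """Fixed-step gradient descent: x_{k+1} = x_k - step * grad(x_k)."""
    x = x0
    for _ in range(iters):
        x = x - step * grad(x)
    return x

# Convex example: f(x) = 0.5 * ||A x - b||^2, with gradient A^T (A x - b).
A = np.array([[2.0, 0.0], [0.0, 1.0]])
b = np.array([2.0, 1.0])
grad = lambda x: A.T @ (A @ x - b)

x_star = gradient_descent(grad, np.zeros(2), step=0.2, iters=200)
print(x_star)  # converges toward the minimizer [1.0, 1.0]
```

Because the objective is convex, the iterates approach the unique global minimizer provided the step size is small enough (here, below 2 divided by the largest eigenvalue of AᵀA). For nonconvex problems, the same iteration only guarantees convergence toward a stationary point, which motivates the lecture's discussion of local minima versus global optimality.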