This lecture covers Proximal and Subgradient Descent in the context of Optimization for Machine Learning. It presents the Proximal Gradient Descent algorithm for composite optimization problems and analyzes the convergence properties of these methods. The lecture also covers subgradients, convexity, Lipschitz functions, and the optimality of first-order methods.
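The proximal gradient descent algorithm mentioned above can be sketched for a standard composite problem, minimizing a smooth term f(x) = ½‖Ax − b‖² plus a non-smooth term g(x) = λ‖x‖₁ (the Lasso). The sketch below is illustrative and not from the lecture; the function names and the choice of problem are assumptions. The key step is x ← prox_{γg}(x − γ∇f(x)), where the proximal operator of the ℓ₁ norm is soft-thresholding.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||.||_1: shrinks each entry toward 0 by t.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def proximal_gradient_descent(A, b, lam, gamma, n_iters=1000):
    # Illustrative sketch (not the lecture's code): minimize
    #   0.5 * ||A x - b||^2 + lam * ||x||_1
    # via the update x_{k+1} = prox_{gamma * g}(x_k - gamma * grad f(x_k)).
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        grad = A.T @ (A @ x - b)                            # gradient of the smooth part
        x = soft_threshold(x - gamma * grad, gamma * lam)   # proximal step on the l1 part
    return x
```

A standard step size is γ = 1/L, where L = ‖A‖₂² is the Lipschitz constant of ∇f; with this choice the iterates converge at the O(1/k) rate discussed for such methods.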
This video is available on Mediaspace for a restricted audience.