This lecture covers coordinate descent optimization strategies, which simplify the optimization process by updating one coordinate at a time. The instructor explains the transition from gradient methods to second-order methods, emphasizing the trade-off between complexity and simplicity in optimization. Through examples and theoretical analysis, the lecture examines the benefits and limitations of coordinate-based approaches, including the importance of choosing an appropriate step size and coordinate for efficient optimization. The instructor also discusses the implications of different selection strategies, such as random coordinate descent, steepest coordinate descent, and importance sampling, and their relevance in machine learning applications. Finally, the lecture touches on the challenges posed by non-smooth functions and offers practical guidance for handling them.
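As a rough illustration of the ideas summarized above, the sketch below implements coordinate descent on a simple quadratic objective, with both the random and the steepest (greedy) coordinate selection rules mentioned in the lecture. The quadratic objective, the matrix `A`, and the coordinate-wise step size `1 / A[i, i]` are assumptions made for this example only; they are not taken from the lecture material.

```python
import numpy as np

def coordinate_descent(A, b, x0, n_iters=1000, rule="random", seed=None):
    """Minimize f(x) = 1/2 x^T A x - b^T x by updating one coordinate per iteration.

    rule="random"   -- pick a coordinate uniformly at random (random coordinate descent)
    rule="steepest" -- pick the coordinate with the largest partial derivative in
                       absolute value (steepest / greedy coordinate descent)
    """
    rng = np.random.default_rng(seed)
    x = x0.astype(float).copy()
    for _ in range(n_iters):
        grad = A @ x - b                      # full gradient; only one entry is used per step
        if rule == "random":
            i = rng.integers(len(x))
        else:                                 # "steepest": coordinate with largest |partial derivative|
            i = int(np.argmax(np.abs(grad)))
        # For a quadratic, the exact coordinate-wise step size is 1 / (coordinate-wise curvature).
        x[i] -= grad[i] / A[i, i]
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    M = rng.standard_normal((20, 5))
    A = M.T @ M + 0.1 * np.eye(5)             # positive definite, so the minimizer is unique
    b = rng.standard_normal(5)
    x_star = np.linalg.solve(A, b)            # closed-form minimizer for comparison
    x_cd = coordinate_descent(A, b, np.zeros(5), n_iters=2000, rule="random", seed=1)
    print("distance to optimum:", np.linalg.norm(x_cd - x_star))
```

Note that the steepest rule as written still computes the full gradient, which removes the per-iteration cost advantage of coordinate descent; it is included here only to illustrate the selection rule, not as an efficient implementation.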