Lecture

Coordinate Descent: Optimization Strategies

Description

This lecture covers coordinate descent optimization strategies, which simplify the optimization process by updating one coordinate at a time. The instructor traces the transition from gradient methods to second-order methods, emphasizing the trade-off between per-step complexity and simplicity in optimization. Through examples and theoretical analysis, the lecture examines the benefits and limitations of coordinate-based approaches, discussing how to select the step size and the coordinate to update for efficient optimization. The instructor also compares different selection strategies, such as random coordinate descent, steepest coordinate descent, and importance sampling, and highlights their relevance in machine learning applications. Finally, the lecture addresses the challenges posed by non-smooth functions and offers insights into overcoming them in practice.
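To make the coordinate update concrete, below is a minimal sketch of coordinate descent on a simple quadratic objective, with the three coordinate-selection rules named above shown side by side. The problem instance, variable names, and step sizes are illustrative assumptions, not code from the lecture.

```python
# Minimal coordinate descent sketch on f(x) = 0.5 * x^T A x - b^T x
# (a hypothetical test problem, not the lecture's own example).
import numpy as np

rng = np.random.default_rng(0)

d = 5
M = rng.standard_normal((d, d))
A = M @ M.T + d * np.eye(d)   # random positive-definite matrix
b = rng.standard_normal(d)

def grad(x):
    return A @ x - b          # gradient of the quadratic

x = np.zeros(d)
L = np.diag(A)                # coordinate-wise Lipschitz constants (A_ii)

for t in range(1000):
    g = grad(x)
    # Pick one coordinate to update: uniformly at random ("random
    # coordinate descent"), by largest gradient magnitude ("steepest
    # coordinate descent"), or with probability proportional to the
    # coordinate-wise Lipschitz constants ("importance sampling").
    i = rng.integers(d)                     # random
    # i = int(np.argmax(np.abs(g)))         # steepest
    # i = rng.choice(d, p=L / L.sum())      # importance sampling
    x[i] -= g[i] / L[i]       # step size 1/L_i along coordinate i

print("gradient norm:", np.linalg.norm(grad(x)))  # ~0 at the minimizer
```

The step size 1/L_i is the standard safe choice when each coordinate of the gradient is L_i-Lipschitz; for this quadratic it performs exact minimization along the chosen coordinate.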

This video is available exclusively on MediaSpace for a restricted audience. Please log in to MediaSpace to access it if you have the necessary permissions.
