This lecture covers the Gradient Descent method, starting with the choice of a descent direction and the update rule. It explains smoothness of functions, in particular β-smoothness, discusses the intuition behind each optimization step, and presents proofs related to convexity and convergence.
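For reference, a standard form of the objects the summary mentions, written here as a sketch (the step size η and smoothness constant β are notation assumed here, not taken from the lecture): the gradient descent update rule is

\[ x_{t+1} = x_t - \eta \, \nabla f(x_t), \]

and a differentiable function f is β-smooth when its gradient is β-Lipschitz,

\[ \| \nabla f(x) - \nabla f(y) \| \le \beta \, \| x - y \| \quad \text{for all } x, y. \]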
This video is available exclusively on MediaSpace for a restricted audience. Please log in to MediaSpace to access it if you have the necessary permissions.