This lecture covers gradient descent for the linear mean squared error (MSE) cost in machine learning. It explains how the gradient is computed, the computational complexity of that computation, and the variant with an offset term. The lecture also covers stochastic gradient descent, the use of penalty functions for constrained optimization, and implementation issues such as adaptive step-size selection and feature normalization. It additionally discusses non-convex optimization, stopping criteria, optimality conditions, and the transformation of constrained problems into unconstrained ones.
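As a concrete illustration of the gradient computation and the stochastic variant mentioned above, here is a minimal NumPy sketch. It is not taken from the lecture itself: the function names, the 1/(2N) scaling of the MSE cost, and the fixed step sizes are assumptions chosen for the example.

```python
import numpy as np

def mse_gradient(y, X, w):
    """Gradient of the MSE cost L(w) = (1/2N) ||y - Xw||^2,
    namely -(1/N) X^T (y - Xw). Costs O(N * D) per evaluation
    for N samples and D features."""
    e = y - X @ w
    return -X.T @ e / len(y)

def gradient_descent(y, X, w0, gamma=0.5, max_iters=500):
    """Full (batch) gradient descent with a fixed step size gamma."""
    w = w0
    for _ in range(max_iters):
        w = w - gamma * mse_gradient(y, X, w)
    return w

def stochastic_gradient_descent(y, X, w0, gamma=0.05, max_iters=5000):
    """SGD: each step uses the gradient of one random sample,
    an unbiased estimate of the full gradient at O(D) cost per step."""
    rng = np.random.default_rng(0)
    w = w0
    for _ in range(max_iters):
        n = rng.integers(len(y))
        e_n = y[n] - X[n] @ w          # residual of sample n
        w = w - gamma * (-e_n * X[n])  # single-sample gradient step
    return w

# Toy usage: fit y = 1 + 2x; the offset term is handled by
# prepending a column of ones, so w[0] plays the role of the offset.
x = np.linspace(0, 1, 50)
X = np.c_[np.ones_like(x), x]
y = 1 + 2 * x + 0.01 * np.random.default_rng(1).normal(size=x.size)
print(gradient_descent(y, X, np.zeros(2)))  # approximately [1.0, 2.0]
```

The column of ones illustrates the offset variant from the summary: it folds the intercept into the weight vector so the same gradient formula applies unchanged.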