This lecture derives the normal equations for linear regression from the mean-squared-error cost function. It explains convex optimization and the stochastic gradient descent algorithm, and gives a geometric interpretation of the least-squares error as orthogonal to the column space of the input matrix. It also discusses underfitting and overfitting in linear models, along with techniques such as feature augmentation to address them. The lecture highlights why invertibility of the Gram matrix is required for the least-squares solution to be unique, and concludes with a discussion of rank deficiency, ill-conditioning, and the impact of model complexity on fitting data.
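To make the summary concrete, here is a minimal NumPy sketch of the main steps: feature augmentation, the closed-form normal-equations solution, a check on the conditioning of the Gram matrix, and stochastic gradient descent on the same convex MSE cost. The toy data, polynomial degree, and step-size schedule are illustrative assumptions, not taken from the lecture.

```python
import numpy as np

# Toy 1-D data: y depends nonlinearly on x, so a plain linear model underfits.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=50)
y = np.sin(np.pi * x) + 0.1 * rng.standard_normal(50)

def poly_features(x, degree):
    """Feature augmentation: map scalar x to [1, x, x^2, ..., x^degree]."""
    return np.vander(x, degree + 1, increasing=True)

X = poly_features(x, degree=3)  # design matrix, shape (N, D)

# Normal equations: minimizing the MSE (1/2N)||y - Xw||^2 gives
# X^T X w = X^T y; the solution is unique iff the Gram matrix X^T X is invertible.
gram = X.T @ X
print("condition number of Gram matrix:", np.linalg.cond(gram))
w_closed = np.linalg.solve(gram, X.T @ y)

# Stochastic gradient descent on the same convex MSE cost: each step uses
# one sample's gradient, -(y_n - x_n^T w) x_n.
w = np.zeros(X.shape[1])
lr = 0.1
for epoch in range(500):
    for n in rng.permutation(len(y)):
        err = y[n] - X[n] @ w
        w += lr * err * X[n]
    lr *= 0.99  # a decaying step size helps SGD settle near the optimum

print("closed form:", w_closed)
print("SGD:        ", w)
```

If two columns of X were linearly dependent, the Gram matrix would be singular, `np.linalg.solve` would fail, and the minimizer would no longer be unique; that is the rank-deficient case the lecture closes with, and a large condition number signals the nearby ill-conditioned regime.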
This video is available exclusively on MediaSpace for a restricted audience. Please log in to MediaSpace to access it if you have the necessary permissions.