Lecture

Linear Regression: Normal Equations

Description

This lecture derives the normal equations for linear regression from the mean-squared error cost function. It introduces convex optimization and the stochastic gradient descent algorithm, and gives a geometric interpretation of the least-squares error: the residual is orthogonal to the columns of the input matrix. It also discusses underfitting and overfitting in linear models, together with techniques such as feature augmentation to address them, and explains why invertibility of the Gram matrix is needed for a unique least-squares solution. The lecture concludes with a discussion of rank deficiency, ill-conditioning, and the impact of model complexity on fitting the data.
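As a rough illustration of the lecture's central computation (this sketch is not from the lecture; the toy data, the chosen degree, and all variable names are illustrative), the following NumPy snippet fits a small polynomial model by solving the normal equations, checks the conditioning of the Gram matrix, and verifies that the residual is orthogonal to the columns of the input matrix:

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy 1-D data: y = 2x + 1 + noise.
    x = rng.uniform(-1.0, 1.0, size=50)
    y = 2.0 * x + 1.0 + 0.1 * rng.standard_normal(50)

    # Feature augmentation: polynomial features [1, x, x^2, x^3].
    # Too low a degree underfits; too high a degree can overfit.
    degree = 3
    X = np.vander(x, N=degree + 1, increasing=True)

    # Minimizing the MSE (1/n)||Xw - y||^2 is a convex problem whose
    # minimizer satisfies the normal equations X^T X w = X^T y; the
    # solution is unique when the Gram matrix X^T X is invertible.
    gram = X.T @ X
    print("condition number of X^T X:", np.linalg.cond(gram))

    # Solving the linear system is preferable to forming an explicit
    # inverse; for a rank-deficient or ill-conditioned X, np.linalg.lstsq
    # would return the minimum-norm least-squares solution instead.
    w = np.linalg.solve(gram, X.T @ y)
    print("weights:", w)

    # Geometric check: the residual Xw - y is orthogonal to the columns
    # of X, so X^T (Xw - y) should be numerically close to zero.
    residual = X @ w - y
    print("X^T residual:", X.T @ residual)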

This video is available exclusively on Mediaspace for a restricted audience. If you have the necessary permissions, please log in to Mediaspace to access it.

Watch on Mediaspace