This lecture covers the basics of linear regression, starting with the Ordinary Least Squares (OLS) approach of minimizing the sum of squared approximation errors. The instructor explains the OLS coefficient vector, residuals and predicted values, the hat matrix, and the residual maker matrix. The lecture also covers the Frisch-Waugh-Lovell theorem, the decomposition of multiple regression, and goodness of fit via the coefficient of determination. Finally, the Gauss-Markov assumptions are introduced, along with the properties of OLS under these ideal conditions.
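The quantities named in the summary can be illustrated with a minimal sketch, not taken from the lecture itself, that uses NumPy on synthetic data to compute the OLS coefficients, the hat matrix, the residual maker matrix, the coefficient of determination, and a quick check of the Frisch-Waugh-Lovell result.

```python
# Minimal illustration (assumed example, not from the lecture) of the OLS
# quantities mentioned above, on synthetic data.
import numpy as np

rng = np.random.default_rng(0)
n = 100
X = np.column_stack([np.ones(n), rng.normal(size=(n, 3))])  # design matrix with intercept
beta_true = np.array([1.0, 2.0, -0.5, 0.3])
y = X @ beta_true + rng.normal(scale=0.5, size=n)

# OLS coefficient vector: b = (X'X)^{-1} X'y
b = np.linalg.solve(X.T @ X, X.T @ y)

# Hat matrix H projects y onto the column space of X; M = I - H is the residual maker.
H = X @ np.linalg.inv(X.T @ X) @ X.T
M = np.eye(n) - H

y_hat = H @ y   # predicted values
e = M @ y       # residuals

# Coefficient of determination (goodness of fit)
ss_res = e @ e
ss_tot = np.sum((y - y.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot

# Frisch-Waugh-Lovell check: partialling X1 out of y and X2, then regressing
# the residualized y on the residualized X2, reproduces the coefficients on X2
# from the full regression.
X1, X2 = X[:, :2], X[:, 2:]
M1 = np.eye(n) - X1 @ np.linalg.inv(X1.T @ X1) @ X1.T
b2_fwl = np.linalg.solve((M1 @ X2).T @ (M1 @ X2), (M1 @ X2).T @ (M1 @ y))

print("OLS coefficients:", b)
print("R-squared:", r_squared)
print("FWL matches full regression:", np.allclose(b2_fwl, b[2:]))
```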
This video is available exclusively on Mediaspace for a restricted audience. Please log in to Mediaspace to access it if you have the necessary permissions.