
# Linear Regression Basics

Description

This lecture covers the fundamental concepts of linear regression, including an introduction to Ordinary Least Squares, the basics of the linear regression model, and extensions beyond the basics. It delves into topics such as heteroskedasticity, autocorrelation, instrumental variables, and Generalized Least Squares. The lecture also discusses Maximum Likelihood Estimation theory and its applications, as well as univariate and multivariate time series analysis. Practical advice for applying econometrics in real-world scenarios is also provided.


In course

FIN-403: Econometrics

The course covers basic econometric models and methods that are routinely applied to obtain inference results in economic and financial applications.

Related concepts (146)

Linear regression

In statistics, linear regression is a linear approach for modelling the relationship between a scalar response and one or more explanatory variables (also known as dependent and independent variables). The case of one explanatory variable is called simple linear regression; for more than one, the process is called multiple linear regression. This term is distinct from multivariate linear regression, where multiple correlated dependent variables are predicted, rather than a single scalar variable.
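To make the simple (one-variable) case concrete, here is a minimal sketch, not taken from the course materials, using the closed-form least-squares solution on a small noiseless dataset:

```python
import numpy as np

# Hypothetical noiseless data generated from y = 1 + 2x
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 1.0 + 2.0 * x

# Closed-form simple linear regression:
# slope = sum((x - x_bar)(y - y_bar)) / sum((x - x_bar)^2)
# intercept = y_bar - slope * x_bar
slope = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
intercept = y.mean() - slope * x.mean()
print(slope, intercept)  # recovers 2.0 and 1.0
```

Because the data are exactly linear, the fitted line recovers the generating coefficients; with noisy data the same formulas give the best-fitting line in the least-squares sense.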

Logistic regression

In statistics, the logistic model (or logit model) is a statistical model that models the probability of an event taking place by having the log-odds for the event be a linear combination of one or more independent variables. In regression analysis, logistic regression (or logit regression) is the process of estimating the parameters of a logistic model (the coefficients in the linear combination).
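The log-odds formulation can be sketched as follows (illustrative names and coefficients, not tied to any course code): the linear combination is passed through the logistic function to obtain a probability.

```python
import math

def predict_proba(x, beta0, beta1):
    """Probability of the event, given one predictor x.

    The log-odds are modelled as a linear function of x; the logistic
    (sigmoid) function maps them back to a probability in (0, 1).
    """
    log_odds = beta0 + beta1 * x
    return 1.0 / (1.0 + math.exp(-log_odds))

print(predict_proba(0.0, 0.0, 1.0))  # log-odds of 0 give probability 0.5
```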

Generalized least squares

In statistics, generalized least squares (GLS) is a method used to estimate the unknown parameters in a linear regression model when there is a certain degree of correlation between the residuals in the regression model. In these cases, ordinary least squares and weighted least squares can be statistically inefficient, or even give misleading inferences. GLS was first described by Alexander Aitken in 1935. In standard linear regression models one observes data on n statistical units.
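The GLS estimator can be written as β̂ = (XᵀΩ⁻¹X)⁻¹XᵀΩ⁻¹y, where Ω is the error covariance matrix (assumed known here). A minimal sketch with synthetic data:

```python
import numpy as np

# Synthetic design matrix: intercept column plus one regressor
n = 50
X = np.column_stack([np.ones(n), np.arange(n, dtype=float)])
beta_true = np.array([1.0, 0.5])
y = X @ beta_true  # noiseless, so the estimator should recover beta exactly

# Heteroskedastic error covariance, assumed known for illustration
omega = np.diag(np.linspace(1.0, 5.0, n))
w = np.linalg.inv(omega)  # Omega^{-1}

# GLS estimator: beta_hat = (X' Omega^{-1} X)^{-1} X' Omega^{-1} y
beta_gls = np.linalg.solve(X.T @ w @ X, X.T @ w @ y)
print(beta_gls)  # recovers [1.0, 0.5]
```

With Ω equal to the identity matrix this reduces to OLS; a diagonal Ω gives weighted least squares.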

Ordinary least squares

In statistics, ordinary least squares (OLS) is a type of linear least squares method for choosing the unknown parameters in a linear regression model by the principle of least squares: minimizing the sum of the squared differences between the observed values of the dependent variable in the input dataset and the values predicted by the linear function of the independent variables.
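In matrix form, minimizing the sum of squared residuals leads to the normal equations (XᵀX)β = Xᵀy. A small sketch on hypothetical, exactly linear data:

```python
import numpy as np

# Design matrix with an intercept column; hypothetical exact data y = 1 + 2x
X = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])

# OLS via the normal equations: solve (X'X) beta = X'y
beta = np.linalg.solve(X.T @ X, X.T @ y)

# Residual sum of squares is zero here because the data are exactly linear
rss = float(np.sum((y - X @ beta) ** 2))
print(beta, rss)
```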

Generalized linear model

In statistics, a generalized linear model (GLM) is a flexible generalization of ordinary linear regression. The GLM generalizes linear regression by allowing the linear model to be related to the response variable via a link function and by allowing the magnitude of the variance of each measurement to be a function of its predicted value. Generalized linear models were formulated by John Nelder and Robert Wedderburn as a way of unifying various other statistical models, including linear regression, logistic regression and Poisson regression.
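The link-function idea can be illustrated with a small sketch (hypothetical coefficients, not the course's notation): with a log link, the linear predictor η = β₀ + β₁x is mapped to the mean through the inverse link, μ = exp(η), as in Poisson regression.

```python
import math

def poisson_mean(x, beta0, beta1):
    """Mean response under a GLM with a log link (Poisson regression).

    The linear predictor eta is related to the mean mu via the link
    function g: g(mu) = eta, so with a log link mu = exp(eta).
    """
    eta = beta0 + beta1 * x   # linear predictor
    return math.exp(eta)      # inverse of the log link

print(poisson_mean(0.0, 0.0, 1.0))  # eta = 0 gives mean exp(0) = 1.0
```

The identity link recovers ordinary linear regression, and the logit link recovers logistic regression, which is how the GLM framework unifies these models.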

Related lectures (697)

Linear Regression Basics (FIN-403: Econometrics)

Covers the basics of linear regression, instrumental variables, heteroskedasticity, autocorrelation, and Maximum Likelihood Estimation.

Linear Regression: Statistical Inference and Regularization (PHYS-467: Machine learning for physicists)

Covers the probabilistic model for linear regression and the importance of regularization techniques.

Weighted Least Squares Estimation: IRLS Algorithm (MATH-413: Statistics for data science)

Explores the IRLS algorithm for weighted least squares estimation in GLM.

Red bus/Blue bus paradox

Explores the Red bus/Blue bus paradox, nested logit models, and multivariate extreme value models in transportation.

Model Selection Criteria: AIC, BIC, Cp (MATH-413: Statistics for data science)

Explores model selection criteria like AIC, BIC, and Cp in statistics for data science.