Explores Ridge and Lasso Regression for regularization in machine learning models, emphasizing hyperparameter tuning and visualization of parameter coefficients.
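Ridge regression and its penalty-strength tuning can be sketched with the closed-form ridge estimate; the data and grid of penalty values below are hypothetical, chosen only to illustrate the shrinkage effect (Lasso has no closed form and would need an iterative solver):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 5 features, two of them irrelevant.
n, p = 200, 5
X = rng.normal(size=(n, p))
beta_true = np.array([3.0, -2.0, 0.0, 0.0, 1.5])
y = X @ beta_true + rng.normal(scale=0.5, size=n)

def ridge_fit(X, y, alpha):
    """Closed-form ridge estimate: (X'X + alpha*I)^{-1} X'y."""
    return np.linalg.solve(X.T @ X + alpha * np.eye(X.shape[1]), X.T @ y)

# Simple hold-out tuning over a grid of penalty strengths.
X_tr, X_val, y_tr, y_val = X[:150], X[150:], y[:150], y[150:]
alphas = [0.01, 0.1, 1.0, 10.0, 100.0]
errors = {a: np.mean((y_val - X_val @ ridge_fit(X_tr, y_tr, a)) ** 2)
          for a in alphas}
best_alpha = min(errors, key=errors.get)
```

Plotting `ridge_fit(X, y, a)` for each `a` against `log(a)` gives the coefficient-path visualization mentioned above: every coefficient shrinks toward zero as the penalty grows.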
Covers the basics of linear regression, OLS method, predicted values, residuals, matrix notation, goodness-of-fit, hypothesis testing, and confidence intervals.
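In matrix notation the OLS pieces listed above fit in a few lines; the synthetic data below is illustrative only:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data generated from y = 2 + 3x + noise.
n = 100
x = rng.uniform(0, 10, size=n)
X = np.column_stack([np.ones(n), x])   # design matrix with intercept
y = 2.0 + 3.0 * x + rng.normal(scale=1.0, size=n)

# OLS estimate in matrix notation: beta_hat = (X'X)^{-1} X'y
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

y_hat = X @ beta_hat                   # predicted values
residuals = y - y_hat
# Goodness-of-fit: R^2 = 1 - SSR / SST
r_squared = 1 - residuals @ residuals / np.sum((y - y.mean()) ** 2)
```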
Delves into the bias-variance trade-off that governs model flexibility, covering error decomposition, polynomial regression, KNN, and the curse of dimensionality.
Explores heteroskedasticity in econometrics, discussing its impact on standard errors, alternative estimators, testing methods, and implications for hypothesis testing.
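One standard remedy for heteroskedasticity-distorted standard errors is White's heteroskedasticity-consistent (HC0) sandwich estimator; the data below, with error variance growing in the regressor, is a hypothetical illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical heteroskedastic data: noise scale grows with x.
n = 500
x = rng.uniform(1, 10, n)
X = np.column_stack([np.ones(n), x])
y = 1.0 + 2.0 * x + rng.normal(scale=0.5 * x)

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
resid = y - X @ beta_hat
XtX_inv = np.linalg.inv(X.T @ X)

# Classical OLS variance s^2 (X'X)^{-1} assumes homoskedastic errors.
s2 = resid @ resid / (n - X.shape[1])
se_classical = np.sqrt(np.diag(s2 * XtX_inv))

# White HC0 sandwich: (X'X)^{-1} X' diag(e_i^2) X (X'X)^{-1}
meat = X.T @ (X * resid[:, None] ** 2)
se_robust = np.sqrt(np.diag(XtX_inv @ meat @ XtX_inv))
```

With variance increasing in `x`, the classical slope standard error is understated, so t-statistics built from it overstate significance; the robust version corrects the inference without changing the point estimates.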