This lecture introduces the Ordinary Least Squares (OLS) method as an algebraic tool for linear regression. It covers the derivation of the OLS estimator, the Frisch-Waugh-Lovell theorem, predicted values, residuals, matrix notation, and the properties of OLS under the Gauss-Markov assumptions. The lecture also explains goodness of fit via the coefficient of determination (R-squared) and discusses hypothesis testing, confidence intervals, and error types in statistical inference.
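The core objects named above (the matrix-form OLS estimator, predicted values, residuals, and R-squared) can be sketched in a few lines of NumPy. This is an illustrative example, not material from the lecture itself: the data are simulated, and all variable names are my own.

```python
import numpy as np

# Simulated data: one regressor plus an intercept (illustrative only)
rng = np.random.default_rng(0)
n = 100
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])          # design matrix with intercept column
y = 2.0 + 3.0 * x + rng.normal(size=n)        # true intercept 2, slope 3

# OLS estimator in matrix form: beta_hat = (X'X)^{-1} X'y,
# computed via a linear solve rather than an explicit inverse
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

y_hat = X @ beta_hat                          # predicted (fitted) values
residuals = y - y_hat                         # residuals

# R-squared: share of total variation explained by the regression
ss_res = residuals @ residuals
ss_tot = np.sum((y - y.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot
```

Because the design matrix includes an intercept, the residuals sum to zero by construction, and R-squared lies between 0 and 1.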