This lecture introduces the method of least squares and QR factorization in the context of linear models. Topics include finding solutions in the least-squares sense, orthogonal matrices, and the conditions under which a linear model has a unique solution, with demonstrations and examples that emphasize the role of linear independence in the factorization. The lecture then applies these methods to solving inconsistent systems of equations and to predicting values from experimental data, drawing the connection to regression analysis and the least-squares line. It also discusses ways of measuring closeness between data points and a regression line, the role of residuals, and the normal equations for general linear models.
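As a rough sketch of how these ideas fit together, the least-squares line through a set of data points can be computed via QR factorization. The data values and the coefficient names `beta` below are made up for illustration; the reduction of the normal equations to a triangular solve is the standard step, here done with NumPy:

```python
import numpy as np

# Hypothetical experimental data: predict y from x with a least-squares line
# y ≈ beta[0] + beta[1] * x.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([2.1, 4.9, 8.2, 10.8])

# Design matrix for the linear model: a column of ones plus the x values.
A = np.column_stack([np.ones_like(x), x])

# QR factorization: A = QR, with Q having orthonormal columns
# and R upper triangular (and invertible when A has independent columns).
Q, R = np.linalg.qr(A)

# The normal equations A^T A beta = A^T y reduce to R beta = Q^T y,
# a triangular system solved here with np.linalg.solve.
beta = np.linalg.solve(R, Q.T @ y)

# The residual vector measures the closeness between the data
# and the fitted regression line.
residuals = y - A @ beta
print(beta)
```

Because the columns of `Q` are orthonormal, the residual vector is orthogonal to the column space of `A`, which is exactly the condition the normal equations express.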