This lecture provides a recap of linear models, covering parametric models such as lines and hyperplanes, along with algorithms such as least-squares classification, logistic regression, and SVM. It then turns to multi-class classification, discussing one-hot encodings, multi-output linear models, and multi-class least-squares classification. The lecture also explores the k-Nearest Neighbors (k-NN) method for classification and regression, discussing its properties, examples, and the curse of dimensionality. It concludes with feature-expansion techniques such as polynomial curve fitting and the use of nonlinear basis functions, emphasizing the importance of data representation in high-dimensional spaces.
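To make the multi-class least-squares idea concrete, here is a minimal sketch (not the lecture's own implementation): labels are one-hot encoded into a target matrix, a multi-output linear model is fit by ordinary least squares, and predictions pick the class with the largest linear output. The toy data, variable names, and class shifts are assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: n points in 2-D, k classes, each class shifted apart.
n, d, k = 150, 2, 3
y = rng.integers(0, k, size=n)
X = rng.normal(size=(n, d)) + 3.0 * np.eye(k)[y, :d]

# One-hot encode the labels into an n x k target matrix T.
T = np.eye(k)[y]

# Append a bias column and solve the multi-output least-squares problem
# W = argmin ||Xb @ W - T||^2 (lstsq is more stable than the normal equations).
Xb = np.hstack([X, np.ones((n, 1))])
W, *_ = np.linalg.lstsq(Xb, T, rcond=None)

# Predict by taking the class whose linear output is largest.
pred = np.argmax(Xb @ W, axis=1)
print("training accuracy:", np.mean(pred == y))
```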
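Likewise, k-NN classification can be sketched in a few lines; the brute-force distance computation below is an assumption for illustration, not necessarily how the lecture implements it. Replacing the majority vote with the mean of the neighbors' targets gives the regression variant.

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k=5):
    """Brute-force k-NN classification by majority vote (hypothetical sketch)."""
    # Pairwise squared Euclidean distances, shape (n_test, n_train).
    d2 = ((X_test[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=-1)
    # Indices of the k nearest training points for each test point.
    nn = np.argsort(d2, axis=1)[:, :k]
    # Majority vote among the neighbors' integer class labels.
    votes = y_train[nn]
    return np.array([np.bincount(v).argmax() for v in votes])
```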
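Finally, a sketch of the feature-expansion idea: mapping a scalar input x to the polynomial features [1, x, x^2, ..., x^M] turns a linear least-squares fit in the expanded space into a polynomial curve fit in the original one. The degree, toy data, and noise level below are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical noisy samples of a nonlinear target function.
x = np.linspace(0, 1, 25)
t = np.sin(2 * np.pi * x) + 0.1 * rng.normal(size=x.shape)

# Expand x into polynomial features phi(x) = [1, x, ..., x^M].
M = 5
Phi = np.vander(x, M + 1, increasing=True)

# Linear least squares in feature space == a degree-M polynomial fit.
w, *_ = np.linalg.lstsq(Phi, t, rcond=None)
print("fitted coefficients:", np.round(w, 3))
```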