This lecture covers linear models for classification, beginning with a recap of simple parametric models and hyperplanes. It then moves through linear regression, binary classification, adding non-linearity, and the probabilistic interpretation of linear models. Next it covers training logistic regression, evaluating classifiers, classification metrics, ROC curves, decision boundaries, margins, support vectors, and the maximum margin classifier. It concludes with an overview of SVMs, constrained optimization, Lagrange duality, and the practical use of SVMs in data science methods.