This lecture continues the discussion of linear models for classification, focusing on multi-class techniques. It begins with a recap of binary classification, emphasizing the concept of a decision boundary and the limitations of least-squares classification. The instructor explains how logistic regression addresses these limitations by modeling class probabilities with the logistic (sigmoid) function. The lecture then turns to multi-class classification, introducing one-hot encoding for class labels and extending logistic regression to multiple classes via the softmax function. The empirical risk is defined in terms of the cross-entropy loss, and gradient descent is outlined as the method for optimizing the model parameters. The lecture also covers evaluation metrics for multi-class classifiers, including confusion matrices, precision, recall, and F1 scores. Finally, the instructor compares logistic regression with support vector machines, highlighting their respective strengths and weaknesses on various datasets, and concludes by motivating nonlinear classifiers for more complex data distributions.
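
As a rough illustration of the softmax / cross-entropy / gradient-descent pipeline summarized above (not the instructor's exact formulation; the toy data, learning rate, and variable names are assumptions for illustration), here is a minimal NumPy sketch of multi-class logistic regression with one-hot labels:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 3-class data: 2-D points scattered around three class means (illustrative only).
n_per_class, n_classes, n_features = 50, 3, 2
means = np.array([[0.0, 0.0], [3.0, 3.0], [0.0, 4.0]])
X = np.vstack([rng.normal(m, 1.0, (n_per_class, n_features)) for m in means])
y = np.repeat(np.arange(n_classes), n_per_class)

# One-hot encoding of the class labels.
Y = np.eye(n_classes)[y]

def softmax(z):
    # Subtract the row-wise max for numerical stability before exponentiating.
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Model: linear scores z = XW + b, class probabilities p = softmax(z).
W = np.zeros((n_features, n_classes))
b = np.zeros(n_classes)
lr, n_steps = 0.1, 500

for step in range(n_steps):
    P = softmax(X @ W + b)
    # Empirical risk: mean cross-entropy between one-hot labels Y and predictions P.
    loss = -np.mean(np.sum(Y * np.log(P + 1e-12), axis=1))
    # Gradient of the mean cross-entropy w.r.t. the logits is (P - Y) / N.
    G = (P - Y) / len(X)
    W -= lr * (X.T @ G)
    b -= lr * G.sum(axis=0)

y_pred = np.argmax(softmax(X @ W + b), axis=1)
print(f"final loss {loss:.3f}, training accuracy {np.mean(y_pred == y):.2%}")
```

The key simplification that softmax plus cross-entropy buys is the gradient: the derivative of the loss with respect to the logits reduces to the residual `P - Y`, which is why the update above needs no explicit derivative of the softmax itself.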
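
The evaluation metrics the lecture lists can likewise be computed directly from a confusion matrix. Below is a small sketch using the standard definitions (precision = TP/(TP+FP), recall = TP/(TP+FN)); the helper names and example labels are assumptions, not from the lecture:

```python
import numpy as np

def confusion_matrix(y_true, y_pred, n_classes):
    # cm[i, j] counts examples of true class i predicted as class j.
    cm = np.zeros((n_classes, n_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1
    return cm

def per_class_metrics(cm):
    tp = np.diag(cm).astype(float)
    precision = tp / np.maximum(cm.sum(axis=0), 1)  # column sums = TP + FP
    recall = tp / np.maximum(cm.sum(axis=1), 1)     # row sums = TP + FN
    f1 = 2 * precision * recall / np.maximum(precision + recall, 1e-12)
    return precision, recall, f1

# Hypothetical predictions for a 3-class problem.
y_true = np.array([0, 0, 1, 1, 2, 2, 2])
y_pred = np.array([0, 1, 1, 1, 2, 0, 2])
cm = confusion_matrix(y_true, y_pred, n_classes=3)
p, r, f1 = per_class_metrics(cm)
print(cm)
print("precision", p.round(2), "recall", r.round(2), "F1", f1.round(2))
```

Averaging the per-class F1 scores (macro-F1) gives a single summary number that weights all classes equally, which is useful when class sizes are imbalanced.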