This lecture covers the basics of support vector machines (SVMs): hyperplanes, multi-output prediction, non-linearity, the probabilistic interpretation, logistic regression, decision boundaries, the maximum-margin classifier, slack variables, the SVM formulation, and practical examples. It also discusses the curse of dimensionality, the nearest-neighbor method, linear models, histograms, and the k-Nearest Neighbors (k-NN) algorithm, and concludes with exercises on implementing SVM and k-NN.
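As a starting point for the k-NN exercise, a minimal pure-Python sketch of the algorithm might look like the following; the function and variable names are illustrative, not taken from the course materials:

```python
import math
from collections import Counter

def knn_predict(train_X, train_y, query, k=3):
    """Classify `query` by majority vote among its k nearest training points.

    Hypothetical sketch of the k-NN idea covered in the lecture; not the
    course's reference implementation.
    """
    # Euclidean distance from the query to every training point
    dists = [(math.dist(x, query), label) for x, label in zip(train_X, train_y)]
    # Sort by distance and keep the k closest neighbors
    dists.sort(key=lambda pair: pair[0])
    neighbors = [label for _, label in dists[:k]]
    # Majority vote among the neighbor labels
    return Counter(neighbors).most_common(1)[0][0]

# Toy 2-D example: two clusters, one per class
X = [(0.0, 0.0), (0.1, 0.2), (0.2, 0.1), (3.0, 3.0), (3.1, 2.9), (2.9, 3.2)]
y = ["a", "a", "a", "b", "b", "b"]
print(knn_predict(X, y, (0.1, 0.1), k=3))  # → "a"
print(knn_predict(X, y, (3.0, 3.1), k=3))  # → "b"
```

For larger datasets the linear scan over all training points becomes the bottleneck, which is one way the curse of dimensionality mentioned above shows up in practice.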