This lecture covers feature expansion and kernels in machine learning. It begins with a recap of linear models, the margin, and the maximum-margin classifier, then develops the support vector machine (SVM) formulation, including overlapping classes and slack variables, and contrasts least-squares classification, logistic regression, and the SVM. It introduces polynomial feature expansion and its use in nonlinear classification, discusses the curse of dimensionality, nearest-neighbor methods, and k-nearest neighbors for classification and regression, and concludes with kernel functions, the kernel trick, kernel regression, and the kernel SVM for classification.
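To make the contrast between explicit feature expansion and the kernel trick concrete, here is a minimal sketch (not part of the lecture material) that compares a linear SVM, a linear SVM on degree-3 polynomial features, and an RBF-kernel SVM on a toy two-moons dataset; the dataset, the scikit-learn pipeline, and the hyperparameters (degree, C) are illustrative assumptions.

```python
# Illustrative sketch, not the lecture's own code: linear SVM vs.
# explicit polynomial feature expansion vs. the kernel trick (RBF SVM)
# on a toy nonlinear classification problem. Assumes scikit-learn.
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures, StandardScaler
from sklearn.svm import SVC

# Toy nonlinear binary classification data (hypothetical choice of dataset).
X, y = make_moons(n_samples=500, noise=0.25, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

models = {
    # Plain maximum-margin linear classifier: underfits the curved boundary.
    "linear SVM": make_pipeline(
        StandardScaler(), SVC(kernel="linear", C=1.0)),
    # Explicit polynomial feature expansion followed by a linear SVM.
    "degree-3 expansion + linear SVM": make_pipeline(
        PolynomialFeatures(degree=3), StandardScaler(),
        SVC(kernel="linear", C=1.0)),
    # Kernel trick: the RBF kernel works in an implicit feature space
    # without ever computing the expanded features.
    "RBF kernel SVM": make_pipeline(
        StandardScaler(), SVC(kernel="rbf", C=1.0)),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    print(f"{name}: test accuracy = {model.score(X_test, y_test):.3f}")
```

On data like this, the expanded-feature and kernel models typically recover the nonlinear decision boundary that the plain linear SVM cannot, which is the motivation for both feature expansion and kernel methods discussed in the lecture.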