This lecture covers Support Vector Machines (SVMs) with a focus on the soft margin. In practice, data is often not linearly separable, which motivates the soft-margin formulation: a hyperparameter C balances margin size against classification errors. The lecture defines support vectors, presents the soft-margin SVM in its primal and dual formulations, and introduces the hinge loss for binary classification. It also covers the geometric interpretation of the soft margin, the use of SVMs for multiclass classification via combinations of binary classifiers, and kernel SVMs for nonlinear decision boundaries.
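For reference, the soft-margin objective the summary refers to can be sketched as follows; the notation is the standard one and is assumed here rather than taken from the lecture slides:

```latex
% Soft-margin primal: slack variables \xi_i absorb margin violations,
% and the hyperparameter C controls how heavily violations are penalised.
\min_{\mathbf{w},\, b,\, \boldsymbol{\xi}} \;
  \frac{1}{2}\|\mathbf{w}\|^2 + C \sum_{i=1}^{n} \xi_i
\quad \text{s.t.} \quad
  y_i\bigl(\mathbf{w}^\top \mathbf{x}_i + b\bigr) \ge 1 - \xi_i,
  \qquad \xi_i \ge 0, \qquad i = 1, \dots, n.

% Eliminating the slacks gives the equivalent unconstrained hinge-loss form:
\min_{\mathbf{w},\, b} \;
  \frac{1}{2}\|\mathbf{w}\|^2
  + C \sum_{i=1}^{n} \max\bigl(0,\, 1 - y_i(\mathbf{w}^\top \mathbf{x}_i + b)\bigr).
```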
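As a concrete illustration (a minimal sketch, not code from the lecture), the snippet below trains scikit-learn's SVC on synthetic data; the dataset and the particular values of C are assumptions chosen only to show how C trades off margin size against violations, and how an RBF kernel handles nonlinear separation:

```python
# Minimal soft-margin SVM sketch with scikit-learn (assumed toy data,
# not the lecture's example). Small C -> wide margin, more slack;
# large C -> fewer violations, narrower margin.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=2,
                           n_redundant=0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for C in (0.01, 1.0, 100.0):
    clf = SVC(kernel="rbf", C=C)  # RBF kernel for a nonlinear boundary
    clf.fit(X_train, y_train)
    print(f"C={C}: {clf.n_support_.sum()} support vectors, "
          f"test accuracy={clf.score(X_test, y_test):.2f}")
```

Note that SVC also handles multiclass problems out of the box by reducing them to binary classifiers with a one-vs-one scheme, one of the strategies the lecture mentions.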
This video is available on Mediaspace to a restricted audience. Log in to Mediaspace to watch it if you have the necessary permissions.