Lecture

Support Vector Machines: Soft Margin

Description

This lecture covers support vector machines (SVMs) with a soft margin. In practice, data are often not linearly separable, which motivates the soft-margin approach: the SVM formulation trades off classification errors against margin size through a hyperparameter C. The lecture defines support vectors, derives the soft-margin SVM formulation and the hinge loss for binary classification, and presents the primal and dual formulations of the SVM. It also discusses the geometric interpretation of the soft margin and strategies for reducing multiclass classification to a set of binary classifiers. Finally, it introduces kernel SVMs for separating nonlinearly distributed data.
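As a rough illustration of the trade-off described above, here is a minimal sketch of soft-margin SVM training by subgradient descent on the regularized hinge loss. The dataset, function names, and hyperparameters are illustrative assumptions, not material from the lecture itself.

```python
import numpy as np

def train_soft_margin_svm(X, y, C=1.0, lr=0.01, epochs=1000):
    """Minimize (1/2)||w||^2 + C * sum(max(0, 1 - y_i (w.x_i + b)))
    by full-batch subgradient descent. Labels y must be in {-1, +1}."""
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        margins = y * (X @ w + b)
        viol = margins < 1          # points violating the margin
        # Subgradient: regularizer pulls w toward 0; violators push it out.
        grad_w = w - C * (y[viol, None] * X[viol]).sum(axis=0)
        grad_b = -C * y[viol].sum()
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

def predict(w, b, X):
    return np.sign(X @ w + b)

# Tiny illustrative dataset: two well-separated clusters.
X = np.array([[2., 2.], [3., 3.], [2., 3.],
              [-2., -2.], [-3., -3.], [-2., -3.]])
y = np.array([1., 1., 1., -1., -1., -1.])
w, b = train_soft_margin_svm(X, y, C=1.0)
```

A larger C penalizes margin violations more heavily (a "harder" margin), while a smaller C tolerates misclassified points in exchange for a wider margin.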

This video is available exclusively on MediaSpace for a restricted audience. Please log in to MediaSpace to access it if you have the necessary permissions.
