This lecture covers the theory and applications of Support Vector Machines (SVMs) in machine learning. Topics include Vapnik's training algorithm for optimal margin classifiers, support-vector networks, linear classifiers, the hard-SVM rule for max-margin separating hyperplanes, soft SVMs for data that are not linearly separable, classification by risk minimization, convex relaxation of the classification risk, and loss functions for classification. The lecture then applies convex duality to SVMs, covering the optimization techniques used to find the optimal hyperplane and the interpretation of the dual formulation.
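For orientation, these topics center on the standard soft-margin SVM. Given labeled data (x_i, y_i) with y_i in {-1, +1}, the primal problem and its dual are, in conventional notation (C, xi_i, and alpha_i denote the usual regularization parameter, slack variables, and dual variables; the lecture's own symbols may differ):

\[
\min_{w,\,b,\,\xi}\ \frac{1}{2}\|w\|^2 + C\sum_{i=1}^{n}\xi_i
\quad\text{s.t.}\quad y_i\,(w^\top x_i + b) \ \ge\ 1-\xi_i,\quad \xi_i \ge 0,
\]

\[
\max_{\alpha}\ \sum_{i=1}^{n}\alpha_i \;-\; \frac{1}{2}\sum_{i,j}\alpha_i\alpha_j\,y_i y_j\,x_i^\top x_j
\quad\text{s.t.}\quad 0 \le \alpha_i \le C,\quad \sum_{i=1}^{n}\alpha_i y_i = 0.
\]

The hard-SVM rule is the special case with all slack variables fixed to zero (equivalently, C tending to infinity), and the support vectors are the training points with alpha_i > 0.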
This video is available exclusively on MediaSpace for a restricted audience. Please log in to MediaSpace to access it if you have the necessary permissions.