This lecture covers learning the kernel function in the context of convex optimization. The instructor explains how to predict outputs from inputs with a linear classifier, even when the training samples are not linearly separable. The lecture delves into the soft-margin support vector machine problem, the use of feature maps to lift inputs into higher-dimensional spaces, and the dual problem formulation. It also discusses the choice of kernel function, such as Gaussian, polynomial, and exponential kernels, and how to select the best kernel through cross-validation.
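As a companion to the topics summarized above, here is a minimal sketch (not taken from the lecture) of fitting a soft-margin SVM and picking the kernel and its hyperparameters by cross-validation with scikit-learn; the toy dataset, parameter grids, and fold count are illustrative assumptions.

```python
# Sketch: soft-margin SVM with kernel selection via cross-validation.
# Dataset and hyperparameter grids below are illustrative, not from the lecture.
from sklearn.datasets import make_moons
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.svm import SVC

# A toy dataset that is not linearly separable in the input space.
X, y = make_moons(n_samples=400, noise=0.2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Candidate kernels and regularization strengths; C controls the soft margin.
param_grid = [
    {"kernel": ["rbf"], "gamma": [0.1, 1.0, 10.0], "C": [0.1, 1.0, 10.0]},
    {"kernel": ["poly"], "degree": [2, 3], "C": [0.1, 1.0, 10.0]},
]

# 5-fold cross-validation picks the kernel and hyperparameters with the best
# validation accuracy; SVC solves the dual problem internally.
search = GridSearchCV(SVC(), param_grid, cv=5)
search.fit(X_train, y_train)

print("best parameters:", search.best_params_)
print("cross-validated accuracy:", search.best_score_)
print("test accuracy:", search.score(X_test, y_test))
```

The Gaussian (RBF) and polynomial kernels shown here correspond to two of the kernel families mentioned in the lecture; other kernels could be compared the same way.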
This video is available exclusively on MediaSpace for a restricted audience. Please log in to MediaSpace to access it if you have the necessary permissions.