This lecture covers the concept of convexifying nonconvex problems, focusing on Support Vector Machines (SVM) and nonlinear dimensionality reduction. It explains the primal and dual formulations of SVM, the kernel trick, and the use of Lagrange multipliers. It then turns to nonlinear dimensionality reduction techniques, such as constructing k-nearest-neighbor graphs and solving optimization problems to unfold high-dimensional data. The instructor also discusses why convex-cardinality problems are hard to solve exactly and introduces the ℓ₁-norm heuristic as a convex approximation.
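To make the ℓ₁-norm heuristic concrete, below is a minimal sketch (not from the lecture) of the standard convex relaxation: replacing the nonconvex objective ‖x‖₀ with ‖x‖₁ subject to Ax = b, and solving the result as a linear program with SciPy. The function name `l1_heuristic` and the small random instance are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linprog

def l1_heuristic(A, b):
    """Convex relaxation of the cardinality problem
    min ||x||_0 s.t. Ax = b, replaced by min ||x||_1 s.t. Ax = b,
    written as an LP over variables [x, t] with -t <= x <= t."""
    m, n = A.shape
    c = np.concatenate([np.zeros(n), np.ones(n)])   # minimize sum(t)
    A_eq = np.hstack([A, np.zeros((m, n))])         # Ax = b, t unconstrained here
    I = np.eye(n)
    A_ub = np.vstack([np.hstack([I, -I]),           #  x - t <= 0
                      np.hstack([-I, -I])])         # -x - t <= 0
    b_ub = np.zeros(2 * n)
    bounds = [(None, None)] * n + [(0, None)] * n   # x free, t >= 0
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b,
                  bounds=bounds, method="highs")
    return res.x[:n]

# Illustrative use: recover a sparse vector from underdetermined measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 8))
x_true = np.zeros(8)
x_true[2] = 1.5                                     # 1-sparse ground truth
b = A @ x_true
x_hat = l1_heuristic(A, b)
```

The LP optimum is feasible by construction and its ℓ₁ norm is no larger than that of any feasible point, including `x_true`; under suitable conditions on A it also recovers the sparsest solution exactly.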