This lecture covers the application of convex optimization to nonlinear dimensionality reduction, focusing on techniques such as the kernel trick and maximizing the spread of data points in high-dimensional spaces. The instructor explains the unfolding problem and its equivalence to a semidefinite program (SDP), providing insight into how a low-dimensional solution can be recovered from the SDP solution. Various examples, including piecewise-constant and piecewise-linear fitting, illustrate the practical implications of convex-cardinality problems in signal processing and regression tasks. The lecture concludes with discussions of exact solutions, the ℓ1-norm heuristic, and the interpretation of convex relaxation in addressing cardinality constraints.
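As a minimal sketch of the ℓ1-norm heuristic in the piecewise-constant fitting setting mentioned above (assuming the cvxpy library; the signal and the regularization weight lam are made-up illustration values, not from the lecture), one can penalize the ℓ1 norm of the first differences of the estimate as a convex surrogate for the number of jumps:

```python
import cvxpy as cp
import numpy as np

# Hypothetical noisy piecewise-constant signal (illustration data only)
np.random.seed(0)
true_signal = np.concatenate([np.full(40, 1.0), np.full(30, -2.0), np.full(30, 0.5)])
y = true_signal + 0.3 * np.random.randn(true_signal.size)

x = cp.Variable(y.size)
lam = 2.0  # regularization weight (assumed value)

# ell-1 norm of first differences: convex relaxation of the jump count (cardinality of diff(x))
objective = cp.Minimize(cp.sum_squares(x - y) + lam * cp.norm1(cp.diff(x)))
cp.Problem(objective).solve()

# Jumps in the fitted signal appear where consecutive differences are non-negligible
jumps = np.where(np.abs(np.diff(x.value)) > 1e-3)[0]
print("estimated jump locations:", jumps)
```

Replacing the ℓ1 penalty with a direct count of nonzero differences would give the original convex-cardinality problem, which is combinatorial; the ℓ1 relaxation keeps the problem convex and typically yields sparse difference vectors.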