This lecture covers Kernel Principal Component Analysis (KPCA), a nonlinear extension of PCA that uses kernels to perform linear PCA implicitly in a high-dimensional feature space. It starts with a recap of Principal Component Analysis (PCA) and its formulation as an optimization problem. It then derives KPCA, explaining how the data are mapped to a feature space and how the resulting eigenvalue problem is solved via the kernel trick, without ever computing the feature map explicitly. The solutions to the dual eigenvalue problem are discussed, with emphasis on the computational cost of KPCA, which requires an eigenvalue decomposition of the n-by-n Gram matrix. The lecture concludes by noting that KPCA offers an alternative to standard PCA: it does not assume a linear transformation of the data, and it determines the projections entirely through kernel evaluations.
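The dual eigenvalue problem described above can be sketched in a few lines of NumPy: build the Gram matrix, center it in feature space, eigendecompose it, and scale the eigenvectors to obtain the projection coefficients. This is a minimal illustration, not the lecture's own code; the choice of an RBF kernel and the parameter names (`gamma`, `n_components`) are assumptions.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    """Gram matrix for an RBF kernel (an illustrative choice of kernel)."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def kernel_pca(X, n_components=2, gamma=1.0):
    n = X.shape[0]
    K = rbf_kernel(X, gamma)
    # Center the Gram matrix, i.e. center the data in feature space
    one_n = np.ones((n, n)) / n
    Kc = K - one_n @ K - K @ one_n + one_n @ K @ one_n
    # Eigendecomposition of the centered Gram matrix (the dual problem);
    # this O(n^3) step is the computationally intensive part of KPCA
    eigvals, eigvecs = np.linalg.eigh(Kc)
    # eigh returns ascending eigenvalues; keep the largest n_components
    idx = np.argsort(eigvals)[::-1][:n_components]
    eigvals, eigvecs = eigvals[idx], eigvecs[:, idx]
    # Scale dual coefficients so feature-space eigenvectors have unit norm
    alphas = eigvecs / np.sqrt(np.maximum(eigvals, 1e-12))
    # Project the training data onto the nonlinear principal components
    return Kc @ alphas
```

Note that the feature map never appears explicitly: all computations go through kernel evaluations, which is exactly the kernel trick the lecture emphasizes.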