This lecture covers the fundamentals of kernel regression, including the concept of kernels as alternative scalar products, the use of kernel regression for prediction, and the Representer Theorem. It also explores the curse of dimensionality in neural networks and the application of random features.
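To make the listed topics concrete, here is a minimal, self-contained sketch (not taken from the lecture itself) of kernel ridge regression with an RBF kernel and its random Fourier feature approximation; all variable names, the bandwidth `gamma`, the regularizer `lam`, and the feature count `D` are illustrative assumptions.

```python
# Illustrative sketch only: kernel ridge regression (Representer Theorem form)
# versus a random Fourier feature approximation of the same RBF kernel.
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data: y = sin(3x) + noise
X = rng.uniform(-1.0, 1.0, size=(100, 1))
y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.standard_normal(100)

gamma, lam = 5.0, 1e-3  # assumed RBF bandwidth and ridge regularization

def rbf_kernel(A, B):
    # k(a, b) = exp(-gamma * ||a - b||^2): the kernel acts as an alternative
    # scalar product in an implicit feature space.
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

# Exact kernel ridge regression: by the Representer Theorem the predictor is
# f(x) = sum_i alpha_i k(x, x_i), with alpha = (K + lam * I)^{-1} y.
K = rbf_kernel(X, X)
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)

X_test = np.linspace(-1.0, 1.0, 200).reshape(-1, 1)
f_exact = rbf_kernel(X_test, X) @ alpha

# Random Fourier features: z(x) = sqrt(2/D) * cos(W x + b) approximates the RBF
# kernel, turning kernel regression into ordinary ridge regression in D dimensions.
D = 300
W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(1, D))
b = rng.uniform(0.0, 2.0 * np.pi, size=D)

def features(A):
    return np.sqrt(2.0 / D) * np.cos(A @ W + b)

Z = features(X)
w = np.linalg.solve(Z.T @ Z + lam * np.eye(D), Z.T @ y)
f_rff = features(X_test) @ w

print("max |exact - RFF| on test grid:", np.abs(f_exact - f_rff).max())
```

The two predictors should agree closely for a sufficiently large feature count `D`, which is the point of random features: they trade the exact kernel solution for a finite-dimensional linear model that scales better with the number of training points.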
This video is available exclusively on Mediaspace for a restricted audience. If you have the necessary permissions, please log in to Mediaspace to access it.