This lecture covers nearest neighbor classifiers in supervised machine learning, which predict labels from the closest data points in the training set. It explains how nearest neighbor methods work, their application to regression and classification tasks, and the bias-variance tradeoff. It then explores the curse of dimensionality, showing how high-dimensional data affects the coverage of training sets and the distances between data points. The lecture also derives generalization bounds for 1-NN, with proofs and insight into the error bounds and the geometric terms involved.
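As an illustration of the prediction rule described above, here is a minimal sketch of a 1-nearest-neighbor classifier. It assumes NumPy and uses made-up toy data; it is not taken from the lecture itself.

```python
import numpy as np

def predict_1nn(X_train: np.ndarray, y_train: np.ndarray, x_query: np.ndarray):
    """Predict the label of the training point closest to the query (1-NN rule)."""
    distances = np.linalg.norm(X_train - x_query, axis=1)  # Euclidean distance to each training point
    return y_train[np.argmin(distances)]                   # label of the nearest one

# Toy example (illustrative data only)
X_train = np.array([[0.0, 0.0], [1.0, 1.0], [3.0, 3.0]])
y_train = np.array([0, 0, 1])
print(predict_1nn(X_train, y_train, np.array([2.6, 2.9])))  # -> 1
```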