This lecture covers Nearest Neighbor Classifiers, with a focus on the curse of dimensionality. It introduces supervised machine learning, defines the nearest neighbor function, and shows how it applies to both regression and classification. The instructor discusses the bias-variance tradeoff in k-NN, the generalization bound for 1-NN, and the implications of the curse of dimensionality for data coverage. The lecture concludes with guidance on choosing the value of k and the pros and cons of k-NN as a local averaging method.
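To make the local-averaging idea concrete, here is a minimal sketch of a k-NN classifier: classify a query point by majority vote among its k nearest training points. This is an illustrative implementation, not code from the lecture; the function and variable names are my own.

```python
from collections import Counter
import math

def knn_predict(train_X, train_y, query, k=3):
    """Predict the label of `query` by majority vote among its k nearest neighbors."""
    # Euclidean distance from the query to every training point
    dists = sorted((math.dist(x, query), y) for x, y in zip(train_X, train_y))
    # Majority vote over the labels of the k closest points
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Two well-separated clusters in 2D (toy data)
X = [(0.0, 0.0), (0.1, 0.2), (0.2, 0.1), (5.0, 5.0), (5.1, 4.9), (4.9, 5.2)]
y = [0, 0, 0, 1, 1, 1]
print(knn_predict(X, y, (0.15, 0.05), k=3))  # query near the first cluster → 0
```

With k = 1 this reduces to the 1-NN rule discussed in the lecture; larger k averages over a bigger neighborhood, trading variance for bias. The curse of dimensionality enters because, in high dimensions, even the "nearest" neighbors tend to be far from the query, so the local average stops being local.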