This lecture covers Kernel Density Estimation (KDE), focusing on bandwidth selection and the curse of dimensionality. It explains the smoothed plug-in estimator, the kernel density estimator, and the asymptotic risk of KDE. The lecture discusses the bias-variance tradeoff, optimal rates for KDE, and bandwidth selection methods such as the pilot estimator and least-squares cross-validation. It also covers estimation of the integrated mean squared error and the leave-one-out cross-validation estimator. The lecture then generalizes KDE to higher dimensions, where bandwidth selection becomes even more critical, and concludes by comparing parametric and nonparametric models, highlighting the tradeoff between flexibility and efficiency.
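As a rough illustration of two of the topics above (not code from the lecture itself), the sketch below implements a one-dimensional Gaussian-kernel density estimator and selects the bandwidth by least-squares cross-validation, using the closed-form expression for the integral of the squared estimate and the leave-one-out term; the grid of candidate bandwidths and the simulated data are assumptions for the example.

```python
import numpy as np

SQRT_2PI = np.sqrt(2 * np.pi)

def gauss(u):
    """Standard Gaussian kernel K(u)."""
    return np.exp(-0.5 * u**2) / SQRT_2PI

def kde(x, data, h):
    """Kernel density estimate f_hat_h evaluated at the points x."""
    u = (np.atleast_1d(x)[:, None] - data[None, :]) / h
    return gauss(u).mean(axis=1) / h

def lscv(data, h):
    """Least-squares cross-validation criterion CV(h).

    CV(h) = int f_hat_h^2  -  (2/n) * sum_i f_hat_{-i}(X_i),
    an unbiased-up-to-a-constant estimate of the integrated
    squared error of f_hat_h.
    """
    n = len(data)
    d = (data[:, None] - data[None, :]) / h
    # For the Gaussian kernel, K*K is the N(0, 2) density,
    # so int f_hat_h^2 has the closed form below.
    int_f2 = np.exp(-0.25 * d**2).sum() / (2 * np.sqrt(np.pi) * n**2 * h)
    # Leave-one-out estimates f_hat_{-i}(X_i): drop the diagonal j = i.
    K = gauss(d)
    loo = (K.sum(axis=1) - gauss(0.0)) / ((n - 1) * h)
    return int_f2 - 2 * loo.mean()

# Assumed example: simulated standard-normal data and a bandwidth grid.
rng = np.random.default_rng(0)
data = rng.normal(size=200)
grid = np.linspace(0.05, 1.0, 50)
h_star = grid[np.argmin([lscv(data, h) for h in grid])]
```

Minimizing `lscv` over the grid picks the bandwidth balancing the bias (large `h` oversmooths) against the variance (small `h` undersmooths), which is exactly the bias-variance tradeoff the lecture analyzes.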