This lecture introduces nonparametric regression models as a more flexible alternative to traditional parametric setups. It covers scatterplot smoothing, exploiting smoothness assumptions, kernel smoothing, and penalized likelihood methods. The curse of dimensionality and the bias-variance tradeoff are discussed, along with techniques such as orthogonal series and projection pursuit regression for handling high-dimensional problems. The backfitting algorithm for fitting additive models is also explained.
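
As a concrete illustration of two of the topics listed above, the sketch below (not from the lecture itself) implements a Gaussian kernel smoother of the Nadaraya-Watson type and uses it as the component smoother inside a backfitting loop for an additive model. The bandwidth `h`, the simulated data, and the convergence tolerance are illustrative assumptions.

```python
import numpy as np

def kernel_smooth(x, y, x0, h=0.3):
    """Nadaraya-Watson estimate of E[y | x] at the points x0,
    using a Gaussian kernel with bandwidth h (assumed value)."""
    # weights[i, j] = K((x0_i - x_j) / h)
    weights = np.exp(-0.5 * ((x0[:, None] - x[None, :]) / h) ** 2)
    return weights @ y / weights.sum(axis=1)

def backfit(X, y, h=0.3, n_iter=20, tol=1e-6):
    """Fit the additive model y = alpha + sum_j f_j(X[:, j]) by backfitting:
    cycle over predictors, smoothing the partial residuals against each one."""
    n, p = X.shape
    alpha = y.mean()
    f = np.zeros((n, p))  # fitted component functions at the data points
    for _ in range(n_iter):
        f_old = f.copy()
        for j in range(p):
            # Partial residuals: remove the intercept and all other components.
            r = y - alpha - f[:, [k for k in range(p) if k != j]].sum(axis=1)
            f[:, j] = kernel_smooth(X[:, j], r, X[:, j], h)
            f[:, j] -= f[:, j].mean()  # center each component for identifiability
        if np.max(np.abs(f - f_old)) < tol:
            break
    return alpha, f

# Usage on simulated data with two additive, nonlinear effects.
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(200, 2))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + rng.normal(scale=0.2, size=200)
alpha, f = backfit(X, y)
print("intercept:", alpha)
```

The key design choice in backfitting is that each component function is estimated by smoothing the partial residuals, so any scatterplot smoother (kernel, spline, or orthogonal series) could replace `kernel_smooth` here.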