Summary
In statistics, kernel regression is a non-parametric technique to estimate the conditional expectation of a random variable. The objective is to find a non-linear relation between a pair of random variables X and Y.

In any nonparametric regression, the conditional expectation of a variable Y relative to a variable X may be written

E(Y | X) = m(X),

where m is an unknown function.

Nadaraya and Watson, both in 1964, proposed to estimate m as a locally weighted average, using a kernel as a weighting function. The Nadaraya–Watson estimator is

\widehat{m}_h(x) = \frac{\sum_{i=1}^n K_h(x - x_i)\, y_i}{\sum_{i=1}^n K_h(x - x_i)},

where K_h is a kernel with a bandwidth h such that K is of order at least 1, that is \int u\, K(u)\, du = 0.

Using kernel density estimates for the joint distribution f(x, y) and the marginal f(x) with a kernel K,

\hat f_h(x, y) = \frac{1}{n} \sum_{i=1}^n K_h(x - x_i)\, K_h(y - y_i), \qquad \hat f_h(x) = \frac{1}{n} \sum_{i=1}^n K_h(x - x_i),

and substituting these into the conditional expectation

E(Y \mid X = x) = \int y\, \frac{f(x, y)}{f(x)}\, dy,

we get, since \int y\, K_h(y - y_i)\, dy = y_i for a kernel of order at least 1,

\widehat{m}_h(x) = \frac{\sum_{i=1}^n K_h(x - x_i)\, y_i}{\sum_{i=1}^n K_h(x - x_i)},

which is the Nadaraya–Watson estimator; h is the bandwidth (or smoothing parameter).

This example is based upon Canadian cross-section wage data consisting of a random sample taken from the 1971 Canadian Census Public Use Tapes for male individuals having common education (grade 13). There are 205 observations in total. The resulting figure shows the estimated regression function using a second-order Gaussian kernel along with asymptotic variability bounds.

The following commands of the R programming language use the npreg() function to deliver optimal smoothing and to create that figure. They can be entered at the command prompt via cut and paste.

install.packages("np")
library(np)  # non-parametric library
data(cps71)
attach(cps71)

m <- npreg(logwage ~ age)  # kernel regression of log wage on age with data-driven bandwidth
plot(m, plot.errors.method = "asymptotic", plot.errors.style = "band")  # asymptotic variability bounds
points(age, logwage, cex = 0.25)
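As a complement to the np-based example, the Nadaraya–Watson formula above can also be evaluated directly. The sketch below is illustrative only: it assumes a Gaussian kernel with a fixed, user-chosen bandwidth h, and the function nw_estimate as well as the simulated data are hypothetical, not part of the np package or of the original example.

# Minimal Nadaraya–Watson estimator with a Gaussian kernel (illustrative sketch;
# nw_estimate() and the simulated data are hypothetical, not from the np package).
nw_estimate <- function(x0, x, y, h) {
  w <- dnorm((x0 - x) / h)   # kernel weights K_h(x0 - x_i); the constant factor 1/h cancels
  sum(w * y) / sum(w)        # locally weighted average of the y_i
}

set.seed(42)
x <- runif(200, 0, 10)
y <- sin(x) + rnorm(200, sd = 0.3)
grid <- seq(0, 10, length.out = 100)
fit <- sapply(grid, nw_estimate, x = x, y = y, h = 0.5)
plot(x, y, cex = 0.25)
lines(grid, fit)

Unlike npreg(), which selects the bandwidth in a data-driven way, this sketch leaves h fixed; the choice of h controls the bias–variance trade-off of the resulting estimate.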