We consider the problem of estimating the slope function in a functional regression with a scalar response and a functional covariate. This central problem of functional data analysis is well known to be ill-posed, thus requiring a regularised estimation procedure. The two most commonly used approaches are based on spectral truncation or Tikhonov regularisation of the empirical covariance operator. In principle, Tikhonov regularisation is the more canonical choice: compared to spectral truncation, it is robust to eigenvalue ties, and it attains the optimal minimax rate of convergence in the mean squared sense, not just in a concentration probability sense. In this paper, we show that, surprisingly, one can strictly improve upon the finite-sample performance of the Tikhonov estimator by means of a linear estimator that combines it with a form of spectral truncation, while retaining its stability and asymptotic properties. Specifically, we construct an estimator that additively decomposes the functional covariate by projecting it onto two orthogonal subspaces defined via functional PCA; it then applies Tikhonov regularisation to one component, while leaving the other component unregularised. We prove that when the covariate is Gaussian, this hybrid estimator uniformly improves upon the MSE of the Tikhonov estimator in a non-asymptotic sense, effectively rendering it inadmissible. This domination is shown to persist under discrete observation of the covariate function as well. The hybrid estimator is linear, straightforward to construct in practice, and incurs no computational overhead relative to the standard regularisation methods. By means of simulation, it is shown to furnish sizeable gains even for modest sample sizes.
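To make the construction concrete, here is one plausible reading of the hybrid estimator in formulas. The notation ($\hat{K}$ for the empirical covariance operator of the covariate with eigenpairs $(\hat{\lambda}_j, \hat{\varphi}_j)$, $\hat{g}$ for the empirical cross-covariance with the scalar response, $\tau > 0$ for the Tikhonov parameter, and $k$ for the dimension of the leading FPCA subspace), as well as the choice of which component is left unregularised, are illustrative assumptions, not notation taken from the paper:

\[
\hat{\beta}_{\mathrm{hyb}}
  \;=\; \sum_{j \le k} \frac{\langle \hat{g}, \hat{\varphi}_j \rangle}{\hat{\lambda}_j}\, \hat{\varphi}_j
  \;+\; \sum_{j > k} \frac{\langle \hat{g}, \hat{\varphi}_j \rangle}{\hat{\lambda}_j + \tau}\, \hat{\varphi}_j ,
\]

whereas the plain Tikhonov estimator shrinks every coordinate by $\hat{\lambda}_j + \tau$. Under this reading, the hybrid removes the regularisation bias on the leading, well-conditioned subspace while retaining Tikhonov's stabilisation on its orthogonal complement.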
Daniel Kuhn, Yves Rychener, Viet Anh Nguyen