
Publication: Hybrid regularisation and the (in)admissibility of ridge regression in infinite dimensional Hilbert spaces

Abstract

We consider the problem of estimating the slope function in a functional regression with a scalar response and a functional covariate. This central problem of functional data analysis is well known to be ill-posed, thus requiring a regularised estimation procedure. The two most commonly used approaches are based on spectral truncation or Tikhonov regularisation of the empirical covariance operator. In principle, Tikhonov regularisation is the more canonical choice. Compared to spectral truncation, it is robust to eigenvalue ties, and it attains the optimal minimax rate of convergence in the mean squared sense, not just in a concentration probability sense. In this paper, we show that, surprisingly, one can strictly improve upon the performance of the Tikhonov estimator in finite samples by means of a linear estimator, while retaining its stability and asymptotic properties by combining it with a form of spectral truncation. Specifically, we construct an estimator that additively decomposes the functional covariate by projecting it onto two orthogonal subspaces defined via functional PCA; it then applies Tikhonov regularisation to one component, while leaving the other component unregularised. We prove that when the covariate is Gaussian, this hybrid estimator uniformly improves upon the MSE of the Tikhonov estimator in a non-asymptotic sense, effectively rendering it inadmissible. This domination is shown to persist even under discrete observation of the covariate function. The hybrid estimator is linear, straightforward to construct in practice, and incurs no computational overhead relative to the standard regularisation methods. By means of simulation, it is shown to furnish sizeable gains even for modest sample sizes.
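The construction described in the abstract can be sketched numerically. The following is an illustrative sketch, not the authors' implementation: the covariate is observed on a grid, the leading k empirical principal components are inverted without regularisation, and Tikhonov regularisation is applied on the orthogonal complement. The function name and the tuning parameters k and lam are hypothetical.

```python
import numpy as np

def hybrid_slope_estimator(X, y, k, lam):
    """Hybrid estimator sketch: plain spectral inversion on the span of the
    leading k empirical principal components, Tikhonov (ridge) regularisation
    on the orthogonal complement. X is (n, p): n curves on a p-point grid."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    # empirical covariance operator (discretised) and its eigendecomposition
    S = Xc.T @ Xc / n
    evals, evecs = np.linalg.eigh(S)
    order = np.argsort(evals)[::-1]          # sort eigenvalues descending
    evals, evecs = evals[order], evecs[:, order]
    # empirical cross-covariance of X and y, in the eigenbasis
    g = Xc.T @ yc / n
    coords = evecs.T @ g
    # leading component: unregularised inversion
    lead = coords[:k] / evals[:k]
    # tail component: Tikhonov regularisation
    tail = coords[k:] / (evals[k:] + lam)
    return evecs[:, :k] @ lead + evecs[:, k:] @ tail
```

With k = 0 the estimator reduces to the standard Tikhonov (ridge) estimator; with lam = 0 and k = p it reduces to unregularised least squares on the empirical eigenbasis.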


Related concepts

Ridge regression

Ridge regression is a method of estimating the coefficients of multiple-regression models in scenarios where the independent variables are highly correlated. It has been used in many fields including econometrics, chemistry, and engineering. Also known as Tikhonov regularization, named for Andrey Tikhonov, it is a method of regularization of ill-posed problems. It is particularly useful to mitigate the problem of multicollinearity in linear regression, which commonly occurs in models with large numbers of parameters.
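A minimal sketch of the closed form (the function name is illustrative, not from any particular library):

```python
import numpy as np

def ridge(X, y, lam):
    """Ridge / Tikhonov estimate: argmin_b ||y - X b||^2 + lam * ||b||^2,
    with closed form b = (X'X + lam * I)^{-1} X'y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)
```

Adding lam to the diagonal keeps the linear system invertible even when the columns of X are nearly collinear, which is why the method mitigates multicollinearity; as lam grows, the coefficients are shrunk towards zero.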

Bias of an estimator

In statistics, the bias of an estimator (or bias function) is the difference between this estimator's expected value and the true value of the parameter being estimated. An estimator or decision rule with zero bias is called unbiased. In statistics, "bias" is a property of an estimator. Bias is a distinct concept from consistency: consistent estimators converge in probability to the true value of the parameter, but may be biased or unbiased; see bias versus consistency for more.
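A classic example is the sample variance: dividing by n gives a biased estimator, dividing by n - 1 an unbiased one. A small Monte Carlo check (the sample size and true variance below are an arbitrary illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(0)
n, true_var = 5, 4.0
samples = rng.normal(0.0, 2.0, size=(200_000, n))  # true variance = 4

# biased MLE: mean squared deviation, dividing by n
mle = ((samples - samples.mean(axis=1, keepdims=True)) ** 2).mean(axis=1)
# unbiased estimator: divide by n - 1
unb = samples.var(axis=1, ddof=1)

# E[mle] = true_var * (n - 1) / n = 3.2, so the bias is -0.8;
# E[unb] = true_var = 4.0, i.e. zero bias.
```

Despite the bias, the divide-by-n estimator is consistent: the bias of -true_var / n vanishes as n grows, which is exactly the bias-versus-consistency distinction made above.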

Bayes estimator

In estimation theory and decision theory, a Bayes estimator or a Bayes action is an estimator or decision rule that minimizes the posterior expected value of a loss function (i.e., the posterior expected loss). Equivalently, it maximizes the posterior expectation of a utility function. An alternative way of formulating an estimator within Bayesian statistics is maximum a posteriori estimation.
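Under squared-error loss, the Bayes estimator is the posterior mean. In the conjugate normal model it has a simple closed form, sketched below (the function name is illustrative):

```python
import numpy as np

def posterior_mean_normal(x, sigma2, mu0, tau2):
    """Bayes estimator (posterior mean) of theta under squared-error loss,
    for observations x_i ~ N(theta, sigma2) with prior theta ~ N(mu0, tau2).
    It is a precision-weighted average of the sample mean and the prior mean."""
    n = len(x)
    w = (n / sigma2) / (n / sigma2 + 1 / tau2)  # weight on the data
    return w * np.mean(x) + (1 - w) * mu0
```

As the prior variance tau2 grows the estimate approaches the sample mean, and as tau2 shrinks it collapses to the prior mean mu0; ridge regression admits the same reading, with its shrinkage towards zero corresponding to a zero-mean Gaussian prior on the coefficients.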

Related MOOCs

Algebra (part 1)

A French-language MOOC on linear algebra, accessible to all, taught rigorously and requiring no prior knowledge.

Algebra (part 2)

A French-language MOOC on linear algebra, accessible to all, taught rigorously and requiring no prior knowledge.

Related publications

Daniel Kuhn, Yves Rychener, Viet Anh Nguyen

The state-of-the-art methods for estimating high-dimensional covariance matrices all shrink the eigenvalues of the sample covariance matrix towards a data-insensitive shrinkage target. The underlying shrinkage transformation is either chosen heuristically ...

2024

We propose a novel approach to evaluating the ionic Seebeck coefficient in electrolytes from relatively short equilibrium molecular dynamics simulations, based on the Green-Kubo theory of linear response and Bayesian regression analysis. By exploiting the ...

In this thesis we study stability from several viewpoints. After covering the practical importance, the rich history and the ever-growing list of manifestations of stability, we study the following. (i) (Statistical identification of stable dynamical syste ...