
Publication

# Methodology and Convergence Rates for Functional Time Series Regression

Abstract

The functional linear model extends the notion of linear regression to the case where the response and covariates are i.i.d. elements of an infinite-dimensional Hilbert space. The unknown to be estimated is a Hilbert-Schmidt operator, whose inverse is by definition unbounded, rendering the problem of inference ill-posed. In this paper, we consider the more general context where the sample of response/covariate pairs forms a weakly dependent stationary process in the respective product Hilbert space: simply stated, the case where we have a regression between functional time series. We consider a general framework of potentially nonlinear processes, exploiting recent advances in the spectral analysis of functional time series. This allows us to quantify the inherent ill-posedness, and to motivate a Tikhonov regularisation technique in the frequency domain. Our main result is the rate of convergence for the corresponding estimators of the regression coefficients, the latter forming a summable sequence in the space of Hilbert-Schmidt operators. In a sense, our main result can be seen as a generalisation of the classical functional linear model rates to the case of time series, and rests only upon Brillinger-type mixing conditions. It is seen that, just as the covariance operator eigenstructure plays a central role in the independent case, so does the spectral density operator's eigenstructure in the dependent case. While the analysis becomes considerably more involved in the dependent case, the rates are strikingly comparable to those of the i.i.d. case, but at the expense of an additional factor caused by the necessity to estimate the spectral density operator at a nonparametric rate, as opposed to the parametric rate for covariance operator estimation.
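As a sketch of the setting described in the abstract (the notation below is illustrative and not necessarily the paper's own), the lagged functional linear model between stationary functional time series, its frequency-domain form, and a Tikhonov-regularised estimator can be written as:

```latex
% Illustrative notation; a sketch of the general setting, not the paper's exact statement.
% Lagged functional linear model between stationary functional time series:
Y_t \;=\; \sum_{s \in \mathbb{Z}} \mathscr{B}_s X_{t-s} \;+\; \varepsilon_t,
\qquad t \in \mathbb{Z},
% with frequency-response (transfer) operator at frequency \omega:
\mathscr{B}_\omega \;=\; \sum_{s \in \mathbb{Z}} \mathscr{B}_s\, e^{-\mathrm{i}\omega s},
% linked to the cross- and auto-spectral density operators by
\mathscr{F}^{YX}_\omega \;=\; \mathscr{B}_\omega\, \mathscr{F}^{XX}_\omega,
% and a Tikhonov-regularised estimator with regularisation parameter \rho > 0:
\widehat{\mathscr{B}}_\omega \;=\;
\widehat{\mathscr{F}}^{YX}_\omega
\big( \widehat{\mathscr{F}}^{XX}_\omega + \rho\, \mathrm{Id} \big)^{-1}.
```

The shift by $\rho\,\mathrm{Id}$ is what tames the unbounded inverse of the (compact) spectral density operator, which is the source of the ill-posedness discussed above.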


Related concepts (39)

Linear regression

In statistics, linear regression is a linear approach for modelling the relationship between a scalar response and one or more explanatory variables (also known as dependent and independent variables). The case of one explanatory variable is called simple linear regression; for more than one, the process is called multiple linear regression. This term is distinct from multivariate linear regression, where multiple correlated dependent variables are predicted, rather than a single scalar variable.
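A minimal sketch of the distinction drawn above, fitting both a simple and a multiple linear regression by ordinary least squares with NumPy (data and variable names are illustrative):

```python
import numpy as np

# Simple linear regression: one explanatory variable x, scalar response y.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = 1.0 + 2.0 * x  # exact line y = 1 + 2x, so the fit recovers it

X_simple = np.column_stack([np.ones_like(x), x])  # design matrix [1, x]
coef_simple, *_ = np.linalg.lstsq(X_simple, y, rcond=None)
# coef_simple ≈ [intercept, slope] = [1.0, 2.0]

# Multiple linear regression: two explanatory variables for one scalar response.
x2 = np.array([1.0, 0.0, 1.0, 2.0])
y2 = 1.0 + 2.0 * x + 3.0 * x2

X_multi = np.column_stack([np.ones_like(x), x, x2])
coef_multi, *_ = np.linalg.lstsq(X_multi, y2, rcond=None)
# coef_multi ≈ [1.0, 2.0, 3.0]
```

In both cases a single scalar response is modelled; multivariate linear regression (several correlated responses) would instead make `y` a matrix with one column per response.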

Segmented regression

Segmented regression, also known as piecewise regression or broken-stick regression, is a method in regression analysis in which the independent variable is partitioned into intervals and a separate line segment is fit to each interval. Segmented regression analysis can also be performed on multivariate data by partitioning the various independent variables. Segmented regression is useful when the independent variables, clustered into different groups, exhibit different relationships between the variables in these regions.
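The broken-stick idea above can be sketched as follows, assuming for simplicity a single breakpoint whose location is known (in practice the breakpoint is often estimated as well; everything here is illustrative):

```python
import numpy as np

# Broken-stick regression with one known breakpoint c:
# fit y = b0 + b1*x + b2*max(x - c, 0), a line whose slope changes at x = c.
c = 5.0  # assumed (known) breakpoint location
x = np.linspace(0.0, 10.0, 21)
# slope 2 to the left of c, slope 0.5 to the right, continuous at c:
y = np.where(x <= c, 1.0 + 2.0 * x, 1.0 + 2.0 * c + 0.5 * (x - c))

hinge = np.maximum(x - c, 0.0)                     # "hinge" basis function
X = np.column_stack([np.ones_like(x), x, hinge])   # [1, x, (x-c)_+]
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
# beta[1] is the slope left of c; beta[1] + beta[2] is the slope right of c
```

Writing the model with a hinge term keeps the fitted segments continuous at the breakpoint, which is usually what "broken-stick" regression intends.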

Regression analysis

In statistical modeling, regression analysis is a set of statistical processes for estimating the relationships between a dependent variable (often called the 'outcome' or 'response' variable, or a 'label' in machine learning parlance) and one or more independent variables (often called 'predictors', 'covariates', 'explanatory variables' or 'features'). The most common form of regression analysis is linear regression, in which one finds the line (or a more complex linear combination) that most closely fits the data according to a specific mathematical criterion.

Related MOOCs (15)

Algebra (part 1)

A French-language MOOC on linear algebra, open to all, taught rigorously and requiring no prerequisites.

Algebra (part 2)

A French-language MOOC on linear algebra, open to all, taught rigorously and requiring no prerequisites.

Related publications (128)

Herein, machine learning (ML) models using multiple linear regression (MLR), support vector regression (SVR), random forest (RF) and artificial neural network (ANN) are developed and compared to predict the output features viz. specific capacitance (Csp), ...
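The four model families named in this abstract can be compared side by side; a minimal sketch using scikit-learn on synthetic data (the features, target, and hyperparameters below are illustrative assumptions, not those of the cited work):

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.svm import SVR
from sklearn.ensemble import RandomForestRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import r2_score

# Hypothetical data: 3 input features, one target with a nonlinear component.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(200, 3))
y = 2.0 * X[:, 0] + np.sin(4.0 * X[:, 1]) + 0.1 * rng.standard_normal(200)

models = {
    "MLR": LinearRegression(),
    "SVR": SVR(),
    "RF": RandomForestRegressor(n_estimators=100, random_state=0),
    "ANN": MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0),
}
# Training-set R^2 for each model (a real study would use held-out data).
scores = {name: r2_score(y, m.fit(X, y).predict(X)) for name, m in models.items()}
```

In practice one would evaluate on a held-out test set or by cross-validation rather than on the training data, but the pattern of fitting several regressors to the same feature matrix is the same.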

Correlated errors of experimental data are a common but often neglected problem in physical sciences. Various tools are provided here for thorough propagation of uncertainties in cases of correlated errors. Discussed are techniques especially applicable to ...

We present a framework for performing regression when both covariate and response are probability distributions on a compact and convex subset of $\R^d$. Our regression model is based on the theory of optimal transport and links the conditional Fréchet m ...