Adaptive Global Testing for Functional Linear Models, 2013
Abstract

This paper studies global testing of the slope function in functional linear regression models. A major challenge in functional global testing is to choose the dimension of projection when approximating the functional regression model by a finite-dimensional multivariate linear regression model. We develop a new method that simultaneously tests the slope vectors in a sequence of functional principal components regression models. The sequence of models being tested is determined by the sample size and is an integral part of the testing procedure. Our theoretical analysis shows that the proposed method is uniformly powerful over a class of smooth alternatives when the signal-to-noise ratio exceeds the detection boundary. The methods and results reflect the deep connection between the functional linear regression model and the Gaussian sequence model. We also present an extensive simulation study and a real data example to illustrate the finite sample performance of our method.
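The kind of procedure this abstract describes can be sketched roughly as follows. This Python snippet (NumPy/SciPy) fits a sequence of functional principal components regression models on discretized curves and combines Wald-type chi-square p-values across the sequence; the function names, the dimension sequence, the choice of statistic, and the Bonferroni combination are all simplifying assumptions for illustration, not the paper's exact construction.

```python
import numpy as np
from scipy import stats

def fpca_scores(X, kmax):
    """Scores of discretized curves on the leading empirical eigenvectors."""
    Xc = X - X.mean(axis=0)                 # center the sample of curves
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:kmax].T                 # n x kmax matrix of PC scores

def sequential_global_test(X, y, dims=(1, 2, 4, 8)):
    """Illustrative max-type test of H0: zero slope, combined over a
    sequence of FPCA regression models of increasing dimension."""
    n = len(y)
    yc = y - y.mean()
    scores = fpca_scores(X, max(dims))
    pvals = []
    for k in dims:
        Z = scores[:, :k]
        beta, *_ = np.linalg.lstsq(Z, yc, rcond=None)
        resid = yc - Z @ beta
        sigma2 = resid @ resid / (n - k)
        # Wald-type statistic for H0: slope vector = 0 in the k-dim model
        stat = beta @ (Z.T @ Z) @ beta / sigma2
        pvals.append(stats.chi2.sf(stat, df=k))
    return min(1.0, len(dims) * min(pvals))  # Bonferroni across the sequence
```

Letting the candidate dimension grow with the sample size, as the abstract describes, is what allows a single procedure to adapt to alternatives of unknown smoothness.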
Classical Testing in Functional Linear Models
Abstract

We extend four tests common in classical regression (Wald, score, likelihood ratio, and F tests) to functional linear regression, for testing the null hypothesis that there is no association between a scalar response and a functional covariate. Using functional principal component analysis, we re-express the functional linear model as a standard linear model, where the effect of the functional covariate can be approximated by a finite linear combination of the functional principal component scores. In this setting, we consider application of the four traditional tests. The proposed testing procedures are investigated theoretically when the number of principal components diverges, and for both densely and sparsely observed functional covariates. Using the theoretical distribution of the tests under the alternative hypothesis, we develop a procedure for sample size calculation in the context of functional linear regression. The four tests are further compared numerically in simulation experiments and using two real data applications.
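A minimal sketch of the re-expression step and two of the four tests (F and likelihood ratio), assuming densely observed curves stored as rows of a matrix; the truncation level k, the asymptotic reference distributions, and all names are illustrative choices, not the paper's exact procedure.

```python
import numpy as np
from scipy import stats

def classical_tests_fpca(X, y, k=3):
    """Illustrative F and likelihood-ratio tests of no association between
    a scalar response y and a functional covariate X, after projecting the
    curves onto k functional principal component scores."""
    n = len(y)
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    Z = Xc @ Vt[:k].T                      # n x k matrix of FPCA scores
    yc = y - y.mean()
    rss0 = yc @ yc                         # null model: intercept only
    beta, *_ = np.linalg.lstsq(Z, yc, rcond=None)
    rss1 = np.sum((yc - Z @ beta) ** 2)    # alternative: k-dimensional model
    f_stat = ((rss0 - rss1) / k) / (rss1 / (n - k - 1))
    lr_stat = n * np.log(rss0 / rss1)      # Gaussian likelihood ratio
    return {"F": stats.f.sf(f_stat, k, n - k - 1),
            "LR": stats.chi2.sf(lr_stat, df=k)}
```

With a fixed k this is just classical multivariate testing; the abstract's theoretical contribution lies in letting k diverge and handling sparsely observed curves, which the sketch does not attempt.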
Nonasymptotic Adaptive Prediction in Functional Linear Models, 2013
Abstract

Functional linear regression has recently attracted considerable interest. Many works focus on asymptotic inference. In this paper we consider, in a nonasymptotic framework, a simple estimation procedure based on functional principal components regression. It consists in minimizing a least-squares contrast, coupled with a classical projection onto the space spanned by the first m empirical eigenvectors of the covariance operator of the functional sample. The novelty of our approach is to select the crucial dimension m automatically, by minimizing a penalized least-squares contrast. Our method is based on model selection tools. However, since such methods usually consist in projecting onto known, nonrandom spaces, we need to adapt them to an empirical eigenbasis made of data-dependent (hence random) vectors. The resulting estimator is fully adaptive and is shown to satisfy an oracle inequality for the risk associated with the prediction error, and to attain optimal minimax rates of convergence over a certain class of ellipsoids. Our model selection strategy is finally compared numerically with cross-validation.
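The dimension-selection step the abstract describes can be sketched as follows, again in Python/NumPy. The Mallows-Cp-style penalty, the constant kappa, and the pilot variance estimate from the largest candidate model are assumptions for illustration; the paper derives its penalty from model selection theory adapted to the random empirical eigenbasis.

```python
import numpy as np

def select_dimension(X, y, mmax=20, kappa=6.0):
    """Illustrative penalized least-squares choice of the FPCA dimension m:
    minimize RSS(m)/n + kappa * sigma2 * m / n over m = 1, ..., mmax."""
    n = len(y)
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:mmax].T            # scores on the first mmax eigenvectors
    yc = y - y.mean()
    # pilot noise-variance estimate from the largest candidate model
    beta, *_ = np.linalg.lstsq(scores, yc, rcond=None)
    sigma2 = np.sum((yc - scores @ beta) ** 2) / (n - mmax)
    crit = []
    for m in range(1, mmax + 1):
        Z = scores[:, :m]
        b, *_ = np.linalg.lstsq(Z, yc, rcond=None)
        rss = np.sum((yc - Z @ b) ** 2)
        crit.append(rss / n + kappa * sigma2 * m / n)
    return int(np.argmin(crit)) + 1      # candidate dimensions are 1-indexed
```

The subtlety the abstract points to is that the projection spaces here are spanned by data-dependent eigenvectors, so standard model-selection guarantees for fixed design spaces do not apply directly.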