Results 1–10 of 36
Prediction in functional linear regression, 2006
Cited by 72 (5 self)
There has been substantial recent work on methods for estimating the slope function in linear regression for functional data analysis. However, as in the case of more conventional finite-dimensional regression, much of the practical interest in the slope centers on its application for the purpose of prediction, rather than on its significance in its own right. We show that the problems of slope-function estimation, and of prediction from an estimator of the slope function, have very different characteristics. While the former is intrinsically nonparametric, the latter can be either nonparametric or semiparametric. In particular, the optimal mean-square convergence rate of predictors is n⁻¹, where n denotes sample size, if the predictand is a sufficiently smooth function. In other cases, convergence occurs at a polynomial rate that is strictly slower than n⁻¹. At the boundary between these two regimes, the mean-square convergence rate is less than n⁻¹ by only a logarithmic factor. More generally, the rate of convergence of the predicted value of the mean response in the regression model, given a particular value of the explanatory variable, is determined by a subtle interaction among the smoothness of the predictand, of the slope function in the model, and of the autocovariance function for the distribution of explanatory variables.
Functional additive models, J. Am. Stat. Assoc.
Cited by 29 (8 self)
In commonly used functional regression models, the regression of a scalar or functional response on the functional predictor is assumed to be linear. This means the response is a linear function of the functional principal component scores of the predictor process. We relax the linearity assumption and propose to replace it with an additive structure. This leads to a more widely applicable and much more flexible framework for functional regression models. The proposed functional additive regression models are suitable for both scalar and functional responses. The regularization needed for effective estimation of the regression parameter function is implemented through a projection on the eigenbasis of the covariance operator of the functional components in the model. The utilization of functional principal components in an additive rather than linear way leads to a substantial broadening of the scope of functional regression models and emerges as a natural approach, since the uncorrelatedness of the functional principal components is shown to lead to a straightforward implementation of the functional additive model, based on a sequence of one-dimensional smoothing steps and without the need for backfitting. This facilitates the theoretical analysis, and we establish asymptotic …
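The one-dimensional-smoothing recipe in this abstract can be sketched numerically. The following is a minimal NumPy illustration, not the authors' implementation: FPC scores are computed from the sample covariance of discretized curves, and each additive component is fit by an independent Nadaraya-Watson smoother on one score. The simulated data, bandwidth, and function names are all arbitrary choices made here for the sketch.

```python
import numpy as np

def fpc_scores(X, n_comp):
    """Scores of discretized curves X (n x p) on the leading right singular
    vectors of the centered data, i.e. estimated functional PCs."""
    Xc = X - X.mean(axis=0)
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ vt[:n_comp].T                       # (n, n_comp)

def fam_predict(scores, y, new_scores, bw=2.0):
    """Additive fit y ~ mean(y) + sum_k f_k(score_k), each f_k an independent
    1-D Gaussian-kernel smoother -- no backfitting, exploiting that FPC
    scores are uncorrelated."""
    pred = np.full(len(new_scores), y.mean())
    for k in range(scores.shape[1]):
        d = new_scores[:, k, None] - scores[None, :, k]
        w = np.exp(-0.5 * (d / bw) ** 2)
        pred += (w @ (y - y.mean())) / w.sum(axis=1)
    return pred

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 50)
a = rng.normal(size=(200, 2)) * np.array([1.5, 0.8])   # two true component scores
X = a[:, [0]] * np.sin(2 * np.pi * t) + a[:, [1]] * np.cos(2 * np.pi * t)
y = a[:, 0] ** 2 + np.sin(a[:, 1]) + 0.1 * rng.normal(size=200)  # additive, nonlinear

S = fpc_scores(X, 2)
fitted = fam_predict(S, y, S)
```

Because the scores are uncorrelated, each marginal smoother directly estimates its additive component, which is the paper's central computational point.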
A reproducing kernel Hilbert space approach to functional linear regression, Annals of Statistics, 2010
Cited by 14 (0 self)
We study in this paper a smoothness regularization method for functional linear regression and provide a unified treatment of both the prediction and estimation problems. By developing a tool for the simultaneous diagonalization of two positive definite kernels, we obtain sharper results on the minimax rates of convergence and show that smoothness-regularized estimators achieve the optimal rates of convergence for both prediction and estimation under conditions weaker than those for the functional principal components based methods developed in the literature. Despite the generality of the method of regularization, we show that the procedure is easily implementable. Numerical results are presented to illustrate the merits of the method and to demonstrate the theoretical developments.
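As a rough numerical illustration of smoothness-regularized estimation of the slope function (a discretized roughness-penalty sketch on simulated data, not the paper's RKHS machinery): the functional linear model is discretized on a grid, and the slope is estimated by penalized least squares with a squared-second-difference penalty.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 100)
dt = t[1] - t[0]
X = np.cumsum(rng.normal(scale=0.15, size=(250, len(t))), axis=1)  # rough predictor curves
beta_true = np.sin(2 * np.pi * t)                                  # slope function
y = X @ beta_true * dt + 0.02 * rng.normal(size=250)               # y_i = <X_i, beta> + noise

# penalized least squares: ||y - A beta||^2 + lam * ||beta''||^2, discretized
A = dt * X                                          # Riemann-sum design matrix
D = np.diff(np.eye(len(t)), n=2, axis=0) / dt**2    # second-difference operator
lam = 1e-7
beta_hat = np.linalg.solve(A.T @ A + lam * D.T @ D, A.T @ y)
```

The penalty stabilizes the otherwise ill-posed inversion: directions of the predictor process with little variance are filled in by the smoothness prior rather than amplified noise.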
A Simultaneous Confidence Band for Sparse Longitudinal Regression
doi:http://dx.doi.org/10.5705/ss.2010.034
Cited by 9 (3 self)
Abstract: Functional data analysis has received considerable recent attention and a number of successful applications have been reported. In this paper, asymptotically simultaneous confidence bands are obtained for the mean function of the functional regression model, using piecewise constant spline estimation. Simulation experiments corroborate the asymptotic theory. The confidence band procedure is illustrated by analyzing CD4 cell counts of HIV-infected patients. Key words and phrases: B-spline, confidence band, functional data, Karhunen-Loève L² representation, knots, longitudinal data, strong approximation.
Bivariate splines for spatial functional regression models, J. Nonparametr. Stat., 2010
Cited by 7 (5 self)
We consider the functional linear regression model where the explanatory variable is a random surface and the response is a real random variable, with bounded or normal noise. Bivariate splines over triangulations represent the random surfaces. We use this representation to construct least squares estimators of the regression function, with or without a penalization term. Under the assumptions that the regressors in the sample are bounded and span a large enough space of functions, the approximation properties of bivariate splines yield the consistency of the estimators. Simulations demonstrate the quality of the asymptotic properties on a realistic domain. We also carry out an application to ozone concentration forecasting over the US that illustrates the predictive skill of the method.
Deciding the Dimension of Effective Dimension Reduction Space for Functional and High-Dimensional Data, The Annals of Statistics, 2010
Cited by 5 (1 self)
In this paper, we consider regression models with a Hilbert-space-valued predictor and a scalar response, where the response depends on the predictor only through a finite number of projections. The linear subspace spanned by these projections is called the effective dimension reduction (EDR) space. To determine the dimensionality of the EDR space, we focus on the leading principal component scores of the predictor, and propose two sequential χ² testing procedures under the assumption that the predictor has an elliptically contoured distribution. We further extend these procedures and introduce a test that simultaneously takes into account a large number of principal component scores. The proposed procedures are supported by theory, validated by simulation studies, and illustrated by a real-data example. Our methods and theory are applicable to functional data and high-dimensional multivariate data.
Continuously additive models for nonlinear functional regression, Biometrika, 2012
Cited by 5 (0 self)
We introduce continuously additive models, which can be motivated as extensions of additive regression models with vector predictors to the case of infinite-dimensional predictors. This approach provides a class of flexible functional nonlinear regression models, where random predictor curves are coupled with scalar responses. In continuously additive modeling, integrals taken over a smooth surface along the graphs of the predictor functions relate the predictors to the responses in a nonlinear fashion. We use tensor product basis expansions to fit the smooth regression surface that characterizes the model. In a theoretical investigation, we show that the predictions obtained from fitting continuously additive models are consistent and asymptotically normal. We also consider extensions to generalized responses. The proposed approach outperforms existing functional regression models in simulations and data illustrations.
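The integral-of-a-surface idea can be mimicked in a few lines. Below is a hedged NumPy sketch, not the paper's estimator: a plain polynomial tensor basis and simulated data stand in for the tensor-product spline surface. Each feature is a Riemann-sum approximation of ∫ t^j X(t)^k dt along the graph of a predictor curve, and the response is then regressed linearly on these features.

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0, 1, 60)
n = 300
X = np.cumsum(rng.normal(scale=0.1, size=(n, len(t))), axis=1)       # predictor curves
y = np.trapz(t * np.sin(X), t, axis=1) + 0.02 * rng.normal(size=n)   # true surface g(t,x) = t*sin(x)

def surface_features(X, t, deg_t=3, deg_x=3):
    """Features Z_{jk} = int t^j X(t)^k dt: a polynomial tensor-product
    basis for the regression surface g(t, x)."""
    cols = [np.ones(len(X))]                  # intercept
    for j in range(deg_t + 1):
        for k in range(1, deg_x + 1):         # k = 0 terms collapse into the intercept
            cols.append(np.trapz(t ** j * X ** k, t, axis=1))
    return np.column_stack(cols)

Z = surface_features(X, t)
lam = 1e-6                                    # small ridge term for numerical stability
coef = np.linalg.solve(Z.T @ Z + lam * np.eye(Z.shape[1]), Z.T @ y)
fitted = Z @ coef
```

Fitting the surface reduces to ordinary linear regression on integral features, which is why the approach scales to arbitrary smooth nonlinear dependence on the curve values.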
Conditional Quantile Analysis When Covariates Are Functions, With Application to Growth Data, 2011
Cited by 4 (1 self)
Summary: Motivated by the conditional growth charts problem, we develop a method for conditional quantile analysis when predictors take values in a functional space. The proposed method aims at estimating conditional distribution functions under a generalized functional regression framework. This approach facilitates balancing model flexibility against the curse of dimensionality for the infinite-dimensional functional predictors. Its good performance in comparison with other methods, both for sparsely and densely observed functional covariates, is demonstrated through theory as well as in simulations and an application to growth curves, where the proposed method can, for example, be used to assess the entire growth pattern of a child by relating it to the predicted quantiles of adult height.
Local linear regression for functional predictor and scalar response, Statistics and Econometric Series 15, 2007
Cited by 4 (0 self)
The aim of this work is to introduce a new nonparametric regression technique in the context of a functional covariate and a scalar response. We propose a local linear regression estimator and study its asymptotic behaviour. Its finite-sample performance is compared with that of a Nadaraya-Watson type kernel regression estimator via a Monte Carlo study and the analysis of two real data sets. In all the scenarios considered, the local linear estimator performs better than the kernel estimator, in the sense that the mean squared prediction error and its standard deviation are lower.
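The two estimators being compared can be sketched as follows. This is a simplified stand-in, not the authors' estimator: the Nadaraya-Watson smoother kernel-weights responses by L² distance between curves, while the "local linear" variant here performs a kernel-weighted straight-line fit of the response on a one-dimensional projection of the curve difference (the projection direction, bandwidth, and simulated data are assumptions of this sketch).

```python
import numpy as np

def l2_dist(X, x0, t):
    """L2 distance between each curve in X and the curve x0."""
    return np.sqrt(np.trapz((X - x0) ** 2, t, axis=1))

def fnw(X, y, x0, t, h):
    """Functional Nadaraya-Watson: Gaussian-kernel weights in L2 distance."""
    w = np.exp(-0.5 * (l2_dist(X, x0, t) / h) ** 2)
    return float(w @ y / w.sum())

def floclin(X, y, x0, t, h, psi):
    """Simplified local-linear variant: weighted line fit of y on the scalar
    projection <X - x0, psi>; the intercept is the estimate at x0."""
    u = np.trapz((X - x0) * psi, t, axis=1)          # scalar local covariate
    w = np.exp(-0.5 * (l2_dist(X, x0, t) / h) ** 2)
    A = np.column_stack([np.ones_like(u), u])
    sw = np.sqrt(w)
    beta = np.linalg.lstsq(A * sw[:, None], y * sw, rcond=None)[0]
    return float(beta[0])

rng = np.random.default_rng(3)
t = np.linspace(0, 1, 40)
a = rng.normal(size=(400, 1))
X = a * np.sin(np.pi * t)                            # one-factor random curves
y = a[:, 0] + 0.5 * a[:, 0] ** 2 + 0.05 * rng.normal(size=400)

Xc = X - X.mean(axis=0)                              # top PC direction as psi
psi = np.linalg.eigh(Xc.T @ Xc / len(X))[1][:, -1]

x0 = 0.8 * np.sin(np.pi * t)                         # true m(x0) = 0.8 + 0.5*0.64 = 1.12
m_nw = fnw(X, y, x0, t, h=0.3)
m_ll = floclin(X, y, x0, t, h=0.3, psi=psi)
```

The local-linear fit removes the design-density bias that the kernel estimator suffers at points where the predictor distribution is asymmetric, which is the usual explanation for its better prediction error.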
Using basis expansions for estimating functional PLS regression: applications with chemometric data, 2010
Cited by 4 (0 self)
There are many chemometric applications, such as spectroscopy, where the objective is to explain a scalar response from a functional variable (the spectrum) whose observations are functions of wavelength rather than vectors. In this paper, PLS regression is considered for estimating the linear model when the predictor is a functional random variable. Due to the infinite dimension of the space to which the predictor observations belong, they are usually approximated by curves/functions within a finite-dimensional space spanned by a basis of functions. We show that PLS regression with a functional predictor is equivalent to finite multivariate PLS regression using the basis expansion coefficients as predictors, in the sense that, at each step of the PLS iteration, the same prediction is obtained. In addition, from the linear model estimated using the basis coefficients, we derive the expression of the PLS estimate of the regression coefficient function for the model with a functional predictor. The results provided by this functional PLS approach are compared with those given by functional PCR and by discrete PLS and PCR using different sets of simulated and spectrometric data.
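The coefficient-based route can be sketched in NumPy. This is a hedged illustration under simulated "spectra", a polynomial basis, and a single NIPALS-style component (all arbitrary choices of this sketch, not the paper's setup): curves are replaced by their basis coefficients, the coefficient matrix is re-weighted by a symmetric square root of the basis Gram matrix so that Euclidean inner products match L² inner products, and ordinary multivariate PLS runs on the result.

```python
import numpy as np

rng = np.random.default_rng(4)
t = np.linspace(0, 1, 80)
n = 150
shapes = np.array([np.sin(np.pi * t), np.sin(2 * np.pi * t), np.sin(3 * np.pi * t)])
C = rng.normal(size=(n, 3))
X = C @ shapes                                        # simulated "spectra"
beta_true = np.cos(np.pi * t)
y = np.trapz(X * beta_true, t, axis=1) + 0.01 * rng.normal(size=n)

# basis expansion: least-squares coefficients of each curve in a polynomial basis
q = 8
B = np.vander(t, q, increasing=True)                  # basis evaluated on the grid
A = np.linalg.lstsq(B, X.T, rcond=None)[0].T          # (n, q) coefficient matrix

# Gram matrix of basis inner products and a symmetric square root
G = np.array([[np.trapz(B[:, i] * B[:, j], t) for j in range(q)] for i in range(q)])
vals, vecs = np.linalg.eigh(G)
G_half = vecs @ np.diag(np.sqrt(np.clip(vals, 0, None))) @ vecs.T

# functional PLS step = multivariate PLS on Z = A G^{1/2} (one component)
Z = (A - A.mean(axis=0)) @ G_half
yc = y - y.mean()
w = Z.T @ yc
w /= np.linalg.norm(w)                                # PLS weight vector
s = Z @ w                                             # first PLS score
fitted = y.mean() + s * ((s @ yc) / (s @ s))
```

The Gram-matrix re-weighting is what makes the multivariate computation agree with the functional inner product; without it, PLS on raw coefficients would implicitly use the wrong metric.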