Results 11–20 of 105
Estimation of Functional Derivatives
, 2009
"... Situations of a functional predictor paired with a scalar response are increasingly encountered in data analysis. Predictors are often appropriately modeled as square integrable smooth random functions. Imposing minimal assumptions on the nature of the functional relationship, we aim to estimate the ..."
Abstract

Cited by 9 (5 self)
 Add to MetaCart
Situations of a functional predictor paired with a scalar response are increasingly encountered in data analysis. Predictors are often appropriately modeled as square-integrable smooth random functions. Imposing minimal assumptions on the nature of the functional relationship, we aim to estimate the directional derivatives and gradients of the response with respect to the predictor functions. In statistical applications and data analysis, functional derivatives provide a quantitative measure of the often intricate relationship between changes in predictor trajectories and those in scalar responses. This approach provides a natural extension of classical gradient fields in vector space and yields directions of steepest descent. We suggest a kernel-based method for the nonparametric estimation of functional derivatives that utilizes the decomposition of the random predictor functions into their eigenfunctions. These eigenfunctions define a canonical set of directions into which the gradient field is expanded. The proposed method is shown to lead to asymptotically consistent estimates of functional derivatives and is illustrated in an application to growth curves.
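As a rough illustration of the eigenfunction expansion described in this abstract, the sketch below simulates curves with two known modes, estimates eigenfunctions from the sample covariance, and expands a linearized gradient in that basis. Plain least squares on the FPC scores stands in for the paper's kernel-based nonparametric estimator, and all data and dimensions are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 200, 50                       # curves, grid points
t = np.linspace(0, 1, m)
dt = t[1] - t[0]

# Two known orthonormal modes generate the smooth random predictor curves.
phi1 = np.sqrt(2.0) * np.sin(np.pi * t)
phi2 = np.sqrt(2.0) * np.sin(2.0 * np.pi * t)
xi = rng.normal(size=(n, 2)) * np.array([2.0, 1.0])   # FPC scores
X = xi @ np.vstack([phi1, phi2])                      # n x m predictor curves
Y = 3.0 * xi[:, 0] - 1.5 * xi[:, 1] + 0.1 * rng.normal(size=n)

# Eigenfunctions of the sample covariance define the canonical directions.
Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / n
evals, evecs = np.linalg.eigh(cov)
top = evecs[:, np.argsort(evals)[::-1][:2]].T / np.sqrt(dt)  # L2-normalized

# Scores are L2 inner products of centered curves with the eigenfunctions.
scores = Xc @ top.T * dt

# Linearized sketch: regress Y on the scores; the gradient field is the
# fitted coefficient vector expanded back in the eigenfunction basis.
beta, *_ = np.linalg.lstsq(scores, Y - Y.mean(), rcond=None)
gradient = beta @ top                                 # estimated gradient function
```

On this simulated design the recovered gradient function tracks the true linear gradient 3·φ₁ − 1.5·φ₂ up to estimation noise; the paper's actual estimator replaces the linear fit with a kernel smoother.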
Theoretical properties of projection-based multilayer perceptrons with functional inputs
 Neural Processing Letters 23(1) (2006) 55–70
, 2006
"... Many real world data are sampled functions. As shown by Functional Data Analysis (FDA) methods, spectra, time series, images, gesture recognition data, etc. can be processed more efficiently if their functional nature is taken into account during the data analysis process. This is done by extending ..."
Abstract

Cited by 8 (2 self)
 Add to MetaCart
Many real-world data are sampled functions. As shown by Functional Data Analysis (FDA) methods, spectra, time series, images, gesture recognition data, etc. can be processed more efficiently if their functional nature is taken into account during the data analysis process. This is done by extending standard data analysis methods so that they can apply to functional inputs. A general way to achieve this goal is to compute projections of the functional data onto a finite-dimensional subspace of the functional space. The coordinates of the data on a basis of this subspace provide standard vector representations of the functions. The obtained vectors can be processed by any standard method. In [43], this general approach has been used to define projection-based Multilayer Perceptrons (MLPs) with functional inputs. In this paper we study important theoretical properties of the proposed model. We show in particular that MLPs with functional inputs are universal approximators: they can approximate to arbitrary accuracy any continuous mapping from a compact subspace of a functional space to ℝ. Moreover, we provide a consistency result showing that any mapping from a functional space to ℝ can be learned from examples by a projection-based MLP: the generalization mean square error of the MLP decreases to the smallest possible mean square error on the data when the number of examples goes to infinity.
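The projection-then-MLP pipeline described here can be sketched end to end: project sampled functions onto a small basis, then train an ordinary MLP on the resulting coordinates. The cosine basis, network size, and target function below are illustrative assumptions, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, d, h = 300, 100, 4, 16        # samples, grid size, projection dim, hidden units
t = np.linspace(0, 1, m)

# Functional inputs built from a small cosine basis (a stand-in for the
# spline or wavelet bases used in practice).
basis = np.vstack([np.ones(m)]
                  + [np.sqrt(2.0) * np.cos(k * np.pi * t) for k in range(1, d)])
coef = rng.normal(size=(n, d))
X = coef @ basis                                     # sampled functions, n x m
Y = np.tanh(coef[:, 0]) + 0.5 * coef[:, 1] ** 2      # nonlinear functional target

# Projection step: coordinates of each sampled function on the basis.
P = np.linalg.lstsq(basis.T, X.T, rcond=None)[0].T   # n x d vector representation

# A tiny one-hidden-layer MLP trained by full-batch gradient descent.
W1 = 0.5 * rng.normal(size=(d, h)); b1 = np.zeros(h)
W2 = 0.1 * rng.normal(size=h); b2 = 0.0
lr, losses = 0.02, []
for _ in range(3000):
    Hid = np.tanh(P @ W1 + b1)
    out = Hid @ W2 + b2
    err = out - Y
    losses.append(float(np.mean(err ** 2)))
    g = 2.0 * err / n                                # dLoss/dout
    dW2, db2 = Hid.T @ g, g.sum()
    dHid = np.outer(g, W2) * (1.0 - Hid ** 2)
    dW1, db1 = P.T @ dHid, dHid.sum(axis=0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
```

The universal-approximation result in the paper says that, with enough hidden units and a rich enough projection subspace, this architecture can drive the error arbitrarily low; the sketch only demonstrates that the loss decreases on one simulated problem.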
Penalized Partial Least Squares with Applications to B-Spline Transformations and Functional Data
, 2008
"... We propose a novel framework that combines penalization techniques with Partial Least Squares (PLS). We focus on two important applications. (1) We combine PLS with a roughness penalty to estimate highdimensional regression problems with functional predictors and scalar response. (2) Starting with ..."
Abstract

Cited by 8 (2 self)
 Add to MetaCart
We propose a novel framework that combines penalization techniques with Partial Least Squares (PLS). We focus on two important applications. (1) We combine PLS with a roughness penalty to estimate high-dimensional regression problems with functional predictors and scalar response. (2) Starting with an additive model, we expand each variable in terms of a generous number of B-spline basis functions. To prevent overfitting, we estimate the model by applying a penalized version of PLS. We gain additional model flexibility by incorporating a sparsity penalty. Both applications can be formulated in terms of a unified algorithm called Penalized Partial Least Squares, which can be computed virtually as fast as PLS using the kernel trick. Furthermore, we prove a close connection of penalized PLS to preconditioned linear systems. In experiments, we show the benefits of our method on noisy functional data and on sparse nonlinear regression models.
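A minimal sketch of application (1), assuming the standard closed form for the first penalized-PLS direction (proportional to M⁻¹X'Y, where M combines the identity with a roughness penalty): rough simulated trajectories, a second-difference penalty, and a single latent component. The data, penalty weight, and one-component fit are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(2)
n, m, lam = 150, 60, 10.0
t = np.linspace(0, 1, m)
X = np.cumsum(rng.normal(size=(n, m)), axis=1)       # rough random trajectories
X -= X.mean(axis=0)
beta_true = np.sin(2.0 * np.pi * t)
Y = X @ beta_true / m + 0.1 * rng.normal(size=n)
Y -= Y.mean()

# Second-difference roughness penalty P = D'D, as in penalized smoothing.
D = np.diff(np.eye(m), n=2, axis=0)
M = np.eye(m) + lam * (D.T @ D)

# First penalized-PLS direction: w maximizes covariance with Y subject to a
# roughness constraint, giving w proportional to M^{-1} X'Y.
w = np.linalg.solve(M, X.T @ Y)
w /= np.linalg.norm(w)
score = X @ w                                        # first latent component
coef = (score @ Y) / (score @ score)                 # regress Y on the component
Yhat = coef * score
```

Further components would be obtained by deflating X and repeating; the roughness penalty is what makes the extracted direction smooth rather than noise-driven.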
Quantile regression when the covariates are functions. Preprint.
, 2005
"... This article deals with a linear model of regression on quantiles when the explanatory variable takes values in some functional space and the response is scalar. We propose a spline estimator of the functional coefficient that minimizes a penalized L1 type criterion. Then, we study the asymptotic be ..."
Abstract

Cited by 7 (0 self)
 Add to MetaCart
(Show Context)
This article deals with a linear model of regression on quantiles when the explanatory variable takes values in some functional space and the response is scalar. We propose a spline estimator of the functional coefficient that minimizes a penalized L1-type criterion. We then study the asymptotic behavior of this estimator. The penalization is of primary importance for establishing the existence and convergence of the estimator.
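The ingredients of such a model can be sketched with heavy simplifications: a polynomial basis stands in for the paper's splines, an L2 penalty stands in for its L1-type criterion, and plain subgradient descent on the pinball (quantile) loss stands in for the actual optimization. Everything below is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(3)
n, m, d, tau = 200, 40, 5, 0.5                       # tau = quantile level (median)
t = np.linspace(0, 1, m)
B = np.vstack([t ** k for k in range(d)])            # polynomial basis (spline stand-in)
X = rng.normal(size=(n, m))
alpha_true = np.array([3.0, -6.0, 1.5, 0.0, 0.0])
Y = X @ (alpha_true @ B) / m + 0.3 * rng.normal(size=n)

Z = X @ B.T / m                                      # n x d: inner products with basis
lam, step = 0.01, 0.5
alpha = np.zeros(d)

def pinball(r, tau=tau):
    """Quantile (pinball) loss averaged over residuals."""
    return float(np.mean(np.maximum(tau * r, (tau - 1.0) * r)))

# Subgradient descent on pinball loss + L2 penalty (an L2 stand-in for the
# paper's L1-type penalized criterion).
for _ in range(2000):
    r = Y - Z @ alpha
    psi = np.where(r > 0, tau, tau - 1.0)            # pinball subgradient in r
    alpha -= step * (-Z.T @ psi / n + 2.0 * lam * alpha)
```

Setting tau to, say, 0.9 would estimate the conditional 0.9-quantile instead of the median; the fitted coefficient function is alpha @ B.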
Bivariate splines for spatial functional regression models
 J. Nonparametr. Stat
, 2010
"... We consider the functional linear regression model where the explanatory variable is a random surface and the response is a real random variable, with bounded or normal noise. Bivariate splines over triangulations represent the random surfaces. We use this representation to construct least square ..."
Abstract

Cited by 7 (5 self)
 Add to MetaCart
(Show Context)
We consider the functional linear regression model where the explanatory variable is a random surface and the response is a real random variable, with bounded or normal noise. Bivariate splines over triangulations represent the random surfaces. We use this representation to construct least squares estimators of the regression function, with or without a penalization term. Under the assumptions that the regressors in the sample are bounded and span a large enough space of functions, the approximation properties of bivariate splines yield the consistency of the estimators. Simulations demonstrate the quality of the asymptotic properties on a realistic domain. We also carry out an application to ozone concentration forecasting over the US that illustrates the predictive skill of the method.
Functional Additive Regression
, 2011
"... We suggest a new method, called “Functional Additive Regression”, or FAR, for efficiently performing high dimensional functional regression. FAR extends the usual linear regression model involving a functional predictor, X(t), and a scalar response, Y, in two key respects. First, FAR uses a penalize ..."
Abstract

Cited by 6 (0 self)
 Add to MetaCart
We suggest a new method, called “Functional Additive Regression”, or FAR, for efficiently performing high-dimensional functional regression. FAR extends the usual linear regression model involving a functional predictor, X(t), and a scalar response, Y, in two key respects. First, FAR uses a penalized least squares optimization approach to efficiently deal with high-dimensional problems involving a large number of functional predictors. Second, FAR extends beyond the standard linear regression setting to fit general nonlinear additive models. We demonstrate that FAR can be implemented with a wide range of penalty functions using a highly efficient coordinate descent algorithm. Theoretical results are developed which provide motivation for the FAR optimization criterion. Finally, we show through simulations and two real data sets that FAR can significantly outperform competing methods.
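The coordinate-descent-with-penalty idea can be sketched in a simplified linear form: each functional predictor is assumed already projected onto a few basis coefficients, and a group-lasso penalty (one of many penalties FAR could accommodate, not necessarily the paper's) zeroes out entire predictors at once. All data and dimensions are simulated:

```python
import numpy as np

rng = np.random.default_rng(5)
n, p, d, lam = 200, 6, 3, 0.5            # samples, predictors, basis coeffs, penalty

# Each functional predictor is assumed already projected onto d basis
# coefficients; blocks are orthonormalized so the update has a closed form.
Z = [np.sqrt(n) * np.linalg.qr(rng.normal(size=(n, d)))[0] for _ in range(p)]
b_true = [np.zeros(d) for _ in range(p)]
b_true[0] = np.array([2.0, -1.0, 0.5])               # only predictors 0 and 2 matter
b_true[2] = np.array([0.0, 1.5, 0.0])
Y = sum(Zj @ bj for Zj, bj in zip(Z, b_true)) + 0.1 * rng.normal(size=n)

# Blockwise coordinate descent with a group-lasso penalty: whole functional
# predictors are zeroed out, echoing FAR's sparsity across many predictors.
b = [np.zeros(d) for _ in range(p)]
for _ in range(50):
    for j in range(p):
        r = Y - sum(Z[k] @ b[k] for k in range(p) if k != j)
        s = Z[j].T @ r / n                           # blockwise least squares fit
        nrm = np.linalg.norm(s)
        b[j] = max(0.0, 1.0 - lam / nrm) * s if nrm > 0 else np.zeros(d)

active = [j for j in range(p) if np.linalg.norm(b[j]) > 1e-8]
```

Each block update is an exact minimization, so the sweep converges quickly; FAR's nonlinear additive extension would replace the linear block fit with a smooth component fit.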
Asymptotic equivalence of functional linear regression and a white noise inverse problem
 Ann. Statist
"... ar ..."
Testing for No Effect in Functional Linear Regression Models, Some Computational Approaches
"... The functional linear regression model is a regression model where the link between the response (a scalar) and the predictor (a random function) is expressed as an inner product between a functional coefficient and the predictor. Our aim is to test at first for no effect of the model, i.e. the n ..."
Abstract

Cited by 6 (0 self)
 Add to MetaCart
The functional linear regression model is a regression model where the link between the response (a scalar) and the predictor (a random function) is expressed as an inner product between a functional coefficient and the predictor. Our aim is first to test for no effect in the model, i.e. the nullity of the functional coefficient. A fully automatic permutation test based on the cross-covariance operator of the predictor and the response is proposed.
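A permutation test of this kind is easy to sketch: use the L2 norm of the empirical cross-covariance function as the test statistic and compare it against its permutation distribution. The statistic, data, and number of permutations below are illustrative assumptions rather than the paper's exact construction:

```python
import numpy as np

rng = np.random.default_rng(6)
n, m = 100, 50
t = np.linspace(0, 1, m)
X = np.cumsum(rng.normal(size=(n, m)), axis=1)       # random predictor curves
Y = X @ np.sin(2.0 * np.pi * t) / m + 0.2 * rng.normal(size=n)   # effect present

def cross_cov_stat(X, Y):
    """L2 norm of the empirical cross-covariance function of X and Y."""
    Xc, Yc = X - X.mean(axis=0), Y - Y.mean()
    return float(np.linalg.norm(Xc.T @ Yc / len(Y)))

T0 = cross_cov_stat(X, Y)
perm = [cross_cov_stat(X, rng.permutation(Y)) for _ in range(500)]
pval = (1 + sum(Tp >= T0 for Tp in perm)) / (len(perm) + 1)
```

Permuting Y breaks any dependence on X while preserving both marginals, so the test needs no tuning parameters, which is what makes it "fully automatic".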
Deciding the Dimension of Effective Dimension Reduction Space for Functional and High-Dimensional Data
 The Annals of Statistics
, 2010
"... In this paper, we consider regression models with a Hilbertspacevalued predictor and a scalar response, where the response depends on the predictor only through a finite number of projections. The linear subspace spanned by these projections is called the effective dimension reduction (EDR) space ..."
Abstract

Cited by 5 (1 self)
 Add to MetaCart
In this paper, we consider regression models with a Hilbert-space-valued predictor and a scalar response, where the response depends on the predictor only through a finite number of projections. The linear subspace spanned by these projections is called the effective dimension reduction (EDR) space. To determine the dimensionality of the EDR space, we focus on the leading principal component scores of the predictor and propose two sequential χ2 testing procedures under the assumption that the predictor has an elliptically contoured distribution. We further extend these procedures and introduce a test that simultaneously takes into account a large number of principal component scores. The proposed procedures are supported by theory, validated by simulation studies, and illustrated by a real-data example. Our methods and theory are applicable to functional data and high-dimensional multivariate data.
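The setting can be sketched with sliced inverse regression on principal-component scores: slice on the response, average the scores within slices, and read the EDR dimension off the eigenvalues of the resulting matrix. An informal eigenvalue screen below stands in for the paper's calibrated sequential χ2 tests, and the two-dimensional model is a simulated assumption:

```python
import numpy as np

rng = np.random.default_rng(7)
n, p, H = 1000, 6, 10                    # samples, PC scores kept, slices
S = rng.normal(size=(n, p))              # standardized principal-component scores
Y = S[:, 0] + (S[:, 1] + 0.5) ** 2 + 0.2 * rng.normal(size=n)    # EDR dimension 2

# Sliced inverse regression on the scores: slice on Y, average S per slice,
# and accumulate the weighted covariance of the slice means.
order = np.argsort(Y)
M = np.zeros((p, p))
for idx in np.array_split(order, H):
    mu = S[idx].mean(axis=0)
    M += (len(idx) / n) * np.outer(mu, mu)
evals = np.sort(np.linalg.eigvalsh(M))[::-1]

# Informal eigenvalue screen (in place of the calibrated chi-squared tests):
# keep directions whose eigenvalue exceeds a small fraction of the total.
dim = int(np.sum(evals > 0.05 * evals.sum()))
```

The sequential χ2 procedures in the paper replace this ad hoc threshold with formal tests of whether the trailing eigenvalues are consistent with pure noise.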