Results 11–20 of 67
Nonparametric estimation in functional linear models with second order stationary regressors.
, 901
Rank tests and regression rank score tests in measurement error models
 COMPUTATIONAL STATISTICS AND DATA ANALYSIS
Recursive estimation of nonparametric regression with functional covariate.
, 2012
Abstract

Cited by 2 (2 self)
The main purpose of this work is to estimate the regression function of a real random variable with a functional explanatory variable by using a recursive nonparametric kernel approach. The mean square error and the almost sure convergence of a family of recursive kernel estimates of the regression function are derived. These results are established with rates and precise evaluation of the constant terms. A central limit theorem for this class of estimators is also established. The method is evaluated on simulations and on a real data set.
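The recursive scheme described above lends itself to a short sketch. Everything below that the abstract does not specify, namely the Gaussian kernel, the bandwidth schedule h_i = h1 * i**(-gamma), and the L2 distance between curves, is an illustrative assumption rather than the paper's construction:

```python
import numpy as np

class RecursiveKernelRegressor:
    """Recursive Nadaraya-Watson-type estimate of E[Y | X = x_eval] for a
    functional covariate X (a curve sampled on a fixed grid).

    Each new observation (X_i, Y_i) updates two running sums, so past data
    never need to be revisited.  The bandwidth schedule h_i = h1 * i**(-gamma)
    and the Gaussian kernel are illustrative choices, not the paper's.
    """

    def __init__(self, x_eval, dx, h1=1.0, gamma=0.25):
        self.x_eval = np.asarray(x_eval, dtype=float)
        self.dx = dx              # grid spacing, used in the L2 distance
        self.h1, self.gamma = h1, gamma
        self.num = 0.0            # running sum of Y_i * K(d_i / h_i)
        self.den = 0.0            # running sum of K(d_i / h_i)
        self.i = 0

    def update(self, x_curve, y):
        self.i += 1
        h = self.h1 * self.i ** (-self.gamma)
        # L2 distance between the incoming curve and the evaluation curve
        d = np.sqrt(np.sum((np.asarray(x_curve) - self.x_eval) ** 2) * self.dx)
        w = np.exp(-0.5 * (d / h) ** 2)
        self.num += y * w
        self.den += w

    def predict(self):
        return self.num / self.den if self.den > 0 else float("nan")
```

Because only two running sums are stored, each update costs O(p) in the grid size, and the estimate can be refreshed as observations stream in.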
On rate optimal local estimation in functional linear model
, 2009
Abstract

Cited by 2 (1 self)
We consider the problem of estimating, for a given representer h, the value ℓh(β) of a linear functional of the slope parameter β in functional linear regression, where scalar responses Y1,..., Yn are modeled in dependence on random functions X1,..., Xn. The proposed estimators of ℓh(β) are based on dimension reduction and additional thresholding. The minimax optimal rate of convergence of the estimator is derived under the assumption that the slope parameter and the representer belong to ellipsoids that are, in a certain sense, linked to the covariance operator associated with the regressor. We illustrate these results by considering Sobolev ellipsoids and finitely or infinitely smoothing covariance operators.
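A plug-in version of such an estimator can be sketched as follows. The abstract specifies dimension reduction plus thresholding but not the concrete rule; the spectral cut-off `evals[j] <= tau` and the discretization below are illustrative assumptions:

```python
import numpy as np

def linear_functional_estimate(X, Y, h, dx, tau=1e-3):
    """Plug-in estimate of l_h(beta) = int h(t) beta(t) dt in the
    scalar-on-function linear model Y = int X(t) beta(t) dt + noise.

    Dimension reduction: project onto empirical eigenfunctions of the
    covariance operator; directions with eigenvalue <= tau are dropped
    (tau is an illustrative threshold, not the paper's rule).
    X is n x p (curves on a grid with spacing dx); h is the representer.
    """
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean()
    G = (Xc.T @ Xc) / n * dx                  # discretized covariance operator
    evals, evecs = np.linalg.eigh(G)
    order = np.argsort(evals)[::-1]           # eigh returns ascending order
    evals, evecs = evals[order], evecs[:, order]
    psi = evecs / np.sqrt(dx)                 # L2-orthonormal eigenfunctions
    scores = Xc @ psi * dx                    # <X_i, psi_j>
    est = 0.0
    for j in range(p):
        if evals[j] <= tau:                   # spectral cut-off / thresholding
            break
        b_j = np.mean(scores[:, j] * Yc) / evals[j]   # estimate of <beta, psi_j>
        h_j = np.sum(h * psi[:, j]) * dx              # <h, psi_j>
        est += h_j * b_j
    return est
```

Note that the eigenfunction sign ambiguity cancels: flipping psi_j flips both b_j and h_j, leaving the product unchanged.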
Adaptive Global Testing for Functional Linear Models
, 2013
Abstract

Cited by 2 (0 self)
This paper studies global testing of the slope function in functional linear regression models. A major challenge in functional global testing is to choose the dimension of projection when approximating the functional regression model by a finite-dimensional multivariate linear regression model. We develop a new method that simultaneously tests the slope vectors in a sequence of functional principal components regression models. The sequence of models being tested is determined by the sample size and is an integral part of the testing procedure. Our theoretical analysis shows that the proposed method is uniformly powerful over a class of smooth alternatives when the signal-to-noise ratio exceeds the detection boundary. The methods and results reflect the deep connection between the functional linear regression model and the Gaussian sequence model. We also present an extensive simulation study and a real data example to illustrate the finite sample performance of our method.
Penalized Likelihood Regression in Reproducing Kernel Hilbert Spaces with Randomized Covariate Data
, 2010
Abstract

Cited by 2 (1 self)
Classical penalized likelihood regression problems deal with the case in which the independent variable data are known exactly. In practice, however, it is common to observe data with incomplete covariate information. We are concerned with a fundamentally important case where some of the observations do not represent the exact covariate information, but only a probability distribution. In this case, the maximum penalized likelihood method can still be applied to estimate the regression function. We first show that the maximum penalized likelihood estimate exists under a mild condition. For the computation, we propose a dimension reduction technique to minimize the penalized likelihood and derive a GACV (Generalized Approximate Cross Validation) score to choose the smoothing parameter. Our methods are extended to handle more complicated incomplete data problems, such as covariate measurement error and partially missing covariates.
Shape Curve Analysis Using Curvature
, 2009
Abstract

Cited by 1 (0 self)
Statistical shape analysis is a field for which there is growing demand. One of the major drivers for this growth is the number of practical applications which can use statistical shape analysis to provide useful insight. An example of one of these practical applications is investigating and comparing facial shapes. An ever-improving suite of digital imaging technology can capture data on the three-dimensional shape of facial features from standard images. A field for which this offers a large amount of potential analytical benefit is the reconstruction of the facial surface of children born with a cleft lip or a cleft lip and palate. This thesis will present two potential methods for analysing data on the facial shape of children who were born with a cleft lip and/or palate using data from two separate studies. One form of analysis will compare the facial shape of one-year-old children born with a cleft lip and/or palate with the facial shape of control children. The second form of analysis will look for relationships between facial shape and psychological score for ten-year-old children born with a cleft lip and/or palate. While many of the techniques in this thesis could be extended to
Additive Modeling of Functional Gradients
, 2010
Abstract

Cited by 1 (0 self)
We consider the problem of estimating functional derivatives and gradients in the framework of a functional regression setting where one observes functional predictors and scalar responses. Derivatives are then defined as functional directional derivatives which indicate how changes in the predictor function in a specified functional direction are associated with corresponding changes in the scalar response. Aiming at a model-free approach, navigating the curse of dimension requires imposing suitable structural constraints. Accordingly, we develop functional derivative estimation within an additive regression framework. Here the additive components of functional derivatives correspond to derivatives of nonparametric one-dimensional regression functions with the functional principal components of predictor processes as arguments. This approach requires nothing more than estimating derivatives of one-dimensional nonparametric regressions, and thus is computationally very straightforward to implement, while it also provides substantial flexibility, fast computation and asymptotic consistency. We demonstrate the estimation and interpretation of the resulting functional derivatives and functional gradient fields in a study of the dependence of lifetime fertility of flies on early life reproductive trajectories.
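The one-dimensional building block of the additive approach, the derivative of a nonparametric regression of the response on a single principal component score, can be sketched via local linear fitting. The Gaussian kernel and the bandwidth choice here are illustrative assumptions, not the paper's:

```python
import numpy as np

def local_linear_derivative(x, y, x0, h):
    """Derivative at x0 of a one-dimensional nonparametric regression of y
    on x, via local linear fitting with a Gaussian kernel of bandwidth h.

    In the additive-gradient setting, x would be one functional principal
    component score of the predictor curves; an additive gradient component
    is then this local slope evaluated along that score's axis.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)             # kernel weights
    Z = np.column_stack([np.ones_like(x), x - x0])     # local linear design
    A = Z.T @ (w[:, None] * Z)                         # weighted normal equations
    b = Z.T @ (w * y)
    coef = np.linalg.solve(A, b)
    return coef[1]    # local slope = estimated derivative at x0
```

Repeating this over each principal component score yields the additive components of the estimated functional gradient.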
Response-Adaptive Regression for Longitudinal Data
, 2010
Abstract

Cited by 1 (0 self)
We propose a response-adaptive model for functional linear regression, which is adapted to sparsely sampled longitudinal responses. Our method aims at predicting response trajectories and models the regression relationship by directly conditioning the sparse and irregular observations of the response on the predictor, which can be of scalar, vector or functional type. This obviates the need to model the response trajectories, a task that is challenging for sparse longitudinal data and was previously required for functional regression implementations for longitudinal data. The proposed approach turns out to be superior to previous functional regression approaches in terms of prediction error. It encompasses a variety of regression settings that are relevant for the functional modeling of longitudinal data in the life sciences. The improved prediction of response trajectories with the proposed response-adaptive approach is illustrated for a longitudinal study of Kiwi weight growth and by an analysis of the dynamic relationship between viral load and CD4 cell counts observed in AIDS clinical trials.
Nonasymptotic Adaptive Prediction in Functional Linear Models
, 2013
Abstract

Cited by 1 (0 self)
Functional linear regression has recently attracted considerable interest. Many works focus on asymptotic inference. In this paper we consider, in a non-asymptotic framework, a simple estimation procedure based on functional principal components regression. It relies on the minimization of a least squares contrast coupled with a classical projection onto the space spanned by the first m empirical eigenvectors of the covariance operator of the functional sample. The novelty of our approach is to select the crucial dimension m automatically by minimizing a penalized least squares contrast. Our method is based on model selection tools. Yet, since such methods usually project onto known, non-random spaces, we need to adapt them to an empirical eigenbasis made of data-dependent, hence random, vectors. The resulting estimator is fully adaptive and is shown to satisfy an oracle inequality for the risk associated with the prediction error and to attain optimal minimax rates of convergence over a certain class of ellipsoids. Our model selection strategy is finally compared numerically with cross-validation.
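The procedure, projection onto the first m empirical eigenvectors with m chosen by a penalized least squares criterion, can be sketched as follows. The penalty shape kappa * m * var(Y) / n is a purely illustrative stand-in for the paper's penalty:

```python
import numpy as np

def fpcr_select_dimension(X, Y, dx, kappa=2.0, m_max=10):
    """Functional principal components regression with the dimension m chosen
    by minimizing a penalized least-squares criterion.

    X is n x p (curves on a grid with spacing dx), Y is the scalar response.
    The penalty kappa * m * var(Y) / n is an illustrative stand-in for the
    paper's penalty, not its actual form.
    """
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean()
    G = (Xc.T @ Xc) / n * dx                # discretized covariance operator
    evals, evecs = np.linalg.eigh(G)
    order = np.argsort(evals)[::-1]         # eigh returns ascending order
    psi = evecs[:, order] / np.sqrt(dx)     # L2-orthonormal eigenfunctions
    scores = Xc @ psi * dx                  # empirical FPC scores
    best = (np.inf, 1, None)
    for m in range(1, min(m_max, p) + 1):
        S = scores[:, :m]
        coef, *_ = np.linalg.lstsq(S, Yc, rcond=None)
        crit = np.mean((Yc - S @ coef) ** 2) + kappa * m * np.var(Yc) / n
        if crit < best[0]:
            best = (crit, m, coef)
    _, m_hat, coef = best
    slope = psi[:, :m_hat] @ coef           # estimated slope function on grid
    return slope, m_hat
```

The selected dimension balances the empirical fit against the penalty, mirroring the bias-variance trade-off that cross-validation resolves empirically.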