Results 1 - 6 of 6
Adaptive Prediction and Estimation in Linear Regression With Infinitely Many Parameters
Ann. Statist., 29, 2001
Abstract

Cited by 10 (0 self)
The problem of adaptive prediction and estimation in the stochastic linear regression model with infinitely many parameters is considered. We suggest a prediction method that is sharp asymptotically minimax adaptive over ellipsoids in $\ell_2$. The method consists in an application of blockwise Stein's rule with "weakly" geometrically increasing blocks to the penalized least squares fits of the first N coefficients. To prove the results we develop oracle inequalities for the sequence model with correlated data. Mathematics Subject Classifications: 62G05, 62G20. Key words: linear regression with infinitely many parameters, adaptive prediction, exact asymptotics of minimax risk, blockwise Stein's rule, oracle inequalities. Short title: Adaptive prediction.

1 Introduction. Consider the regression model

$$y = \sum_{k=1}^{\infty} \ell_k x_k + \varepsilon \qquad (1)$$

where $\{x_k\}_{k=1,2,\ldots}$ is a sequence of explanatory variables, $y$ is the corresponding response, $\varepsilon$ is the error, and $\ell = (\ell_1, \ell_2, \ldots) \in \ell_2$ is an unknown regre...
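The blockwise shrinkage idea in this abstract can be sketched in a few lines. The following is a minimal illustration only, not the paper's exact estimator: it assumes a hypothetical vector `z` of noisy coefficients and a known noise level `sigma2`, splits the sequence into blocks whose sizes grow "weakly" geometrically, and applies James–Stein-type shrinkage within each block.

```python
import numpy as np

def blockwise_stein(z, sigma2, rho=0.1):
    """Blockwise Stein-type shrinkage on a coefficient sequence.

    Illustrative sketch: z holds noisy coefficient estimates, sigma2 the
    per-coefficient noise variance, and rho controls how fast the block
    sizes grow (roughly geometrically, factor 1 + rho). The exact block
    scheme and penalization in the cited paper differ.
    """
    n = len(z)
    est = np.zeros(n)
    start, size = 0, 2
    while start < n:
        end = min(start + size, n)
        block = z[start:end]
        m = end - start
        energy = np.sum(block ** 2)
        # James-Stein factor: shrink the whole block toward zero,
        # clipped at zero (positive-part rule).
        shrink = max(0.0, 1.0 - sigma2 * m / energy) if energy > 0 else 0.0
        est[start:end] = shrink * block
        start = end
        # next block is "weakly" geometrically larger
        size = max(size + 1, int(np.ceil(size * (1.0 + rho))))
    return est
```

Blocks with little energy relative to the noise are shrunk to zero, while blocks carrying strong signal are left nearly untouched, which is what yields adaptivity over ellipsoids.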
Optimal estimation and prediction for dense signals in high-dimensional linear models
, 2012
Dense signals, linear estimators, and out-of-sample prediction for high-dimensional linear models. arXiv preprint arXiv:1102.2952
, 2011
Adaptive functional linear regression, Fabienne Comte
Abstract
Université catholique de Louvain. We consider the estimation of the slope function in functional linear regression, where scalar responses are modeled in dependence of random functions. Cardot and Johannes [2010] have shown that a thresholded projection estimator can attain, up to a constant, minimax rates of convergence in a general framework that covers the prediction problem with respect to the mean squared prediction error as well as the estimation of the slope function and its derivatives. This estimation procedure, however, requires an optimal choice of a tuning parameter with regard to certain characteristics of the slope function and the covariance operator associated with the functional regressor. As this information is usually inaccessible in practice, we investigate a fully data-driven choice of the tuning parameter which combines model selection and Lepski's method. It is inspired by the recent work of Goldenshluger and Lepski [2011]. The tuning parameter is selected as the minimizer of a stochastic penalized contrast function imitating Lepski's method among a random collection of admissible values. This choice of the tuning parameter depends only on the data, and we show that within the general framework the resulting data-driven thresholded projection estimator can attain minimax rates up to a constant over a variety of classes of slope functions and covariance operators. The results are illustrated considering different configurations which cover in particular the prediction problem as well as the estimation of the slope and its derivatives.
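The Goldenshluger–Lepski-style selection described in this abstract, comparing each candidate model against all richer ones and adding a dimension-proportional penalty, can be sketched roughly as follows. The function, its inputs (`basis_fits`, a list of fitted-value vectors indexed by candidate dimension), and the penalty constant `kappa` are hypothetical illustrations, not the paper's exact contrast.

```python
import numpy as np

def select_dimension(y, basis_fits, sigma2, kappa=2.0):
    """Toy Lepski-type selection of a projection dimension.

    basis_fits[m] holds the fitted values of a projection estimator of
    dimension m + 1. For each candidate m, a bias proxy A(m) compares it
    against every richer model (penalties subtracted), and the selected
    dimension minimizes A(m) plus its own penalty.
    """
    n = len(y)
    M = len(basis_fits)
    pen = [kappa * sigma2 * (m + 1) / n for m in range(M)]
    crit = []
    for m in range(M):
        # bias proxy: worst excess distance to richer models
        A = max(
            np.mean((basis_fits[m] - basis_fits[k]) ** 2) - pen[k]
            for k in range(m, M)
        )
        crit.append(max(A, 0.0) + pen[m])
    return int(np.argmin(crit))
```

The trade-off is visible directly: a too-small dimension is rejected because richer models fit markedly better than the penalty allows, while a too-large dimension pays its own penalty for no gain.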
unknown title
, 802
Abstract
Evaluation and selection of models for out-of-sample prediction when the sample size is small relative to the complexity of the data-generating process