Results 1–10 of 20
Tight conditions for consistent variable selection in high dimensional nonparametric regression
Minimax hypothesis testing for curve registration
Electron. J. Statist.
Cited by 7 (3 self)
Abstract: This paper is concerned with the problem of goodness-of-fit for curve registration, and more precisely for the shifted curve model, whose application field reaches from computer vision and road traffic prediction to medicine. We give bounds for the asymptotic minimax separation rate when the functions in the alternative lie in Sobolev balls and the separation from the null hypothesis is measured by the L2-norm. We use the generalized likelihood ratio to build a non-adaptive procedure depending on a tuning parameter, which we choose in an optimal way according to the smoothness of the ambient space. Then, a Bonferroni procedure is applied to give an adaptive test over a range of Sobolev balls. Both achieve the asymptotic minimax separation rates, up to possible logarithmic factors.
Asymptotic equivalence of functional linear regression and a white noise inverse problem
Ann. Statist.
Lower bounds for volatility estimation in microstructure noise models. In Borrowing Strength: Theory Powering Applications, A Festschrift for Lawrence D. Brown
Ann. Statist.
, 2010
Cited by 6 (0 self)
Abstract: In this paper minimax lower bounds are derived for the estimation of the instantaneous volatility in three related high-frequency statistical models. These bounds are based on new upper bounds for the Kullback-Leibler divergence between two multivariate normal random variables, along with a spectral analysis of the processes. A comparison with known upper bounds shows that these lower bounds are optimal. Our major finding is that the Gaussian microstructure noise introduces an additional degree of ill-posedness for each model, respectively.
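The Kullback-Leibler divergence between two multivariate normals, on which bounds of this kind rest, has a well-known closed form. A minimal sketch of that standard formula (illustrative only, not code from the listed paper):

```python
import numpy as np

def kl_mvn(mu0, S0, mu1, S1):
    """Closed-form KL divergence KL(N(mu0, S0) || N(mu1, S1))."""
    k = mu0.shape[0]
    S1_inv = np.linalg.inv(S1)
    diff = mu1 - mu0
    _, logdet0 = np.linalg.slogdet(S0)  # log-determinants, numerically stable
    _, logdet1 = np.linalg.slogdet(S1)
    return 0.5 * (np.trace(S1_inv @ S0)
                  + diff @ S1_inv @ diff
                  - k
                  + logdet1 - logdet0)

# Sanity check: identical distributions have zero divergence.
mu = np.zeros(3)
S = np.eye(3)
print(kl_mvn(mu, S, mu, S))  # → 0.0
```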
Asymptotic equivalence for inhomogeneous jump-diffusion processes and white noise
Cited by 1 (1 self)
Abstract: We prove the global asymptotic equivalence between the experiments generated by the discrete (high-frequency) or continuous observation of a path of a time-inhomogeneous jump-diffusion process and a Gaussian white noise experiment. Here, the considered parameter is the drift function, and we suppose that the observation time T tends to ∞. The approximation is given in the sense of the Le Cam Δ-distance, under smoothness conditions on the unknown drift function. These asymptotic equivalences are established by constructing explicit Markov kernels that can be used to reproduce one experiment from the other.
Tight conditions for consistency of variable selection in the context of high dimensionality (submitted, arXiv:math.ST/1106.4293v2)
, 2012
Abstract: We address the issue of variable selection in the regression model with very high ambient dimension, i.e., when the number of variables is very large. The main focus is on the situation where the number of relevant variables, called intrinsic dimension and denoted by d*, is much smaller than the ambient dimension d. Without assuming any parametric form of the underlying regression function, we get tight conditions making it possible to consistently estimate the set of relevant variables. These conditions relate the intrinsic dimension to the ambient dimension and to the sample size. The procedure that is provably consistent under these tight conditions is based on comparing quadratic functionals of the empirical Fourier coefficients with appropriately chosen threshold values. The asymptotic analysis reveals the presence of two quite different regimes. The first regime is when d* is fixed. In this case the situation in nonparametric regression is the same as in linear regression, i.e., consistent variable selection is possible if and only if log d is small compared to the sample size n. The picture is different in the second regime, d* → ∞ as n → ∞, where we prove that consistent variable selection in the nonparametric setup is possible only if d* + log log d is small compared to log n. We apply these results to derive minimax separation rates for the problem of variable selection.
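As a loose illustration of the idea in this abstract (comparing the energy of empirical Fourier coefficients against a cutoff to decide which variables are relevant), here is a hypothetical sketch; the frequency count `n_freq` and the cutoff are ad hoc assumptions, not the paper's calibrated thresholds:

```python
import numpy as np

def select_variables(X, y, n_freq=3, threshold=None):
    """Flag variable j as relevant when the energy of empirical Fourier
    coefficients of y along x_j exceeds a cutoff (illustrative only)."""
    n, d = X.shape
    if threshold is None:
        # Ad hoc cutoff (assumption); the paper derives calibrated thresholds.
        threshold = 4.0 * n_freq / n + 2.0 * np.log(d) / n
    selected = []
    for j in range(d):
        # Quadratic functional of empirical Fourier coefficients against
        # frequencies of x_j (assumes the design is roughly uniform on [0, 1]).
        energy = sum(
            np.abs(np.mean(y * np.exp(-2j * np.pi * k * X[:, j]))) ** 2
            for k in range(1, n_freq + 1)
        )
        if energy > threshold:
            selected.append(j)
    return selected

# Toy example: y depends only on the first of 10 candidate variables.
rng = np.random.default_rng(0)
X = rng.uniform(size=(2000, 10))
y = np.sin(2 * np.pi * X[:, 0]) + 0.1 * rng.standard_normal(2000)
print(select_variables(X, y))  # expect only variable 0
```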
Minimax hypothesis testing for curve registration
Electronic Journal of Statistics
, 2012