Results 1-10 of 38
Semiparametrically efficient rank-based inference for shape I: Optimal rank-based tests for sphericity
Ann. Statist.
, 2006
Abstract

Cited by 48 (32 self)
A class of R-estimators based on the concepts of multivariate signed ranks and the optimal rank-based tests developed in Hallin and Paindaveine [Ann. Statist. 34 (2006)] is proposed for the estimation of the shape matrix of an elliptical distribution. These R-estimators are root-n consistent under any radial density g, without any moment assumptions, and semiparametrically efficient at some prespecified density f. When based on normal scores, they are uniformly more efficient than the traditional normal-theory estimator based on empirical covariance matrices (the asymptotic normality of which, moreover, requires finite moments of order four), irrespective of the actual underlying elliptical density. They rely on an original rank-based version of Le Cam's one-step methodology which avoids the unpleasant nonparametric estimation of cross-information quantities that is generally required in the context of R-estimation. Although they are not strictly affine-equivariant, they are shown to be equivariant in a weak asymptotic sense. Simulations confirm their feasibility and excellent finite-sample performance. 1. Introduction. 1.1. Rank-based inference for elliptical families. An elliptical density over R^k is determined by a location center θ ∈ R^k, a scale parameter σ ∈ R_0^+, a real-valued positive definite symmetric k × k matrix V = (V_ij) with V_11 = 1,
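The two basic ingredients of such signed-rank procedures, spatial signs and ranks of radial distances, are straightforward to compute. A minimal sketch (hypothetical helper name; the actual R-estimators of Hallin and Paindaveine additionally involve standardized residuals and score functions):

```python
import numpy as np

def signed_ranks(X, theta):
    """Compute the multivariate spatial signs U_i = (X_i - theta)/||X_i - theta||
    and the ranks R_i of the radial distances ||X_i - theta|| within the sample.
    A sketch of the signed-rank ingredients only, not the full R-estimator."""
    Z = X - theta
    d = np.linalg.norm(Z, axis=1)        # radial distances
    U = Z / d[:, None]                   # spatial signs (unit vectors)
    R = np.argsort(np.argsort(d)) + 1    # ranks of the distances, 1..n
    return U, R
```

The signs carry directional information, the ranks carry radial information, and rank-based test statistics combine the two through a score function.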
Structure adaptive approach for dimension reduction
Ann. Statist.
, 2001
Abstract

Cited by 42 (4 self)
We propose a new method of effective dimension reduction for a multi-index model which is based on iterative improvement of the family of average derivative estimates. The procedure is computationally straightforward and does not require any prior information about the structure of the underlying model. We show that in the case when the effective dimension m of the index space does not exceed 3, this space can be estimated at the rate n^{-1/2} under rather mild assumptions on the model.
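The starting point of such procedures, averaging outer products of estimated gradients and extracting leading eigenvectors, can be sketched as follows (a simplified one-pass version with hypothetical names; the paper's method refines these estimates iteratively):

```python
import numpy as np

def average_derivative_space(X, Y, h, m):
    """Estimate the m-dimensional index space by averaging outer products of
    local-linear gradient estimates and taking the leading eigenvectors.
    One non-iterative pass only; a sketch, not the paper's full procedure."""
    n, k = X.shape
    M = np.zeros((k, k))
    for i in range(n):
        D = X - X[i]
        w = np.exp(-np.sum(D**2, axis=1) / (2 * h**2))  # Gaussian weights
        A = np.column_stack([np.ones(n), D])            # local-linear design
        W = A * w[:, None]
        beta, *_ = np.linalg.lstsq(W.T @ A, W.T @ Y, rcond=None)
        grad = beta[1:]                                 # gradient estimate at X_i
        M += np.outer(grad, grad) / n
    _, vecs = np.linalg.eigh(M)
    return vecs[:, -m:]                                 # leading eigenvectors
```

For a single-index model Y = g(b'X), the gradient always points along b, so the averaged outer-product matrix concentrates on span(b).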
Asymptotic equivalence theory for nonparametric regression with random design
Ann. Statist.
, 2002
Abstract

Cited by 39 (6 self)
This paper establishes the global asymptotic equivalence between nonparametric regression with random design and the white noise model under sharp smoothness conditions on an unknown regression or drift function. The asymptotic equivalence is established by constructing explicit equivalence mappings between the nonparametric regression and white-noise experiments, which provide synthetic observations and synthetic asymptotic solutions from either one of the two experiments with asymptotic properties identical to the true observations and given asymptotic solutions from the other. The impact of such asymptotic equivalence results is that an investigation in one nonparametric problem automatically yields asymptotically analogous results in all other asymptotically equivalent nonparametric problems.
Goodness-of-fit testing and quadratic functional estimation from indirect observations
, 2006
ADAPTIVE NONPARAMETRIC CONFIDENCE SETS
, 2006
Abstract

Cited by 16 (0 self)
We construct honest confidence regions for a Hilbert space-valued parameter in various statistical models. The confidence sets can be centered at arbitrary adaptive estimators, and have diameter which adapts optimally to a given selection of models. The latter adaptation is necessarily limited in scope. We review the notion of adaptive confidence regions, and relate the optimal rates of the diameter of adaptive confidence regions to the minimax rates for testing and estimation. Applications include the finite normal mean model, the white noise model, density estimation and regression with random design.
Best Possible Constant for Bandwidth Selection
Annals of Statistics, 20
, 1992
Abstract

Cited by 12 (2 self)
For the data-based choice of the bandwidth of a kernel density estimator, several methods have recently been proposed which have a very fast asymptotic rate of convergence to the optimal bandwidth. In particular, the relative rate of convergence is the square root of the sample size, which is known to be the best possible. The point of this paper is to show how semiparametric arguments can be employed to calculate the best possible constant coefficient, i.e. an analog of the usual Fisher information, in this convergence. This establishes an important benchmark as to how well bandwidth selection methods can ever hope to perform. It is seen that some methods attain the bound, while others do not.
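For contrast with the fast root-n selectors discussed above, the simplest plug-in rule replaces the unknown density functionals in the AMISE-optimal bandwidth by their values under a normal reference. A sketch assuming a Gaussian kernel (the Silverman normal-reference rule, not one of the root-n methods):

```python
import numpy as np

def normal_reference_bandwidth(x):
    """Normal-reference plug-in bandwidth for a Gaussian kernel:
    h = (4/3)^{1/5} * sigma * n^{-1/5}, which minimizes the AMISE when the
    true density is Gaussian. Root-n selectors instead estimate the unknown
    functionals that this rule replaces by their normal-theory values."""
    n = len(x)
    sigma = np.std(x, ddof=1)
    return (4.0 / 3.0) ** 0.2 * sigma * n ** (-0.2)
```

The n^{-1/5} scaling is the optimal bandwidth rate; the constant in front is exactly what the semiparametric bound in this paper benchmarks.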
OPTIMAL ADAPTIVE ESTIMATION OF A QUADRATIC FUNCTIONAL
, 2006
Abstract

Cited by 11 (2 self)
Adaptive estimation of a quadratic functional over both Besov and Lp balls is considered. A collection of non-quadratic estimators is developed with useful bias and variance properties over individual Besov and Lp balls. An adaptive procedure is then constructed based on penalized maximization over this collection of non-quadratic estimators. This procedure is shown to be optimally rate-adaptive over the entire range of Besov and Lp balls in the sense that it attains certain constrained risk bounds.
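A basic member of the family of estimators for a quadratic functional such as ∫ f(t)^2 dt is the leave-one-out U-statistic built from a kernel. A sketch with a Gaussian kernel (this illustrates a standard quadratic estimator, not the paper's penalized non-quadratic construction):

```python
import numpy as np

def quadratic_functional_ustat(x, h):
    """Leave-one-out (U-statistic) kernel estimator of ∫ f(t)^2 dt:
    the average of K_h(X_i - X_j) over all distinct pairs, with a
    Gaussian kernel K_h(u) = phi(u/h)/h."""
    n = len(x)
    diffs = x[:, None] - x[None, :]
    K = np.exp(-diffs**2 / (2 * h**2)) / (h * np.sqrt(2 * np.pi))
    np.fill_diagonal(K, 0.0)             # exclude i = j pairs
    return K.sum() / (n * (n - 1))
```

Its expectation is ∫ (K_h * f) f, which converges to ∫ f^2 as h → 0; adaptive procedures select the smoothing (and the estimator family) from the data.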
Nonparametric Estimation of Quadratic Regression Functionals
Abstract

Cited by 10 (0 self)
Quadratic regression functionals are important for the bandwidth selection of nonparametric regression techniques and for nonparametric tests. Based on local polynomial regression, we propose estimators for weighted integrals of squared derivatives of regression functions. The rates of convergence in mean squared error are calculated for appropriate values of the smoothing parameter and the amount of smoothness assumed. The asymptotic distribution of the estimators is also considered under the Gaussian noise assumption. It is shown that when the estimators are pseudo-quadratic (linear components dominate quadratic components), asymptotic normality at the n^{-1} rate can be achieved. Key words and phrases: asymptotic normality, estimation of quadratic functionals, local polynomial regression. Abbreviated title: Estimation of Quadratic Regression Functionals. AMS 1991 subject classification: Primary 62G07; Secondary 60F05. 1. Introduction. Let (X_1, Y_1), …, (X_n, Y_n) ...
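A naive plug-in version of such a functional estimator fits a local polynomial at each design point, reads off the first-derivative coefficient, and averages its square. A sketch under simplifying assumptions (Gaussian weights, empirical weighting by the design density; the paper's estimators are constructed differently):

```python
import numpy as np

def local_poly_deriv(x0, X, Y, h, p=2):
    """Weighted local polynomial fit of degree p at x0; returns the
    estimated first derivative m'(x0), i.e. the coefficient of (x - x0)."""
    A = np.vander(X - x0, p + 1, increasing=True)  # columns 1, (x-x0), (x-x0)^2, ...
    w = np.exp(-((X - x0) / h) ** 2 / 2)           # Gaussian kernel weights
    W = A * w[:, None]
    beta, *_ = np.linalg.lstsq(W.T @ A, W.T @ Y, rcond=None)
    return beta[1]

def squared_deriv_functional(X, Y, h):
    """Plug-in estimate of the design-weighted functional
    n^{-1} sum_i m'(X_i)^2, approximating ∫ m'(x)^2 f(x) dx."""
    return np.mean([local_poly_deriv(x0, X, Y, h) ** 2 for x0 in X])
```

For a noiseless linear regression function m(x) = 2x the local fit is exact, so the functional estimate equals m'(x)^2 = 4 everywhere.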
SHARP ADAPTIVE ESTIMATION OF THE DRIFT FUNCTION FOR ERGODIC DIFFUSIONS
, 2005
Abstract

Cited by 10 (0 self)
The global estimation problem of the drift function is considered for a large class of ergodic diffusion processes. The unknown drift S(·) is supposed to belong to a nonparametric class of smooth functions of order k ≥ 1, but the value of k is not known to the statistician. A fully data-driven procedure for estimating the drift function is proposed, using the estimated risk minimization method. The sharp adaptivity of this procedure is proven up to an optimal constant, when the quality of the estimation is measured by the integrated squared error weighted by the square of the invariant density.
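From a discretely observed path, a basic fixed-bandwidth kernel estimator of the drift is a Nadaraya-Watson-type average of the increments divided by the time step. A sketch assuming unit diffusion coefficient (the paper's procedure instead selects the smoothing adaptively via estimated risk minimization):

```python
import numpy as np

def drift_estimate(path, dt, x0, h):
    """Kernel (Nadaraya-Watson-type) estimator of the drift S(x0) of
    dX_t = S(X_t) dt + dW_t from a discretely observed path: a
    kernel-weighted average of the increments, divided by the time step."""
    X, dX = path[:-1], np.diff(path)
    w = np.exp(-((X - x0) / h) ** 2 / 2)   # Gaussian kernel weights
    return (w @ dX) / (dt * w.sum())
```

Since E[X_{t+dt} - X_t | X_t = x] ≈ S(x) dt, the weighted increment average recovers S(x0) as dt → 0 and the observation horizon grows.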
Sharp Adaptive Estimation of Quadratic Functionals
Abstract

Cited by 9 (0 self)
Estimation of a quadratic functional of a function observed in Gaussian white noise is considered. A data-dependent method for choosing the amount of smoothing is given. It is shown that the method is asymptotically sharp adaptive simultaneously for the "regular" and "irregular" cases. The method is based on applying the Lepski method to choose between certain quadratic estimators. These quadratic estimators are given by the theory of optimal recovery and have the same form as the estimators which are minimax optimal among quadratic estimators. AMS 1991 subject classifications: Primary 62G07; secondary 62G20.
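The Lepski selection principle mentioned above can be sketched generically: given a family of estimators ordered from high variance / low bias to low variance / high bias, move toward smaller variance as long as each candidate stays consistent with all less-biased ones. This is a generic sketch of the balancing idea, not the paper's exact rule:

```python
def lepski_select(estimates, sds, kappa=2.0):
    """Lepski-type selection over estimators ordered from high variance /
    low bias (index 0) to low variance / high bias: advance while the
    current estimate agrees with every earlier one within
    kappa * (sd_i + sd_j); stop at the first disagreement."""
    chosen = 0
    for j in range(len(estimates)):
        if all(abs(estimates[j] - estimates[i]) <= kappa * (sds[i] + sds[j])
               for i in range(j)):
            chosen = j
        else:
            break
    return chosen
```

The first candidate whose deviation from a less-biased estimator exceeds the noise level signals that bias has become dominant, so the previous index is kept.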