Results 1–10 of 25
Asymptotic equivalence of density estimation and Gaussian white noise
Ann. Statist., 1996
Cited by 123 (5 self)
Abstract
Signal recovery in Gaussian white noise with variance tending to zero has served for some time as a representative model for nonparametric curve estimation, having all the essential traits in a pure form. The equivalence has mostly been stated informally, but an approximation in the sense of Le Cam's deficiency distance Δ would make it precise. The models are then asymptotically equivalent for all purposes of statistical decision with bounded loss. In nonparametrics, a first result of this kind has recently been established for Gaussian regression (Brown and Low, 1993). We consider the analogous problem for the experiment given by n i.i.d. observations having density f on the unit interval. Our basic result concerns the parameter space of densities which are in a Hölder ball with exponent α > 1/2 and which are uniformly bounded away from zero. We show that an i.i.d. sample of size n with density f is globally asymptotically equivalent to a white noise experiment with drift f^{1/2} and variance (4n)^{-1}. This represents a nonparametric analog of Le Cam's heteroscedastic Gaussian approximation in the finite-dimensional case.
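The limiting experiment in this abstract can be simulated directly. The sketch below is my own (the discretization, grid size, and example density are assumptions, not taken from the paper): it draws increments of dY(t) = f(t)^{1/2} dt + (1/(2√n)) dW(t), whose noise variance per unit time is (4n)^{-1} as stated above.

```python
import numpy as np

# Illustrative sketch (mine, not from the paper): discretize the limiting
# experiment dY(t) = f(t)^{1/2} dt + (1/(2 sqrt(n))) dW(t) on [0, 1].
rng = np.random.default_rng(0)

def white_noise_with_drift(f, n, grid_size=1000):
    """Simulate increments of Y over a uniform grid on [0, 1]."""
    t = (np.arange(grid_size) + 0.5) / grid_size
    dt = 1.0 / grid_size
    drift = np.sqrt(f(t)) * dt
    # Increment variance dt / (4n), matching the abstract's (4n)^{-1}.
    noise = rng.normal(scale=np.sqrt(dt / (4 * n)), size=grid_size)
    return t, drift + noise

# A smooth density on [0, 1], bounded away from zero.
f = lambda t: 1.0 + 0.5 * np.cos(2 * np.pi * t)
t, dY = white_noise_with_drift(f, n=10_000)
sqrt_f_est = dY / (1.0 / len(dY))   # dY/dt is a noisy read of sqrt(f)
```

Dividing each increment by the grid spacing recovers a noisy version of f^{1/2}, which is exactly the drift the equivalence theory exploits.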
EQUIVALENCE THEORY FOR DENSITY ESTIMATION, POISSON PROCESSES AND GAUSSIAN WHITE NOISE WITH DRIFT
Cited by 25 (4 self)
Abstract
This paper establishes the global asymptotic equivalence between a Poisson process with variable intensity and white noise with drift under sharp smoothness conditions on the unknown function. This equivalence is also extended to density estimation models by Poissonization. The asymptotic equivalences are established by constructing explicit equivalence mappings. The impact of such asymptotic equivalence results is that an investigation in one of these nonparametric models automatically yields asymptotically analogous results in the other models.
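The Poissonization device mentioned in this abstract is easy to demonstrate. The snippet below is an illustration of my own (the uniform density and the bin check are my choices): drawing N ~ Poisson(n) and then N i.i.d. points from a density f yields a Poisson process with intensity n·f, so counts in disjoint bins are independent Poisson variables.

```python
import numpy as np

# Poissonization sketch (illustrative, my own): N ~ Poisson(n), then N i.i.d.
# draws from f, gives a Poisson process on [0, 1] with intensity n * f.
rng = np.random.default_rng(1)

def poissonized_sample(sample_from_f, n):
    N = rng.poisson(n)
    return sample_from_f(N)

# With f = Uniform(0, 1), each of 10 equal bins gets a Poisson(n / 10) count.
points = poissonized_sample(lambda k: rng.uniform(0.0, 1.0, size=k), n=50_000)
counts, _ = np.histogram(points, bins=10, range=(0.0, 1.0))
```

The randomized sample size is what decouples the bins; with a fixed sample size n the bin counts would be multinomial and hence dependent.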
The root–unroot algorithm for density estimation as implemented via wavelet block thresholding
2010
Deficiency distance between multinomial and multivariate normal experiments. Submitted to The Annals of Statistics
Cited by 13 (3 self)
Abstract
Deficiency distance bounds between large-dimensional multinomials and multivariate normals can be used to establish asymptotic equivalence of density-estimation experiments and the appropriate Gaussian experiment. Nussbaum established the equivalence of these nonparametric experiments for sufficiently smooth classes of densities. To establish the equivalence of the multinomial and multivariate normal with dimensions large enough to approximate the nonparametric experiments, a coupling technique is needed that makes use of the smoothness of the underlying densities. The resulting bound on the deficiency distance is independent of the dimension of the multinomial, and therefore includes density estimation experiments as a limiting case. This work extends existing bounds that relied only on central-limit tendencies of the multinomials.
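As a baseline for the refinement this abstract describes, here is a small simulation of my own showing the central-limit tendency that the earlier bounds relied on: a standardized multinomial cell count is approximately N(0, 1) for large n (the coupling in the paper goes beyond this, using smoothness, which this sketch does not attempt).

```python
import numpy as np

# Baseline illustration (mine): standardized multinomial cell counts are
# close to standard normal -- the central-limit tendency the paper refines.
rng = np.random.default_rng(2)

n, p = 100_000, 0.2
counts = rng.multinomial(n, [p, 1.0 - p], size=20_000)[:, 0]
z = (counts - n * p) / np.sqrt(n * p * (1.0 - p))
# The first two moments of z should match the standard normal closely.
```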
Optimal Rates of Convergence for Estimating Toeplitz Covariance Matrices
Cited by 12 (2 self)
Abstract
Toeplitz covariance matrices are used in the analysis of stationary stochastic processes and in a wide range of applications including radar imaging, target detection, speech recognition, and communications systems. In this paper we consider optimal estimation of large Toeplitz covariance matrices and establish the minimax rate of convergence for two commonly used parameter spaces under the spectral norm. The properties of the tapering and banding estimators are studied in detail and are used to obtain the minimax upper bound. The results also reveal a fundamental difference between the tapering and banding estimators over certain parameter spaces. The minimax lower bound is derived through a novel construction of a more informative experiment, for which the lower bound is obtained through an equivalent Gaussian scale model and a careful selection of a finite collection of least favorable parameters. In addition, the optimal rate of convergence for estimating the inverse of a Toeplitz covariance matrix is established.
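A tapering estimator of the kind this abstract studies can be sketched in a few lines. The construction below is my own minimal version (the flat-top taper shape, bandwidth, and AR(1) test series are assumptions for illustration): average sample autocovariances over lags, then downweight lags beyond the bandwidth.

```python
import numpy as np

# Minimal tapering sketch (mine): lag-averaged autocovariances with a
# flat-top taper, assembled into a Toeplitz matrix.
rng = np.random.default_rng(3)

def tapered_toeplitz_estimate(x, bandwidth):
    """Estimate a p x p Toeplitz covariance from one stationary series."""
    p = len(x)
    xc = x - x.mean()
    acov = np.array([xc[: p - k] @ xc[k:] / p for k in range(p)])
    lags = np.arange(p)
    # Flat-top taper: weight 1 up to bandwidth/2, linear decay to 0 at bandwidth.
    w = np.clip(2.0 - 2.0 * lags / bandwidth, 0.0, 1.0)
    tapered = w * acov
    return tapered[np.abs(lags[:, None] - lags[None, :])]

# AR(1) test series: true autocovariance decays geometrically, variance 4/3.
x = np.zeros(3000)
for t in range(1, 3000):
    x[t] = 0.5 * x[t - 1] + rng.normal()
sigma_hat = tapered_toeplitz_estimate(x[2000:], bandwidth=30)
```

Indexing the tapered autocovariance vector by |i − j| is what enforces the Toeplitz structure of the estimate.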
Model selection and sharp asymptotic minimaxity
2010
Cited by 7 (0 self)
Abstract
We obtain sharp minimax results for estimation of an n-dimensional normal mean under quadratic loss. The estimators are chosen by penalized least squares with a penalty that grows like ck log(n/k), for k equal to the number of nonzero elements in the estimating vector. For a wide range of sparse parameter spaces, we show that the penalized estimator achieves the exact minimax rate with the correct multiplicative constant if and only if c equals 2. Our results unify the theory obtained by many other authors for penalized estimation of normal means. In particular, we establish that a conjecture by Abramovich, Benjamini, Donoho and Johnstone (2006) is true.
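The selection rule in this abstract can be sketched concretely. The code below is my own illustration, not the paper's estimator: it picks k by minimizing residual sum of squares plus a penalty with the c·k·log(n/k) growth from the abstract, using c = 2; the extra "+1" inside the logarithm is my guard so the criterion stays proper at k = n.

```python
import numpy as np

# Sketch (names and the "+1" guard are mine): model selection by penalized
# least squares with penalty of order c * k * log(n / k), c = 2.
rng = np.random.default_rng(4)

def penalized_sparse_means(y, c=2.0, sigma=1.0):
    """Pick k minimizing RSS(k) + sigma^2 * c * k * (1 + log(n / k))."""
    n = len(y)
    a = np.sort(np.abs(y))[::-1]                  # |y| in decreasing order
    rss = np.concatenate(([a @ a], a @ a - np.cumsum(a**2)))
    ks = np.arange(1, n + 1)
    pen = np.concatenate(([0.0], sigma**2 * c * ks * (1.0 + np.log(n / ks))))
    k_hat = int(np.argmin(rss + pen))
    thresh = a[k_hat - 1] if k_hat > 0 else np.inf
    return k_hat, np.where(np.abs(y) >= thresh, y, 0.0)

# Ten strong means in dimension 1000, the rest pure N(0, 1) noise.
theta = np.zeros(1000)
theta[:10] = 6.0
y = theta + rng.normal(size=1000)
k_hat, theta_hat = penalized_sparse_means(y)
```

On such a sparse vector the criterion keeps the strong coordinates and at most a handful of the largest noise coordinates, which is the regime where the choice c = 2 matters.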
Statistical Properties of the Method of Regularization with Periodic Gaussian Reproducing Kernel
Annals of Statistics, 2004
Cited by 6 (2 self)
Abstract
The method of regularization with the Gaussian reproducing kernel is popular in the machine learning literature and successful in many practical applications. In this paper we consider the periodic version of the Gaussian kernel regularization. We show, in the white noise model setting, that in function spaces of very smooth functions, such as the infinite-order Sobolev space and the space of analytic functions, the method under consideration is asymptotically minimax; in finite-order Sobolev spaces, the method is rate optimal, and its efficiency in terms of constant, when compared with the minimax estimator, is reasonably high. The smoothing parameters in the periodic Gaussian regularization can be chosen adaptively without loss of asymptotic efficiency. The results derived in this paper give a partial explanation of the success of the Gaussian reproducing kernel in practice. Simulations are carried out to study the finite sample properties of the periodic Gaussian regularization.
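A finite-sample stand-in for this kind of regularization is kernel ridge regression with a periodic kernel. The sketch below is mine (the exp-sine-squared kernel, bandwidth, and regularization level are assumptions; the paper's analysis is for a wrapped Gaussian kernel in the white noise model):

```python
import numpy as np

# Kernel ridge sketch (mine): periodic exp-sine-squared kernel as a common
# stand-in for a wrapped (periodic) Gaussian kernel on [0, 1].
rng = np.random.default_rng(5)

def periodic_kernel(s, t, h=0.3):
    return np.exp(-2.0 * np.sin(np.pi * (s[:, None] - t[None, :])) ** 2 / h**2)

def kernel_ridge(x, y, lam=1e-3, h=0.3):
    K = periodic_kernel(x, x, h)
    alpha = np.linalg.solve(K + lam * len(x) * np.eye(len(x)), y)
    return lambda t: periodic_kernel(t, x, h) @ alpha

x = np.sort(rng.uniform(0.0, 1.0, 200))
f_true = lambda t: np.sin(2 * np.pi * t) + 0.5 * np.cos(4 * np.pi * t)
y = f_true(x) + 0.2 * rng.normal(size=200)
f_hat = kernel_ridge(x, y)
grid = np.linspace(0.0, 1.0, 101)
max_err = float(np.max(np.abs(f_hat(grid) - f_true(grid))))
```

Because the kernel is periodic there are no boundary effects on [0, 1], mirroring the periodic setting of the paper.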
Lower bounds for volatility estimation in microstructure noise models. In Borrowing Strength: Theory Powering Applications— A Festschrift for Lawrence
Ann. Statist., 2010
Cited by 6 (0 self)
Abstract
In this paper minimax lower bounds are derived for the estimation of the instantaneous volatility in three related high-frequency statistical models. These bounds are based on new upper bounds for the Kullback–Leibler divergence between two multivariate normal random variables, along with a spectral analysis of the processes. A comparison with known upper bounds shows that these lower bounds are optimal. Our major finding is that the Gaussian microstructure noise introduces an additional degree of ill-posedness in each model.
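The quantity at the heart of this abstract, the Kullback–Leibler divergence between two multivariate normals, has a standard closed form. The helper below is mine (the paper derives sharper bounds for structured covariance pairs, which this generic formula does not capture):

```python
import numpy as np

# Closed form: KL( N(mu0, S0) || N(mu1, S1) )
#   = 0.5 * [ tr(S1^{-1} S0) + (mu1-mu0)' S1^{-1} (mu1-mu0) - d
#             + log(det S1 / det S0) ]
def kl_mvn(mu0, S0, mu1, S1):
    d = len(mu0)
    S1_inv = np.linalg.inv(S1)
    dm = mu1 - mu0
    return 0.5 * (np.trace(S1_inv @ S0) + dm @ S1_inv @ dm - d
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

I3 = np.eye(3)
zero = np.zeros(3)
shift = np.array([1.0, 0.0, 0.0])
kl_same = kl_mvn(zero, I3, zero, I3)      # identical laws: divergence 0
kl_shift = kl_mvn(zero, I3, shift, I3)    # pure mean shift: ||shift||^2 / 2
```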
ASYMPTOTIC EQUIVALENCE AND ADAPTIVE ESTIMATION FOR ROBUST NONPARAMETRIC REGRESSION
2009
Cited by 6 (4 self)
Abstract
Asymptotic equivalence theory has so far been developed in the literature only for bounded loss functions. This limits the potential applications of the theory because many commonly used loss functions in statistical inference are unbounded. In this paper we develop asymptotic equivalence results for robust nonparametric regression with unbounded loss functions. The results imply that all the Gaussian nonparametric regression procedures can be robustified in a unified way. A key step in our equivalence argument is to bin the data and then take the median of each bin. The asymptotic equivalence results have significant practical implications. To illustrate the general principles of the equivalence argument we consider two important nonparametric inference problems: robust estimation of the regression function and estimation of a quadratic functional. In both cases easily implementable procedures are constructed and shown to enjoy simultaneously a high degree of robustness and adaptivity. Other problems, such as construction of confidence sets and nonparametric hypothesis testing, can be handled in a similar fashion.
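The bin-then-median step described in this abstract is simple to sketch. The version below is my own (equal-width bins, a sine regression function, and Cauchy noise are my choices): bin the design points, replace each bin by its sample median, and the medians track the regression function even when the noise has no moments at all.

```python
import numpy as np

# Bin-then-median sketch (mine): medians of the bins act as approximately
# Gaussian pseudo-observations, even under Cauchy noise.
rng = np.random.default_rng(6)

def bin_medians(x, y, n_bins):
    edges = np.linspace(x.min(), x.max(), n_bins + 1)
    idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, n_bins - 1)
    centers = 0.5 * (edges[:-1] + edges[1:])
    meds = np.array([np.median(y[idx == b]) for b in range(n_bins)])
    return centers, meds

x = np.sort(rng.uniform(0.0, 1.0, 5000))
f = lambda t: np.sin(2 * np.pi * t)
y = f(x) + rng.standard_cauchy(5000)      # heavy tails: mean does not exist
centers, meds = bin_medians(x, y, n_bins=50)
max_err = float(np.max(np.abs(meds - f(centers))))
```

Any Gaussian-regression procedure can then be run on the (center, median) pairs, which is the robustification route the abstract outlines.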
A note on quantile coupling inequalities and their applications
Submitted, 2006. Available from www.stat.yale.edu/~hz68
Cited by 3 (2 self)
Abstract
A relationship between large deviations and quantile coupling is studied. We apply this relationship to the coupling of the sum of n i.i.d. symmetric random variables with a normal random variable, improving the classical quantile coupling inequalities (the key part in the celebrated KMT constructions) with a rate 1/√n for random variables with continuous distributions, or the same rate modulo constants in the general case. Applications to asymptotic equivalence theory and nonparametric function estimation are discussed.
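Quantile coupling itself is easy to demonstrate numerically. The sketch below is my own construction (the Rademacher example, n = 400, and the bulk cutoff are assumptions): build the standardized sum of n symmetric ±1 variables and a standard normal from the same randomness via S = F_n^{-1}(Φ(Z)), then inspect |S − Z|; the inequalities in the note control exactly this kind of coupling error.

```python
import math
import numpy as np

# Quantile-coupling sketch (mine): couple the standardized Binomial(400, 1/2)
# sum with a standard normal through a common uniform, S = F_n^{-1}(Phi(Z)).
rng = np.random.default_rng(7)

n = 400
logfact = np.concatenate(([0.0], np.cumsum(np.log(np.arange(1, n + 1)))))
k = np.arange(n + 1)
logpmf = logfact[n] - logfact[k] - logfact[n - k] + n * math.log(0.5)
cdf = np.cumsum(np.exp(logpmf))           # CDF of K ~ Binomial(400, 1/2)

def F_inv(u):
    """Quantile function of S = (2K - n) / sqrt(n)."""
    return (2 * np.searchsorted(cdf, u) - n) / math.sqrt(n)

z = rng.normal(size=20_000)
u = 0.5 * (1.0 + np.vectorize(math.erf)(z / math.sqrt(2.0)))   # u = Phi(z)
s = F_inv(u)
bulk_err = float(np.max(np.abs(s - z)[np.abs(z) < 3.0]))
```

In the bulk the coupling error is of the order of the lattice spacing 2/√n, which is the 1/√n behavior the note's inequalities sharpen.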