Results 1–10 of 22
Minimax Estimation via Wavelet Shrinkage
, 1992
Abstract

Cited by 322 (32 self)
We attempt to recover an unknown function from noisy, sampled data. Using orthonormal bases of compactly supported wavelets, we develop a nonlinear method which works in the wavelet domain by simple nonlinear shrinkage of the empirical wavelet coefficients. The shrinkage can be tuned to be nearly minimax over any member of a wide range of Triebel and Besov-type smoothness constraints, and asymptotically minimax over Besov bodies with p ≤ q. Linear estimates cannot achieve even the minimax rates over Triebel and Besov classes with p < 2, so our method can significantly outperform every linear method (kernel, smoothing spline, sieve, ...) in a minimax sense. Variants of our method based on simple threshold nonlinearities are nearly minimax. Our method possesses the interpretation of spatial adaptivity: it reconstructs using a kernel which may vary in shape and bandwidth from point to point, depending on the data. Least favorable distributions for certain of the Triebel and Besov scales generate objects with sparse wavelet transforms. Many real objects have similarly sparse transforms, which suggests that these minimax results are relevant for practical problems. Sequels to this paper discuss practical implementation, spatial adaptation properties and applications to inverse problems.
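The wavelet-domain shrinkage described above can be sketched in a few lines of numpy. The one-level Haar transform and the fixed soft-threshold value below are illustrative choices of mine, not the paper's tuned minimax rule:

```python
import numpy as np

def haar_dwt(x):
    """One level of the orthonormal Haar transform (len(x) must be even)."""
    s = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # scaling (coarse) coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # wavelet (detail) coefficients
    return s, d

def haar_idwt(s, d):
    """Invert one level of the orthonormal Haar transform."""
    x = np.empty(2 * len(s))
    x[0::2] = (s + d) / np.sqrt(2.0)
    x[1::2] = (s - d) / np.sqrt(2.0)
    return x

def soft_threshold(w, t):
    """Shrink coefficients toward the origin by t; |w| <= t becomes zero."""
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

# Denoise a piecewise-constant signal: shrink only the detail coefficients.
rng = np.random.default_rng(0)
clean = np.repeat([0.0, 4.0, -2.0, 1.0], 64)
noisy = clean + 0.5 * rng.standard_normal(clean.size)
s, d = haar_dwt(noisy)
est = haar_idwt(s, soft_threshold(d, t=1.0))
```

Because the clean signal's detail coefficients are zero away from the few jump points, thresholding kills mostly noise and keeps the sparse signal content.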
Wavelet shrinkage: asymptopia
 Journal of the Royal Statistical Society, Ser. B
, 1995
Abstract

Cited by 297 (36 self)
Considerable effort has been directed recently to develop asymptotically minimax methods in problems of recovering infinite-dimensional objects (curves, densities, spectral densities, images) from noisy data. A rich and complex body of work has evolved, with nearly or exactly minimax estimators being obtained for a variety of interesting problems. Unfortunately, the results have often not been translated into practice, for a variety of reasons: sometimes, similarity to known methods; sometimes, computational intractability; and sometimes, lack of spatial adaptivity. We discuss a method for curve estimation based on n noisy data; one translates the empirical wavelet coefficients towards the origin by an amount √(2 log n) · σ/√n. The method is different from methods in common use today, is computationally practical, and is spatially adaptive; thus it avoids a number of previous objections to minimax estimators. At the same time, the method is nearly minimax for a wide variety of loss functions (e.g. pointwise error, global error measured in L^p norms, pointwise and global error in estimation of derivatives) and for a wide range of smoothness classes, including standard Hölder classes, Sobolev classes, and Bounded Variation. This is a much broader near-optimality than anything previously proposed in the minimax literature. Finally, the theory underlying the method is interesting, as it exploits a correspondence between statistical questions and questions of optimal recovery and information-based complexity.
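A minimal sketch of the √(2 log n) threshold recipe, assuming the common convention of applying it in the coefficient domain and estimating σ from detail coefficients by the median absolute deviation (the MAD calibration is an assumption of mine, not stated in this abstract):

```python
import numpy as np

def universal_threshold(detail_coeffs, n):
    """VisuShrink-style universal threshold: sqrt(2 log n) * sigma_hat.

    sigma is estimated by the median absolute deviation of the wavelet
    detail coefficients, which is robust to a few large signal spikes.
    """
    sigma_hat = np.median(np.abs(detail_coeffs)) / 0.6745
    return np.sqrt(2.0 * np.log(n)) * sigma_hat

# Noise-only coefficients: almost all of them should fall below the threshold.
rng = np.random.default_rng(1)
n = 1024
d = rng.standard_normal(n)            # pure-noise detail coefficients
t = universal_threshold(d, n)
survivors = np.sum(np.abs(d) > t)     # expected to be (near) zero
```

The √(2 log n) factor is chosen so that the maximum of n i.i.d. Gaussian noise coefficients is unlikely to exceed the threshold, which is what makes the resulting reconstructions noise-free with high probability.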
Information-Theoretic Determination of Minimax Rates of Convergence
 Ann. Stat
, 1997
Abstract

Cited by 158 (24 self)
In this paper, we present some general results determining minimax bounds on statistical risk for density estimation based on certain information-theoretic considerations. These bounds depend only on metric entropy conditions and are used to identify the minimax rates of convergence.
Minimax Bayes, asymptotic minimax and sparse wavelet priors, in
 Sciences Paris (A
, 1994
Abstract

Cited by 46 (10 self)
Pinsker (1980) gave a precise asymptotic evaluation of the minimax mean squared error of estimation of a signal in Gaussian noise when the signal is known a priori to lie in a compact ellipsoid in Hilbert space. This 'Minimax Bayes' method can be applied to a variety of global nonparametric estimation settings with parameter spaces far from ellipsoidal. For example, it leads to a theory of exact asymptotic minimax estimation over norm balls in Besov and Triebel spaces using simple coordinatewise estimators and wavelet bases. This paper outlines some features of the method common to several applications. In particular, we derive new results on the exact asymptotic minimax risk over weak ℓ^p balls in R^n as n → ∞, and also for a class of 'local' estimators on the Triebel scale. By its very nature, the method reveals the structure of asymptotically least favorable distributions. Thus we may simulate 'least favorable' sample paths. We illustrate this for estimation of a signal in Gaussian white noise over norm balls in certain Besov spaces. In wavelet bases, when p < 2, the least favorable priors are sparse, and the resulting sample paths are strikingly different from those observed in Pinsker's ellipsoidal setting (p = 2).
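The remark about simulating sample paths can be caricatured by drawing sparse wavelet coefficients and inverting the transform. The spike-and-slab prior below is a hypothetical stand-in of mine, not the paper's least favorable distribution:

```python
import numpy as np

def haar_idwt(s, d):
    """Invert one level of the orthonormal Haar transform."""
    x = np.empty(2 * len(s))
    x[0::2] = (s + d) / np.sqrt(2.0)
    x[1::2] = (s - d) / np.sqrt(2.0)
    return x

def sparse_prior_path(n_levels, p=0.1, decay=0.7, rng=None):
    """Sample a path whose wavelet coefficients are nonzero only with
    probability p at each scale (a spike-and-slab caricature of sparsity)."""
    rng = np.random.default_rng() if rng is None else rng
    x = np.array([0.0])                      # coarsest scaling coefficient
    for j in range(n_levels):
        amp = decay ** j                     # coefficient amplitude shrinks with scale
        mask = rng.random(len(x)) < p        # sparse: few active coefficients
        d = amp * mask * rng.standard_normal(len(x))
        x = haar_idwt(x, d)                  # refine to the next finer scale
    return x

path = sparse_prior_path(n_levels=10, p=0.1, rng=np.random.default_rng(4))
```

Paths drawn this way are piecewise-flat with isolated bursts, quite unlike the stationary-looking Gaussian draws of the ellipsoidal (p = 2) setting.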
Thresholding Estimators for Linear Inverse Problems and Deconvolutions
, 2003
Abstract

Cited by 45 (2 self)
Thresholding algorithms in an orthonormal basis are studied to estimate noisy discrete signals degraded by a linear operator whose inverse is not bounded. For signals in a set Θ, sufficient conditions are established on the basis to obtain a maximum risk with minimax rates of convergence. Deconvolutions with kernels having a Fourier transform which vanishes at high frequencies are examples of unstable inverse problems, where a thresholding in a wavelet basis is a suboptimal estimator. A new "mirror wavelet" basis is constructed to obtain a deconvolution risk which is proved to be asymptotically equivalent to the minimax risk over bounded variation signals. This thresholding estimator is used to restore blurred satellite images.
Density and Hazard Rate Estimation for Right Censored Data Using Wavelet Methods
, 1997
Abstract

Cited by 32 (4 self)
This paper describes a wavelet method for the estimation of density and hazard rate functions from randomly right censored data. We adopt a nonparametric approach in assuming that the density and hazard rate have no specific parametric form. The method is based on dividing the time axis into a dyadic number of intervals and then counting the number of events within each interval. The number of events and the survival function of the observations are separately smoothed over time via linear wavelet smoothers, and the hazard rate function estimators are obtained by taking the ratio. We prove that the estimators possess pointwise and global mean square consistency, attain the best possible asymptotic MISE convergence rate, and are asymptotically normally distributed. We also describe simulation experiments that show these estimators are reasonably reliable in practice. The method is illustrated with two real examples. The first uses survival time data for patients with liver...
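The count-then-ratio construction can be sketched as follows, with the linear wavelet smoothing step omitted (raw occurrence/exposure ratios per bin stand in for the smoothers, purely for illustration):

```python
import numpy as np

def binned_hazard(times, events, n_bins=8, t_max=None):
    """Crude hazard estimate on a grid of time intervals.

    Per bin: (number of observed, uncensored events) divided by
    (total time at risk contributed in the bin) -- the ratio idea
    from the paper with the wavelet smoothing step left out.
    """
    if t_max is None:
        t_max = times.max()
    edges = np.linspace(0.0, t_max, n_bins + 1)
    width = edges[1] - edges[0]
    hazard = np.empty(n_bins)
    for i in range(n_bins):
        lo, hi = edges[i], edges[i + 1]
        # uncensored events observed inside [lo, hi)
        d = np.sum((times >= lo) & (times < hi) & (events == 1))
        # time at risk each subject contributes during [lo, hi)
        at_risk = np.sum(np.clip(times - lo, 0.0, width))
        hazard[i] = d / at_risk if at_risk > 0 else 0.0
    return edges, hazard

# Exponential lifetimes (true hazard 0.5) with random right censoring.
rng = np.random.default_rng(3)
n = 20000
life = rng.exponential(scale=2.0, size=n)      # true hazard = 1/2
cens = rng.exponential(scale=8.0, size=n)      # independent censoring times
times = np.minimum(life, cens)
events = (life <= cens).astype(int)            # 1 = event observed, 0 = censored
edges, hz = binned_hazard(times, events, n_bins=8, t_max=4.0)
```

With a constant true hazard every bin estimate should hover near 0.5; the paper's wavelet smoothers replace these raw ratios to gain adaptivity for non-constant hazards.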
Information Theoretic Methods in Probability and Statistics
, 2001
Abstract

Cited by 21 (0 self)
Ideas of information theory have found fruitful applications not only in various fields of science and engineering but also within mathematics, both pure and applied. This is illustrated by several typical applications of information theory specifically in probability and statistics.
Adaptive nonparametric estimation in heteroscedastic regression models. Part 1. Sharp non-asymptotic oracle inequalities
 Preprint of the Strasbourg Pasteur University, IRMA, 2007/09 (2007), available at http://hal.archivesouvertes.fr/hal/00179856/fr
Abstract

Cited by 19 (8 self)
We study asymptotic properties of the adaptive procedure proposed by Galtchouk and Pergamenshchikov (2007) for nonparametric estimation of an unknown regression function. We prove that this procedure is asymptotically efficient for a quadratic risk, i.e. we show that the asymptotic quadratic risk of this procedure coincides with the Pinsker constant, which gives a sharp lower bound for the quadratic risk over all possible estimates.
Universal Near Minimaxity of Wavelet Shrinkage
, 1995
Abstract

Cited by 15 (4 self)
We discuss a method for curve estimation based on n noisy data; one translates the empirical wavelet coefficients towards the origin by an amount √(2 log n) · σ/√n. The method is nearly minimax for a wide variety of loss functions (e.g. pointwise error, global error measured in L^p norms, pointwise and global error in estimation of derivatives) and for a wide range of smoothness classes, including standard Hölder classes, Sobolev classes, and Bounded Variation. This is a broader near-optimality than anything previously proposed in the minimax literature. The theory underlying the method exploits a correspondence between statistical questions and questions of optimal recovery and information-based complexity. This paper contains a detailed proof of the result announced in Donoho, Johnstone, Kerkyacharian & Picard (1995).
Nonparametric estimation of composite functions
 Ann. Stat
, 2009
Abstract

Cited by 12 (0 self)
We study the problem of nonparametric estimation of a multivariate function g: R^d → R that can be represented as a composition of two unknown smooth functions f: R → R and G: R^d → R. We suppose that f and G belong to known smoothness classes of functions, with smoothness γ and β, respectively. We obtain the full description of minimax rates of estimation of g in terms of γ and β, and propose rate-optimal estimators for the sup-norm loss. For the construction of such estimators, we first prove an approximation result for composite functions that may have independent interest, and then a result on adaptation to the local structure. Interestingly, the construction of rate-optimal estimators for composite functions (with given, fixed smoothness) needs adaptation, but not in the traditional sense: it is now adaptation to the local structure. We prove that composition models generate only two types of local structures: the local single-index model and the local model with roughness isolated to a single dimension (i.e., a model containing elements of both additive and single-index structure). We also find the zones of (γ, β) where no local structure is generated, as well as the zones where the composition modeling leads to faster rates, as compared to the classical nonparametric rates that depend only on the overall smoothness of g.