Results 1–10 of 40
De-Noising by Soft-Thresholding
, 1992
Abstract

Cited by 1249 (14 self)
Donoho and Johnstone (1992a) proposed a method for reconstructing an unknown function f on [0, 1] from noisy data d_i = f(t_i) + σ z_i, i = 0, …, n−1, t_i = i/n, z_i iid N(0, 1). The reconstruction f̂_n is defined in the wavelet domain by translating all the empirical wavelet coefficients of d towards 0 by an amount σ √(2 log n)/√n. We prove two results about that estimator. [Smooth]: With high probability f̂_n is at least as smooth as f, in any of a wide variety of smoothness measures. [Adapt]: The estimator comes nearly as close in mean square to f as any measurable estimator can come, uniformly over balls in each of two broad scales of smoothness classes. These two properties are unprecedented in several ways. Our proof of these results develops new facts about abstract statistical inference and its connection with an optimal recovery model.
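The shrinkage rule described in this abstract can be sketched in a few lines. This is an illustrative implementation, not the authors' code; it assumes the input is already a vector of empirical wavelet coefficients and that the noise level is σ = 1:

```python
import numpy as np

def soft_threshold(coeffs, n):
    """Translate wavelet coefficients towards 0 by t = sqrt(2 log n)/sqrt(n).

    Coefficients smaller than t in magnitude are set exactly to zero;
    the rest are shrunk towards the origin by t (noise level sigma = 1 assumed).
    """
    t = np.sqrt(2.0 * np.log(n)) / np.sqrt(n)
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - t, 0.0)
```

For n = 1024 the threshold is √(2 log 1024)/32 ≈ 0.116: large coefficients survive slightly shrunk, and small coefficients vanish entirely, which is what yields the [Smooth] property.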
Wavelet shrinkage: asymptopia
 Journal of the Royal Statistical Society, Ser. B
, 1995
Abstract

Cited by 297 (36 self)
Considerable effort has been directed recently to develop asymptotically minimax methods in problems of recovering infinite-dimensional objects (curves, densities, spectral densities, images) from noisy data. A rich and complex body of work has evolved, with nearly or exactly minimax estimators being obtained for a variety of interesting problems. Unfortunately, the results have often not been translated into practice, for a variety of reasons – sometimes, similarity to known methods, sometimes, computational intractability, and sometimes, lack of spatial adaptivity. We discuss a method for curve estimation based on n noisy data; one translates the empirical wavelet coefficients towards the origin by an amount √(2 log n)/√n. The method is different from methods in common use today, is computationally practical, and is spatially adaptive; thus it avoids a number of previous objections to minimax estimators. At the same time, the method is nearly minimax for a wide variety of loss functions – e.g. pointwise error, global error measured in L^p norms, pointwise and global error in estimation of derivatives – and for a wide range of smoothness classes, including standard Hölder classes, Sobolev classes, and Bounded Variation. This is a much broader near-optimality than anything previously proposed in the minimax literature. Finally, the theory underlying the method is interesting, as it exploits a correspondence between statistical questions and questions of optimal recovery and information-based complexity.
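The whole curve-estimation pipeline can be sketched end to end. The hand-rolled orthonormal Haar transform below is a stand-in for the wavelet bases actually used in this literature, and the function names and discrete-model threshold scaling σ √(2 log n) are illustrative assumptions, not the paper's exact construction:

```python
import numpy as np

def haar_dwt(x):
    # Full orthonormal Haar decomposition; len(x) must be a power of 2.
    coeffs, a = [], np.asarray(x, dtype=float)
    while len(a) > 1:
        coeffs.append((a[0::2] - a[1::2]) / np.sqrt(2))  # detail coefficients
        a = (a[0::2] + a[1::2]) / np.sqrt(2)             # approximation
    coeffs.append(a)
    return coeffs

def haar_idwt(coeffs):
    # Invert haar_dwt exactly (the transform is orthonormal).
    a = coeffs[-1]
    for d in reversed(coeffs[:-1]):
        out = np.empty(2 * len(a))
        out[0::2] = (a + d) / np.sqrt(2)
        out[1::2] = (a - d) / np.sqrt(2)
        a = out
    return a

def denoise(y, sigma):
    # In the sampled model y_i = f_i + sigma * z_i an orthonormal transform
    # keeps per-coefficient noise level sigma, so the threshold becomes
    # sigma * sqrt(2 log n) (the continuum normalization gives sqrt(2 log n)/sqrt(n)).
    n = len(y)
    t = sigma * np.sqrt(2.0 * np.log(n))
    coeffs = haar_dwt(y)
    shrunk = [np.sign(d) * np.maximum(np.abs(d) - t, 0.0) for d in coeffs[:-1]]
    return haar_idwt(shrunk + [coeffs[-1]])
```

Because every step is an orthonormal transform plus a scalar nonlinearity, the procedure is O(n) per level and needs no tuning beyond σ, which is the computational practicality the abstract emphasizes.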
Unconditional bases are optimal bases for data compression and for statistical estimation
 Applied and Computational Harmonic Analysis
, 1993
Abstract

Cited by 172 (21 self)
An orthogonal basis of L² which is also an unconditional basis of a functional space F is a kind of optimal basis for compressing, estimating, and recovering functions in F. Simple thresholding operations, applied in the unconditional basis, work essentially better for compressing, estimating, and recovering than they do in any other orthogonal basis. In fact, simple thresholding in an unconditional basis works essentially better for recovery and estimation than other methods, period. (Performance is measured in an asymptotic minimax sense.) As an application, we formalize and prove Mallat's Heuristic, which says that wavelet bases are optimal for representing functions containing singularities, when there may be an arbitrary number of singularities, arbitrarily distributed.
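The compression claim can be made concrete via best m-term approximation: keep the m largest-magnitude coefficients in the basis and zero the rest. This sketch (the names are mine, not the paper's) applies in any orthonormal basis, where the squared L² error of the approximation is exactly the energy of the discarded coefficients:

```python
import numpy as np

def best_m_term(coeffs, m):
    """Keep the m largest-magnitude coefficients, zero the rest.

    In an orthonormal basis this is hard thresholding at a data-dependent
    level, and the squared L2 approximation error equals the sum of the
    squares of the discarded coefficients.
    """
    out = np.zeros_like(coeffs)
    keep = np.argsort(np.abs(coeffs))[-m:]
    out[keep] = coeffs[keep]
    return out
```

The point of the unconditional-basis result is that for functions in F, the discarded-coefficient energy decays essentially as fast under this simple rule as under any competing scheme.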
Recovering Edges in Ill-Posed Inverse Problems: Optimality of Curvelet Frames
, 2000
Abstract

Cited by 78 (14 self)
We consider a model problem of recovering a function f(x1, x2) from noisy Radon data. The function f to be recovered is assumed smooth apart from a discontinuity along a C² curve – i.e. an edge. We use the continuum white noise model, with noise level ɛ. Traditional linear methods for solving such inverse problems behave poorly in the presence of edges. Qualitatively, the reconstructions are blurred near the edges; quantitatively, they give in our model Mean Squared Errors (MSEs) that tend to zero with noise level ɛ only as O(ɛ^1/2) as ɛ → 0. A recent innovation – nonlinear shrinkage in the wavelet domain – visually improves edge sharpness and improves MSE convergence to O(ɛ^2/3). However, as we show here, this rate is not optimal. In fact, essentially optimal performance is obtained by deploying the recently-introduced tight frames of curvelets in this setting. Curvelets are smooth, highly anisotropic elements ideally suited for detecting and synthesizing curved edges. To deploy them in the Radon setting, we construct a curvelet-based biorthogonal decomposition ...
Asymptotically Exact Nonparametric Hypothesis Testing in Sup-Norm and at a Fixed Point
, 1997
Abstract

Cited by 36 (4 self)
For the signal in Gaussian white noise model we consider the problem of testing the hypothesis H_0 : f ≡ 0 (the signal f is zero) against the nonparametric alternative H_1 : f ∈ Λ_ε, where Λ_ε is a set of functions on R¹ of the form Λ_ε = {f : f ∈ F, φ(f) ≥ C ψ_ε}. Here F is a Hölder or Sobolev class of functions, φ(f) is either the sup-norm of f or the value of f at a fixed point, C > 0 is a constant, ψ_ε is the minimax rate of testing and ε → 0 is the asymptotic parameter of the model. We find exact separation constants C* > 0 such that a test with the given summarized asymptotic errors of first and second type is possible for C > C* and is not possible for C < C*. We propose asymptotically minimax test statistics.

1 Introduction. Consider the stochastic process Y(t) defined on [0, 1] and satisfying the stochastic differential equation

dY(t) = f(t) dt + ε dW(t),   (1)

where W(t) is the standard Wiener process on [0, 1], f is an unknown real-valued function and 0 < ε < 1. Sup...
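A toy Monte Carlo version of this testing problem can be set up by discretizing the white-noise model into increments. The bin width, the max-of-bin-sums statistic, and the cutoffs below are illustrative choices of mine, not the paper's optimal construction:

```python
import numpy as np

def simulate_increments(f, n, eps, rng):
    # Discretization of dY(t) = f(t) dt + eps dW(t) on [0, 1]:
    # y_i = f(t_i)/n + eps * sqrt(1/n) * xi_i with xi_i iid N(0, 1).
    t = np.arange(n) / n
    return f(t) / n + eps * np.sqrt(1.0 / n) * rng.standard_normal(n)

def supnorm_stat(y, eps, bins):
    # Sum increments over each bin and normalize so each bin sum is N(0, 1)
    # under H0; the max over bins mimics a sup-norm test statistic.
    n = len(y)
    w = n // bins
    sums = y[: bins * w].reshape(bins, w).sum(axis=1)
    return np.max(np.abs(sums)) / (eps * np.sqrt(w / n))
```

Under H_0 the statistic behaves like the maximum of a few standard normals, while a signal of fixed height contributes a drift of order ‖f‖_∞ √(w/n)/ε per bin, which separates the hypotheses as ε → 0.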
Asymptotic minimaxity of wavelet estimators with sampled data
 Statistica Sinica
, 1999
Abstract

Cited by 24 (2 self)
Donoho and Johnstone (1997) studied a setting where data were obtained in the continuum white noise model and showed that scalar nonlinearities applied to wavelet coefficients gave estimators which were asymptotically minimax over Besov balls. They claimed that this implied similar asymptotic minimaxity results in the sampled-data model. In this paper we carefully develop and fully prove this implication. Our results are based on a careful definition of an empirical wavelet transform and precise bounds on the discrepancy between empirical wavelet coefficients and the theoretical wavelet coefficients.
The asymptotic minimax constant for sup-norm loss in nonparametric density estimation
 Bernoulli
, 1999
Abstract

Cited by 20 (4 self)
We develop the exact constant of the risk asymptotics in the uniform norm for density estimation. This constant has first been found for nonparametric regression and for signal estimation in Gaussian white noise. Hölder classes for arbitrary smoothness index β > 0 on the unit interval are considered. The constant involves the value of an optimal recovery problem as in the white noise case, but in addition it depends on the maximum of densities in the function class.
Inverse Problems as Statistics
 INVERSE PROBLEMS
, 2002
Abstract

Cited by 20 (3 self)
What mathematicians, scientists, engineers, and statisticians mean by "inverse problem" differs. For a statistician, an inverse problem is an inference or estimation problem...
Asymptotic equivalence of estimating a Poisson intensity and a positive diffusion drift
 The Annals of Statistics
Abstract

Cited by 16 (1 self)
We consider a diffusion model of small variance type with positive drift density varying in a nonparametric set. We investigate Gaussian and Poisson approximations to this model in the sense of asymptotic equivalence of experiments. It is shown that observation of the diffusion process until its first hitting time of level one is a natural model for the purpose of inference on the drift density. The diffusion model can be discretized by the collection of level crossing times for a uniform grid of levels. The random time increments are asymptotically sufficient and obey a nonparametric regression model with independent data. This decoupling is then used to establish asymptotic equivalence to Gaussian signal-in-white-noise and Poisson intensity models on the unit interval, and also to an i.i.d. model when the diffusion drift function f is a probability density. As an application, we find the exact asymptotic minimax constant for estimating the diffusion drift density with sup-norm loss.
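The discretization by level crossing times can be sketched with a simple Euler scheme. The drift, noise level, step size, and level grid below are illustrative assumptions, not the paper's setup:

```python
import numpy as np

def level_crossing_times(f, eps, levels, dt=1e-4, seed=0):
    # Euler scheme for dX_t = f(X_t) dt + eps dW_t, X_0 = 0, with positive
    # drift f; record the first time X crosses each level of the increasing grid.
    rng = np.random.default_rng(seed)
    x, t, times = 0.0, 0.0, []
    for lvl in levels:
        while x < lvl:
            x += f(x) * dt + eps * np.sqrt(dt) * rng.standard_normal()
            t += dt
        times.append(t)
    return times
```

With drift f ≡ 1 and small ε the process tracks X_t ≈ t, so the crossing time of level l is close to l; the increments between successive crossing times are the data of the nonparametric regression model the abstract describes.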
Consistency of Bayes estimates for nonparametric regression: normal theory
 BERNOULLI
, 1998