Results 1–10 of 145
Wavelet estimators in nonparametric regression: a comparative simulation study
 Journal of Statistical Software
, 2001
"... OATAO is an open access repository that collects the work of Toulouse researchers and makes it freely available over the web where possible. ..."
Abstract

Cited by 113 (18 self)
 Add to MetaCart
OATAO is an open access repository that collects the work of Toulouse researchers and makes it freely available over the web where possible.
Incorporating Information on Neighboring Coefficients into Wavelet Estimation
, 1999
"... In standard wavelet methods, the empirical wavelet coefficients are thresholded term by term, on the basis of their individual magnitudes. Information on other coefficients has no influence on the treatment of particular coefficients. We propose a wavelet shrinkage method that incorporates informati ..."
Abstract

Cited by 77 (11 self)
In standard wavelet methods, the empirical wavelet coefficients are thresholded term by term, on the basis of their individual magnitudes. Information on other coefficients has no influence on the treatment of particular coefficients. We propose a wavelet shrinkage method that incorporates information on neighboring coefficients into the decision making. The coefficients are considered in overlapping blocks; the treatment of coefficients in the middle of each block depends on the data in the whole block. The asymptotic and numerical performances of two particular versions of the estimator are investigated. We show that, asymptotically, one version of the estimator achieves the exact optimal rates of convergence over a range of Besov classes for global estimation, and attains adaptive minimax rate for estimating functions at a point. In numerical comparisons with various methods, both versions of the estimator perform excellently.
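The mechanism described above — overlapping blocks whose total energy drives the shrinkage of the centre coefficient — can be sketched in a few lines. This is a toy illustration only: the threshold constant below is a generic 2σ²log(n) choice and the block length is arbitrary, not the calibrated values from the paper.

```python
import numpy as np

def neighbour_shrink(coeffs, sigma, block=3):
    """Shrink each wavelet coefficient using the energy of a small
    overlapping block centred on it, so neighbours influence the decision.
    Illustrative sketch: lam = 2*sigma^2*log(n) is a generic threshold,
    not the constant derived in the paper."""
    coeffs = np.asarray(coeffs, dtype=float)
    n = len(coeffs)
    lam = 2.0 * sigma**2 * np.log(n)
    half = block // 2
    out = np.zeros(n)
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half + 1)
        s2 = np.sum(coeffs[lo:hi] ** 2)  # block energy around coefficient i
        if s2 > 0:
            out[i] = coeffs[i] * max(0.0, 1.0 - lam / s2)  # James-Stein-type factor
    return out
```

A coefficient sitting next to large neighbours survives even when it is individually small, which is exactly the behaviour term-by-term thresholding cannot produce.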
Wavelet Analysis and Its Statistical Applications
, 1999
"... In recent years there has been a considerable development in the use of wavelet methods in statistics. As a result, we are now at the stage where it is reasonable to consider such methods to be another standard tool of the applied statistician rather than a research novelty. With that in mind, this ..."
Abstract

Cited by 61 (13 self)
In recent years there has been a considerable development in the use of wavelet methods in statistics. As a result, we are now at the stage where it is reasonable to consider such methods to be another standard tool of the applied statistician rather than a research novelty. With that in mind, this article is intended to give a relatively accessible introduction to standard wavelet analysis and to provide an up-to-date review of some common uses of wavelet methods in statistical applications. It is primarily orientated towards the general statistical audience who may be involved in analysing data where the use of wavelets might be effective, rather than to researchers already familiar with the field. Given that objective, we do not emphasise mathematical generality or rigour in our exposition of wavelets and we restrict our discussion to the more frequently employed wavelet methods in statistics. We provide extensive references where the ideas and concepts discussed can be followed up in...
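As a concrete instance of the "standard tool" this review has in mind, here is a minimal single-level orthonormal Haar transform pair (the simplest wavelet), written from the textbook definition rather than taken from the article itself:

```python
import numpy as np

def haar_dwt(x):
    """Single-level orthonormal Haar transform: pairwise averages
    (approximation) and differences (detail), each scaled by 1/sqrt(2)."""
    x = np.asarray(x, dtype=float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return a, d

def haar_idwt(a, d):
    """Exact inverse of haar_dwt (the transform is orthonormal)."""
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2.0)
    x[1::2] = (a - d) / np.sqrt(2.0)
    return x
```

Wavelet denoising then amounts to transforming, shrinking the detail coefficients `d`, and inverting.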
Modern statistical estimation via oracle inequalities
, 2006
"... A number of fundamental results in modern statistical theory involve thresholding estimators. This survey paper aims at reconstructing the history of how thresholding rules came to be popular in statistics and describing, in a not overly technical way, the domain of their application. Two notions pl ..."
Abstract

Cited by 41 (0 self)
A number of fundamental results in modern statistical theory involve thresholding estimators. This survey paper aims at reconstructing the history of how thresholding rules came to be popular in statistics and describing, in a not overly technical way, the domain of their application. Two notions play a fundamental role in our narrative: sparsity and oracle inequalities. Sparsity is a property of the object to estimate, which seems to be characteristic of many modern problems, in statistics as well as applied mathematics and theoretical computer science, to name a few. ‘Oracle inequalities’ are a powerful decision-theoretic tool which has served to understand the optimality of thresholding rules, but which has many other potential applications, some of which we will discuss. Our story is also the story of the dialogue between statistics and applied harmonic analysis. Starting with the work of Wiener, we will see that certain representations emerge as being optimal for estimation. A leitmotif throughout
Accurate Prediction of Phase Transitions in Compressed Sensing via a Connection to Minimax Denoising
"... Compressed sensing posits that, within limits, one can undersample a sparse signal and yet reconstruct it accurately. Knowing the precise limits to such undersampling is important both for theory and practice. We present a formula that characterizes the allowed undersampling of generalized sparse ob ..."
Abstract

Cited by 40 (4 self)
Compressed sensing posits that, within limits, one can undersample a sparse signal and yet reconstruct it accurately. Knowing the precise limits to such undersampling is important both for theory and practice. We present a formula that characterizes the allowed undersampling of generalized sparse objects. The formula applies to Approximate Message Passing (AMP) algorithms for compressed sensing, which are here generalized to employ denoising operators besides the traditional scalar soft thresholding denoiser. This paper gives several examples including scalar denoisers not derived from convex penalization – the firm shrinkage nonlinearity and the minimax nonlinearity – and also non-scalar denoisers – block thresholding, monotone regression, and total variation minimization. Let the variables ε = k/N and δ = n/N denote the generalized sparsity and undersampling fractions for sampling the k-generalized-sparse N-vector x0 according to y = Ax0. Here A is an n × N measurement matrix whose entries are iid standard Gaussian. The formula states that the phase transition curve δ = δ(ε) separating successful from unsuccessful reconstruction of x0
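A toy version of the AMP iteration with the scalar soft-threshold denoiser (one of the denoisers named above) fits in a few lines. The threshold rule (a fixed multiple of the empirical residual level) and the iteration count are illustrative choices, not the paper's tuned state-evolution prescription:

```python
import numpy as np

def soft(u, t):
    """Scalar soft-thresholding denoiser."""
    return np.sign(u) * np.maximum(np.abs(u) - t, 0.0)

def amp_recover(y, A, iters=50, alpha=1.5):
    """Sketch of AMP for y = A @ x0 with iid Gaussian A (columns ~ unit norm).
    alpha scales the threshold relative to the residual level; the Onsager
    term feeds back the average derivative of the denoiser."""
    n, N = A.shape
    x = np.zeros(N)
    z = y.copy()
    for _ in range(iters):
        tau = alpha * np.sqrt(np.mean(z ** 2))   # effective noise level
        x = soft(x + A.T @ z, tau)
        onsager = z * (np.count_nonzero(x) / n)  # (1/delta) * mean eta'
        z = y - A @ x + onsager
    return x
```

Without the Onsager correction this would be plain iterative soft thresholding; with it, the effective noise at each iteration behaves as if Gaussian, which is what makes the phase-transition formula tractable.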
On block thresholding in wavelet regression: adaptivity, block size and threshold level
 Statist. Sinica, 12(4):1241–1273
, 2002
"... Abstract: In this article we investigate the asymptotic and numerical properties of a class of block thresholding estimators for wavelet regression. We consider the effect of block size on global and local adaptivity and the ch oice of thresholding constant. The optimal rate of convergence for block ..."
Abstract

Cited by 40 (8 self)
In this article we investigate the asymptotic and numerical properties of a class of block thresholding estimators for wavelet regression. We consider the effect of block size on global and local adaptivity and the choice of thresholding constant. The optimal rate of convergence for block thresholding with a given block size is derived for both global and local estimation. It is shown that there are conflicting requirements on the block size for achieving global and local adaptivity. We then consider the choice of thresholding constant for a given block size by treating the block thresholding as a hypothesis testing problem. The combined results lead naturally to an optimal choice of block size and thresholding constant. We conclude with a numerical study which compares the finite-sample performance among block thresholding estimators as well as with other wavelet methods. Key words and phrases: Block thresholding, convergence rate, global adaptivity, local adaptivity, minimax estimation, nonparametric regression, smoothing parameter, wavelets.
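The hypothesis-testing argument mentioned above yields a specific thresholding constant, and the resulting non-overlapping block rule is often called BlockJS. The sketch below uses λ ≈ 4.505 (the root of λ − log λ = 3) as the constant from that line of work; treat both the constant and the block length as indicative rather than authoritative.

```python
import numpy as np

def block_js(d, sigma, L):
    """Non-overlapping block thresholding: every coefficient in a block of
    length L is scaled by the same James-Stein-type factor driven by the
    block's total energy S2. lam ~ 4.505 solves lam - log(lam) = 3."""
    lam = 4.50524
    d = np.asarray(d, dtype=float)
    out = np.zeros_like(d)
    for start in range(0, len(d), L):
        blk = d[start:start + L]
        S2 = np.sum(blk ** 2)
        if S2 > 0:
            out[start:start + L] = blk * max(0.0, 1.0 - lam * len(blk) * sigma**2 / S2)
    return out
```

Large blocks favour global adaptivity and small blocks local adaptivity, which is precisely the conflict the abstract describes.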
A survey on wavelet applications in data mining
 SIGKDD Explor. Newsl
"... Recently there has been significant development in the use of wavelet methods in various data mining processes. However, there has been written no comprehensive survey available on the topic. The goal of this is paper to fill the void. First, the paper presents a highlevel datamining framework tha ..."
Abstract

Cited by 37 (4 self)
Recently there has been significant development in the use of wavelet methods in various data mining processes. However, no comprehensive survey on the topic has been written. This paper aims to fill that void. First, the paper presents a high-level data-mining framework that reduces the overall process into smaller components. Then applications of wavelets for each component are reviewed. The paper concludes by discussing the impact of wavelets on data mining research and outlining potential future research directions and applications.
Audio Denoising by Time-frequency Block Thresholding
, 2007
"... Removing noise from audio signals requires a nondiagonal processing of timefrequency coefficients to avoid producing “musical noise”. State of the art algorithms perform a parameterized filtering of spectrogram coefficients with empirically fixed parameters. A block thresholding estimation procedu ..."
Abstract

Cited by 35 (5 self)
Removing noise from audio signals requires non-diagonal processing of time-frequency coefficients to avoid producing “musical noise”. State-of-the-art algorithms perform a parameterized filtering of spectrogram coefficients with empirically fixed parameters. A block thresholding estimation procedure is introduced, which adjusts all parameters adaptively to the signal's properties by minimizing a Stein estimate of the risk. Numerical experiments demonstrate the performance and robustness of this procedure through objective and subjective evaluations.
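A stripped-down version of block attenuation on a spectrogram can illustrate why grouping suppresses musical noise; note that the Stein-risk adaptation of block shapes, which is the paper's actual contribution, is omitted here, and the block shape and λ below are arbitrary illustrative values:

```python
import numpy as np

def tf_block_attenuate(S, sigma, bt=2, bf=4, lam=2.5):
    """Apply one attenuation factor per (bt x bf) time-frequency block of
    the spectrogram S, based on the block's mean energy. Isolated noisy
    coefficients are never kept on their own, which suppresses the
    isolated surviving peaks heard as musical noise."""
    S = np.asarray(S)
    out = np.zeros_like(S)
    T, F = S.shape
    for i in range(0, T, bt):
        for j in range(0, F, bf):
            blk = S[i:i + bt, j:j + bf]
            energy = np.mean(np.abs(blk) ** 2)
            a = max(0.0, 1.0 - lam * sigma**2 / energy) if energy > 0 else 0.0
            out[i:i + bt, j:j + bf] = a * blk
    return out
```

In a real denoiser `S` would be a complex STFT and the attenuated coefficients would be inverse-transformed back to a waveform.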
Convex and network flow optimization for structured sparsity
 JMLR
, 2011
"... We consider a class of learning problems regularized by a structured sparsityinducing norm defined as the sum of ℓ2 or ℓ∞norms over groups of variables. Whereas much effort has been put in developing fast optimization techniques when the groups are disjoint or embedded in a hierarchy, we address ..."
Abstract

Cited by 35 (9 self)
We consider a class of learning problems regularized by a structured sparsity-inducing norm defined as the sum of ℓ2- or ℓ∞-norms over groups of variables. Whereas much effort has been put into developing fast optimization techniques when the groups are disjoint or embedded in a hierarchy, we address here the case of general overlapping groups. To this end, we present two different strategies: On the one hand, we show that the proximal operator associated with a sum of ℓ∞-norms can be computed exactly in polynomial time by solving a quadratic min-cost flow problem, allowing the use of accelerated proximal gradient methods. On the other hand, we use proximal splitting techniques, and address an equivalent formulation with non-overlapping groups, but in higher dimension and with additional constraints. We propose efficient and scalable algorithms exploiting these two strategies, which are significantly faster than alternative approaches. We illustrate these methods with several problems such as CUR matrix factorization, multi-task learning of tree-structured dictionaries, background subtraction in video sequences, image denoising with wavelets, and topographic dictionary learning of natural image patches.
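The easy building block underlying all of this is the proximal operator for *disjoint* ℓ2-groups (block soft-thresholding), which has a closed form; the paper's contribution is the much harder overlapping case, solved via min-cost flow for ℓ∞-groups. A sketch of the disjoint case only:

```python
import numpy as np

def prox_group_l2(v, groups, lam):
    """Proximal operator of lam * sum_g ||v_g||_2 over disjoint groups:
    each group is zeroed if its norm is below lam, otherwise shrunk
    toward zero by lam in norm (block soft-thresholding)."""
    v = np.asarray(v, dtype=float)
    out = v.copy()
    for g in groups:
        nrm = np.linalg.norm(v[g])
        out[g] = 0.0 if nrm <= lam else (1.0 - lam / nrm) * v[g]
    return out
```

With overlapping groups this per-group formula no longer applies, because shrinking one group changes the norms of the groups it overlaps; that coupling is what the flow formulation resolves.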
Adaptive Bayesian estimation using a Gaussian random field with inverse Gamma bandwidth
 The Annals of Statistics 37
, 2009
"... We consider nonparametric Bayesian estimation inference using a rescaled smooth Gaussian field as a prior for a multidimensional function. The rescaling is achieved using a Gamma variable and the procedure can be viewed as choosing an inverse Gamma bandwidth. The procedure is studied from a frequent ..."
Abstract

Cited by 32 (2 self)
We consider nonparametric Bayesian inference using a rescaled smooth Gaussian field as a prior for a multidimensional function. The rescaling is achieved using a Gamma variable and the procedure can be viewed as choosing an inverse Gamma bandwidth. The procedure is studied from a frequentist perspective in three statistical settings involving replicated observations (density estimation, regression and classification). We prove that the resulting posterior distribution shrinks to the distribution that generates the data at a speed which is minimax-optimal up to a logarithmic factor, whatever the regularity level of the data-generating distribution. Thus the hierarchical Bayesian procedure, with a fixed prior, is shown to be fully adaptive.
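A small sketch of what a draw from such a prior looks like: sample a Gamma rescaling variable, use it to set the length scale of a squared-exponential field, and sample that field at a grid of points, so the bandwidth is inverse-Gamma distributed. The hyperparameters and kernel are illustrative assumptions, not the paper's exact construction:

```python
import numpy as np

def rescaled_gp_sample(x, rng, a=1.0, b=1.0):
    """Draw from a rescaled smooth Gaussian field: sample g ~ Gamma(a, 1/b)
    (so the bandwidth 1/g is inverse-Gamma), build the squared-exponential
    covariance at scale g, and sample the field at the points x."""
    x = np.asarray(x, dtype=float)
    g = rng.gamma(a, 1.0 / b)                                # random rescaling
    K = np.exp(-0.5 * (g * (x[:, None] - x[None, :])) ** 2)  # SE covariance
    K += 1e-8 * np.eye(len(x))                               # jitter for Cholesky
    return np.linalg.cholesky(K) @ rng.standard_normal(len(x))
```

The random rescaling is the source of the adaptivity: small g gives very smooth draws, large g gives rough ones, and the posterior can concentrate on whichever matches the regularity of the truth.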