Translation-Invariant De-Noising
, 1995
Abstract

Cited by 307 (7 self)
De-noising with the traditional (orthogonal, maximally-decimated) wavelet transform sometimes exhibits visual artifacts; we attribute some of these – for example, Gibbs phenomena in the neighborhood of discontinuities – to the lack of translation invariance of the wavelet basis. One method to suppress such artifacts, termed “cycle spinning” by Coifman, is to “average out” the translation dependence. For a range of shifts, one shifts the data (right or left as the case may be), de-noises the shifted data, and then unshifts the de-noised data. Doing this for each of a range of shifts, and averaging the several results so obtained, produces a reconstruction subject to far weaker Gibbs phenomena than thresholding-based de-noising using the traditional orthogonal wavelet transform. Cycle-spinning over the range of all circulant shifts can be accomplished in order n log2(n) time; it is equivalent to de-noising using the undecimated or stationary wavelet transform. Cycle-spinning exhibits benefits outside of wavelet de-noising, for example in cosine packet de-noising, where it helps suppress ‘clicks’. It also has a counterpart in frequency-domain de-noising, where the goal of translation invariance is replaced by modulation invariance, and the central shift-denoise-unshift operation is replaced by modulate-denoise-demodulate. We illustrate these concepts with extensive computational examples; all figures presented here are reproducible using the WaveLab software package.
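The shift-denoise-unshift-average recipe described in this abstract can be sketched in a few lines of NumPy. This is an illustrative sketch, not the WaveLab implementation: for brevity it uses a single-level Haar transform with soft thresholding, assumes an even-length signal, and the helper names `haar_denoise` and `cycle_spin_denoise` are hypothetical.

```python
import numpy as np

def haar_denoise(x, thresh):
    """One-level orthonormal Haar transform + soft thresholding of details."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail coefficients
    d = np.sign(d) * np.maximum(np.abs(d) - thresh, 0.0)  # soft threshold
    y = np.empty_like(x)                   # invert the Haar transform
    y[0::2] = (a + d) / np.sqrt(2)
    y[1::2] = (a - d) / np.sqrt(2)
    return y

def cycle_spin_denoise(x, thresh):
    """Average shift -> denoise -> unshift over all circulant shifts."""
    n = len(x)
    acc = np.zeros(n)
    for s in range(n):
        acc += np.roll(haar_denoise(np.roll(x, s), thresh), -s)
    return acc / n
```

This brute-force loop costs O(n^2); the abstract's O(n log2(n)) figure comes from computing the equivalent undecimated (stationary) wavelet transform directly rather than shifting explicitly.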
Noise Reduction Using an Undecimated Discrete Wavelet Transform
, 1995
Abstract

Cited by 122 (7 self)
A new nonlinear noise reduction method is presented that uses the discrete wavelet transform. Similar to Donoho and Johnstone, we employ thresholding in the wavelet transform domain but, following a suggestion by Coifman, we use an undecimated, shift-invariant, non-orthogonal wavelet transform instead of the usual orthogonal one. This new approach can be interpreted as a repeated application of the original Donoho and Johnstone method for different shifts. The main feature of the new algorithm is significantly improved noise reduction compared to the original wavelet-based approach, both in the l2 error and visually, for a large class of signals. This is shown both theoretically and by experimental results.
Dictionaries for Sparse Representation Modeling
Abstract

Cited by 109 (4 self)
Sparse and redundant representation modeling of data assumes an ability to describe signals as linear combinations of a few atoms from a pre-specified dictionary. As such, the choice of the dictionary that sparsifies the signals is crucial for the success of this model. In general, a proper dictionary can be chosen in one of two ways: (i) building a sparsifying dictionary based on a mathematical model of the data, or (ii) learning a dictionary to perform best on a training set. In this paper we describe the evolution of these two paradigms. As manifestations of the first approach, we cover topics such as wavelets, wavelet packets, contourlets, and curvelets, all aiming to exploit 1-D and 2-D mathematical models for constructing effective dictionaries for signals and images. Dictionary learning takes a different route, attaching the dictionary to a set of examples it is supposed to serve. From the seminal work of Field and Olshausen, through the MOD, the K-SVD, the Generalized PCA and others, this paper surveys the various options such training has to offer, up to the most recent contributions and structures.
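As a concrete instance of "describing signals as linear combinations of a few atoms", here is a minimal sketch of Orthogonal Matching Pursuit (OMP), a standard greedy pursuit commonly paired with such dictionaries; it is not an algorithm introduced in this paper, and the function name is illustrative.

```python
import numpy as np

def omp(D, x, k):
    """Greedily select k atoms (columns of D) to approximate signal x."""
    residual = x.copy()
    support = []
    for _ in range(k):
        # pick the atom most correlated with the current residual
        j = int(np.argmax(np.abs(D.T @ residual)))
        support.append(j)
        # least-squares refit on all atoms chosen so far
        coef, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        residual = x - D[:, support] @ coef
    alpha = np.zeros(D.shape[1])   # sparse coefficient vector
    alpha[support] = coef
    return alpha
```

After each refit the residual is orthogonal to the selected atoms, which is what distinguishes OMP from plain matching pursuit.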
Wavelet shrinkage using cross-validation
, 1996
Abstract

Cited by 96 (14 self)
Wavelets are orthonormal basis functions with special properties that show potential in many areas of mathematics and statistics. This article concentrates on the estimation of functions and images from noisy data using wavelet shrinkage. A modified form of two-fold cross-validation is introduced to choose a threshold for wavelet shrinkage estimators operating on data sets whose length is a power of two. The cross-validation algorithm is then extended to data sets of any length and to multidimensional data sets. The algorithms are compared to established threshold choosers using simulation. An application to a real data set arising from anaesthesia is presented.
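The two-fold idea, splitting the signal into even- and odd-indexed halves, denoising one half at a candidate threshold, and scoring its interpolated prediction against the other half, can be sketched as follows. This is a rough illustration of the principle, not Nason's exact algorithm: it uses a one-level Haar shrinkage, assumes the length is divisible by four, and the helper names are hypothetical.

```python
import numpy as np

def haar_shrink(x, t):
    """One-level Haar transform with soft-thresholded details."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)
    d = np.sign(d) * np.maximum(np.abs(d) - t, 0.0)
    y = np.empty_like(x)
    y[0::2] = (a + d) / np.sqrt(2)
    y[1::2] = (a - d) / np.sqrt(2)
    return y

def cv_score(x, t):
    """Two-fold CV: denoise the even-indexed half, predict the odd half."""
    even, odd = x[0::2], x[1::2]
    fit = haar_shrink(even, t)
    # linearly interpolate the even-half estimate to the odd positions
    pred = 0.5 * (fit + np.roll(fit, -1))
    return np.mean((pred - odd) ** 2)
```

A threshold is then chosen by minimizing the score over a grid, e.g. `min(grid, key=lambda t: cv_score(x, t))`, and in practice the same is done with the roles of the halves swapped and the two scores averaged.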
Long-Term Forecasting of Internet Backbone Traffic: Observations and Initial Models
 In IEEE INFOCOM
, 2003
Abstract

Cited by 78 (4 self)
We introduce a methodology to predict when and where link additions/upgrades have to take place in an IP backbone network. Using SNMP statistics, collected continuously since 1999, we compute the aggregate demand between any two adjacent PoPs and look at its evolution at time scales larger than one hour. We show that IP backbone traffic exhibits visible long-term trends, strong periodicities, and variability at multiple time scales.
Wavelet Processes and Adaptive Estimation of the Evolutionary Wavelet Spectrum
, 1998
Abstract

Cited by 76 (30 self)
This article defines and studies a new class of non-stationary random processes constructed from discrete non-decimated wavelets, which generalizes the Cramér (Fourier) representation of stationary time series. We define an evolutionary wavelet spectrum (EWS) which quantifies how process power varies locally over time and scale. We show how the EWS may be rigorously estimated by a smoothed wavelet periodogram and how both these quantities may be inverted to provide an estimable time-localized autocovariance. We illustrate our theory with a pedagogical example based on discrete non-decimated Haar wavelets and also a real medical time series example.
A Haar-Fisz algorithm for Poisson intensity estimation
 J. Comput. Graph. Stat
, 2004
Abstract

Cited by 74 (20 self)
This article introduces a new method for the estimation of the intensity of an inhomogeneous one-dimensional Poisson process. The Haar-Fisz transformation transforms a vector of binned Poisson counts to approximate normality with variance one. Hence we can use any suitable Gaussian wavelet shrinkage method to estimate the Poisson intensity. Since the Haar-Fisz operator does not commute with the shift operator, we can dramatically improve accuracy by always cycle spinning before the Haar-Fisz transform, as well as optionally after. Extensive simulations show that our approach usually significantly outperformed state-of-the-art competitors but was occasionally comparable. Our method is fast, simple, automatic, and easy to code. Our technique is applied to the estimation of the intensity of earthquakes in northern California. We show that our technique gives visually similar results to the current state of the art.
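A minimal sketch of the Haar-Fisz idea: at every Haar level, divide each detail coefficient by the square root of its local mean (the corresponding smooth coefficient), then invert the transform. This is an illustrative reimplementation, not the authors' code; it uses the unnormalised Haar filters and assumes the input length is a power of two.

```python
import numpy as np

def haar_fisz(counts):
    """Variance-stabilise a vector of binned Poisson counts (length 2^J)."""
    x = np.asarray(counts, dtype=float)
    # forward pass: unnormalised Haar pyramid with the Fisz division
    smooth, details = x, []
    while len(smooth) > 1:
        s = (smooth[0::2] + smooth[1::2]) / 2.0   # local means
        d = (smooth[0::2] - smooth[1::2]) / 2.0   # local differences
        # Fisz step: d / sqrt(s), with 0 where the local mean is 0
        f = np.where(s > 0, d / np.sqrt(np.where(s > 0, s, 1.0)), 0.0)
        details.append(f)
        smooth = s
    # inverse pass: rebuild the signal from the stabilised coefficients
    y = smooth
    for f in reversed(details):
        up = np.empty(2 * len(y))
        up[0::2] = y + f
        up[1::2] = y - f
        y = up
    return y
```

The output is (approximately) Gaussian with unit variance, so any standard Gaussian wavelet shrinkage can then be applied before inverting the stabilisation.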
Multiscale Modeling and Estimation of Poisson Processes with Application to Photon-Limited Imaging
 IEEE TRANS. ON INFO. THEORY
, 1999
Abstract

Cited by 72 (10 self)
Many important problems in engineering and science are well modeled by Poisson processes. In many applications it is of great interest to accurately estimate the intensities underlying observed Poisson data. In particular, this work is motivated by photon-limited imaging problems. This paper studies a new Bayesian approach to Poisson intensity estimation based on the Haar wavelet transform. It is shown that the Haar transform provides a very natural and powerful framework for this problem. Using this framework, a novel multiscale Bayesian prior for modeling intensity functions is devised. The new prior leads to a simple Bayesian intensity estimation procedure. Furthermore, we characterize the correlation behavior of the new prior and show that it has 1/f spectral characteristics. The new framework is applied to photon-limited image estimation, and its potential to improve nuclear medicine imaging is examined.
Time Invariant Orthonormal Wavelet Representations
Abstract

Cited by 71 (9 self)
A simple construction of an orthonormal basis starting from a so-called mother wavelet, together with an efficient implementation, gained the wavelet decomposition easy acceptance and generated great research interest in its applications. An orthonormal basis may not, however, always be a suitable representation of a signal, particularly when time (or space) invariance is a required property. The conventional way around this problem is to use a redundant decomposition. In this paper, we address the time-invariance problem for orthonormal wavelet transforms and propose an extension to wavelet packet decompositions. We show that it is possible to achieve time invariance while preserving orthonormality. We subsequently propose an efficient approach to obtain such a decomposition. We demonstrate the importance of our method by considering application examples in signal reconstruction and time delay estimation.
The SURE-LET Approach to Image Denoising
, 2007
Abstract

Cited by 68 (18 self)
We propose a new approach to image denoising, based on the image-domain minimization of an estimate of the mean squared error: Stein’s unbiased risk estimate (SURE). Unlike most existing denoising algorithms, using SURE makes it needless to hypothesize a statistical model for the noiseless image. A key point of our approach is that, although the (nonlinear) processing is performed in a transformed domain (typically an undecimated discrete wavelet transform, but we also address non-orthonormal transforms), this minimization is performed in the image domain. Indeed, we demonstrate that, when the transform is a “tight” frame (an undecimated wavelet transform using orthonormal filters), separate subband minimization yields substantially worse results. In order for our approach to be viable, we add another principle: that the denoising process can be expressed as a linear combination of elementary denoising processes, a linear expansion of thresholds (LET). Armed with the SURE and LET principles, we show that a denoising algorithm merely amounts to solving a linear system of equations, which is obviously fast and efficient. Quite remarkably, the very competitive results obtained by performing a simple threshold (image-domain SURE-optimized) on the undecimated Haar wavelet coefficients show that the SURE-LET principle has huge potential.
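The SURE principle at the heart of this method can be illustrated with the classical closed-form unbiased risk estimate for a single soft threshold (Donoho and Johnstone's SureShrink setting). This sketch does not reproduce the paper's algorithm, which minimizes SURE over a linear expansion of thresholds in the image domain; the function names and grid-search strategy here are illustrative.

```python
import numpy as np

def sure_soft(w, t, sigma=1.0):
    """Stein's unbiased estimate of the risk of soft-thresholding w at t."""
    n = len(w)
    clipped = np.minimum(np.abs(w), t)
    return (n * sigma**2
            - 2.0 * sigma**2 * np.sum(np.abs(w) <= t)   # divergence term
            + np.sum(clipped**2))                        # shrinkage term

def sure_threshold(w, sigma=1.0):
    """Pick the threshold minimising SURE over the coefficient magnitudes."""
    candidates = np.concatenate(([0.0], np.abs(w)))
    scores = [sure_soft(w, t, sigma) for t in candidates]
    return candidates[int(np.argmin(scores))]
```

Because the estimate depends only on the noisy coefficients and the noise level sigma, no model of the clean image is needed, which is exactly the property the abstract emphasizes.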