From Sparse Solutions of Systems of Equations to Sparse Modeling of Signals and Images
, 2007
Abstract

Cited by 423 (37 self)
A full-rank matrix A ∈ ℝ^{n×m} with n < m generates an underdetermined system of linear equations Ax = b having infinitely many solutions. Suppose we seek the sparsest solution, i.e., the one with the fewest nonzero entries: can it ever be unique? If so, when? As optimization of sparsity is combinatorial in nature, are there efficient methods for finding the sparsest solution? These questions have been answered positively and constructively in recent years, exposing a wide variety of surprising phenomena; in particular, the existence of easily verifiable conditions under which optimally sparse solutions can be found by concrete, effective computational methods. Such theoretical results inspire a bold perspective on some important practical problems in signal and image processing. Several well-known signal and image processing problems can be cast as demanding solutions of underdetermined systems of equations. Such problems have previously seemed, to many, intractable. There is considerable evidence that these problems often have sparse solutions. Hence, advances in finding sparse solutions to underdetermined systems energize research on such signal and image processing problems, to striking effect. In this paper we review the theoretical results on sparse solutions of linear systems, empirical …
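One of the easily verifiable conditions alluded to in this abstract is the mutual-coherence bound: if Ax = b has a solution with fewer than (1 + 1/μ(A))/2 nonzeros, where μ(A) is the largest normalized inner product between distinct columns of A, that solution is provably the unique sparsest one. A minimal NumPy sketch of checking this bound (the matrix and function names are illustrative, not taken from the paper):

```python
import numpy as np

def mutual_coherence(A):
    """Largest absolute inner product between distinct normalized columns of A."""
    An = A / np.linalg.norm(A, axis=0)   # normalize each column
    G = np.abs(An.T @ An)                # Gram matrix of normalized columns
    np.fill_diagonal(G, 0.0)             # ignore self-correlations
    return G.max()

# Illustrative 4x8 full-rank matrix: identity concatenated with a random part
rng = np.random.default_rng(0)
A = np.hstack([np.eye(4), rng.standard_normal((4, 4))])

mu = mutual_coherence(A)
# Any solution of Ax = b with fewer than (1 + 1/mu) / 2 nonzeros
# is the unique sparsest solution (the coherence-based uniqueness bound).
bound = 0.5 * (1.0 + 1.0 / mu)
```

Since the columns are normalized, μ always lies in (0, 1], so the bound guarantees uniqueness for at least 1-sparse solutions and improves as the dictionary becomes less correlated.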
Why simple shrinkage is still relevant for redundant representations
 IEEE Transactions on Information Theory
, 2006
Abstract

Cited by 122 (12 self)
Abstract—Shrinkage is a well-known and appealing denoising technique, introduced originally by Donoho and Johnstone in 1994. The use of shrinkage for denoising is known to be optimal for Gaussian white noise, provided that sparsity of the signal’s representation is enforced using a unitary transform. Still, shrinkage is also practiced with non-unitary, and even redundant, representations, typically leading to very satisfactory results. In this correspondence we shed some light on this behavior. The main argument in this work is that such simple shrinkage can be interpreted as the first iteration of an algorithm that solves the basis pursuit denoising (BPDN) problem. While the desired solution of BPDN is hard to obtain in general, we develop a simple iterative procedure for the BPDN minimization that amounts to stepwise shrinkage. We demonstrate how simple shrinkage emerges as the first iteration of this novel algorithm. Furthermore, we show how shrinkage can be iterated, turning into an effective algorithm that minimizes the BPDN objective via simple shrinkage steps, in order to further strengthen the denoising effect. Index Terms—Basis pursuit, denoising, frame, overcomplete, redundant, sparse representation, shrinkage, thresholding.
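The observation that shrinkage is the first iteration of a BPDN solver matches the structure of generic iterated soft-thresholding (ISTA-style) schemes. The sketch below is such a generic loop for the BPDN objective 0.5·||b − Dx||² + λ||x||₁, not the authors' exact procedure; the toy dictionary and parameter values are illustrative assumptions:

```python
import numpy as np

def soft(x, t):
    """Soft-thresholding (shrinkage) operator."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(D, b, lam, n_iter=100):
    """Iterated shrinkage for min_x 0.5*||b - D x||^2 + lam*||x||_1.
    Starting from x = 0, the first iteration is exactly one shrinkage
    of the analysis coefficients D.T @ b, as the abstract observes."""
    L = np.linalg.norm(D, 2) ** 2            # Lipschitz constant of the gradient
    x = np.zeros(D.shape[1])
    for _ in range(n_iter):
        x = soft(x + D.T @ (b - D @ x) / L, lam / L)
    return x

# Toy redundant dictionary (8x16) and a noisy sparse signal -- illustrative only
rng = np.random.default_rng(1)
D = rng.standard_normal((8, 16))
D /= np.linalg.norm(D, axis=0)
x_true = np.zeros(16)
x_true[[2, 9]] = [1.5, -2.0]
b = D @ x_true + 0.01 * rng.standard_normal(8)

x_hat = ista(D, b, lam=0.05)
```

Each pass is one gradient step on the quadratic term followed by one shrinkage, so the BPDN objective decreases monotonically from the all-zero start.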
Coordinate and subspace optimization methods for linear least squares with nonquadratic regularization
, 2007
Image denoising via learned dictionaries and sparse representation
 In CVPR
, 2006
Abstract

Cited by 70 (7 self)
We address the image denoising problem, where zero-mean, white, homogeneous Gaussian additive noise should be removed from a given image. The approach taken is based on sparse and redundant representations over a trained dictionary. The proposed algorithm denoises the image while simultaneously training a dictionary on its (corrupted) content using the K-SVD algorithm. As the dictionary training algorithm is limited to handling small image patches, we extend its deployment to arbitrary image sizes by defining a global image prior that forces sparsity over patches at every location in the image. We show how such a Bayesian treatment leads to a simple and effective denoising algorithm with state-of-the-art performance, equaling and sometimes surpassing recently published leading alternative denoising methods.
Multiscale Poisson data smoothing
 J. Roy. Stat. Soc. B
, 2006
Abstract

Cited by 31 (1 self)
This paper introduces a framework for nonlinear, multiscale decompositions of Poisson data with piecewise smooth intensity curves. The key concept is conditioning on the sum of the observations that are involved in the computation of a given coefficient. Within this framework, most classical wavelet thresholding schemes for data with additive, homoscedastic noise apply. Any family of wavelet transforms (orthogonal, biorthogonal, second generation) can be incorporated into this framework. The second contribution is a Bayesian shrinkage with an original prior for the coefficients of this decomposition. As such, the method combines the advantages of the Fisz wavelet transform and (Bayesian) multiscale likelihood models, with additional benefits, such as extendibility towards arbitrary wavelet families.
Image Denoising using Wavelet Thresholding
 INDIAN CONFERENCE ON COMPUTER VISION, GRAPHICS AND IMAGE PROCESSING, AHMEDABAD
, 2002
Abstract

Cited by 24 (0 self)
This paper proposes an adaptive threshold estimation method for image denoising in the wavelet domain based on generalized Gaussian distribution (GGD) modeling of subband coefficients. The proposed method, called NormalShrink, is computationally more efficient and adaptive because the parameters required for estimating the threshold depend on the subband data. The threshold is computed as T = βσ²/σ_y, where σ and σ_y are the standard deviations of the noise and of the subband data of the noisy image, respectively, and β is a scale parameter that depends on the subband size and the number of decompositions. Experimental results on several test images are compared with various denoising techniques such as Wiener filtering [2], BayesShrink [3] and SureShrink [4]. To benchmark against the best possible performance of a threshold estimate, the comparison also includes OracleShrink. Experimental results show that the proposed threshold removes noise significantly, remains within 4% of OracleShrink, and outperforms SureShrink, BayesShrink and Wiener filtering most of the time.
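A threshold of the form T = βσ²/σ_y is easy to sketch in code. In the sketch below β is passed in explicitly rather than derived from subband size and decomposition count as in the paper, and the median-based noise estimate is a standard wavelet-domain estimator assumed here, not taken from this abstract:

```python
import numpy as np

def normal_shrink_threshold(subband, sigma_noise, beta):
    """NormalShrink-style threshold T = beta * sigma^2 / sigma_y,
    where sigma_y is the standard deviation of the subband coefficients."""
    sigma_y = subband.std()
    return beta * sigma_noise ** 2 / max(sigma_y, 1e-12)

def estimate_noise_sigma(diag_subband):
    """Robust noise estimate from the finest diagonal subband:
    sigma = median(|HH1|) / 0.6745 (a standard wavelet-domain estimator)."""
    return np.median(np.abs(diag_subband)) / 0.6745

# Illustrative subband: pure Gaussian noise of standard deviation 0.1
rng = np.random.default_rng(2)
hh1 = 0.1 * rng.standard_normal((64, 64))
sigma = estimate_noise_sigma(hh1)
T = normal_shrink_threshold(hh1, sigma, beta=1.0)
```

On a pure-noise subband, σ_y ≈ σ, so the threshold reduces to roughly βσ: smooth subbands (small σ_y) get thresholded aggressively, feature-rich subbands (large σ_y) gently.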
Time series knowledge mining
, 2006
Abstract

Cited by 20 (2 self)
An important goal of knowledge discovery is the search for patterns in data that can help explain the underlying process that generated the data. The patterns are required to be new, useful, and understandable to humans. In this work we present a new method for the understandable description of local temporal relationships in multivariate data, called Time Series Knowledge Mining (TSKM). We define the Time Series Knowledge Representation (TSKR) as a new language for expressing temporal knowledge. The patterns have a hierarchical structure in which each level corresponds to a single temporal concept. On the lowest level, intervals are used to represent duration. Overlapping parts of intervals represent coincidence on the next level. Several such blocks of intervals are connected with a partial-order relation on the highest level. Each pattern element consists of a semiotic triple connecting syntactic and semantic information with pragmatics. The patterns are very compact, but offer details for each element on demand. In comparison with related approaches, the TSKR is shown to have advantages in robustness, expressivity, and comprehensibility. Efficient algorithms for the discovery of the patterns are proposed. The search for coincidence as well as partial order can be formulated as variants of the well-known frequent-itemset problem. One of the best-known algorithms for this problem is therefore adapted for our purposes. Human interaction is used during the mining to analyze and validate partial results as early as possible and guide further processing steps. The efficacy of the methods is demonstrated using several data sets. In an application to sports medicine the results were recognized as valid and useful by an expert in the field.
Undecimated Wavelet Transforms for Image DeNoising
, 2002
Abstract

Cited by 17 (0 self)
A few different approaches exist for computing the undecimated wavelet transform. In this work we construct three undecimated schemes and evaluate their performance for image noise reduction. We use standard wavelet-based denoising techniques and compare the performance of our algorithms with the original undecimated wavelet transform, as well as with the decimated wavelet transform. Our experiments show that our algorithms achieve a better noise-removal/blurring ratio.
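The defining property of an undecimated scheme is that no downsampling occurs, so every output band has the input's length and the transform is shift-invariant. A toy one-level undecimated Haar transform illustrates this; it is a generic example, not one of the paper's three schemes:

```python
import numpy as np

def undecimated_haar(x):
    """One level of an undecimated (shift-invariant) Haar transform:
    no downsampling, so both outputs have the length of the input
    and the transform is inverted exactly by x = approx + detail."""
    xs = np.roll(x, 1)                       # circular one-sample shift
    approx = (x + xs) / 2.0
    detail = (x - xs) / 2.0
    return approx, detail

def denoise(x, t):
    """Soft-threshold the detail coefficients, then invert."""
    a, d = undecimated_haar(x)
    d = np.sign(d) * np.maximum(np.abs(d) - t, 0.0)
    return a + d

# Illustrative piecewise-constant signal plus Gaussian noise
rng = np.random.default_rng(3)
clean = np.ones(128)
noisy = clean + 0.1 * rng.standard_normal(128)
restored = denoise(noisy, t=0.1)
```

Because nothing is discarded, thresholding in this domain avoids the shift-dependent artifacts of the decimated transform, which is the trade-off the paper's comparison measures as a noise-removal/blurring ratio.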
Optimised Orthogonal Matching Pursuit Approach
 IEEE Signal Processing Letters, Vol 9
, 2002
Abstract

Cited by 15 (9 self)
A recursive approach for shrinking the coefficients of an atomic decomposition is proposed. The corresponding algorithm evolves so as to provide at each iteration a) the orthogonal projection of a signal onto a reduced subspace and b) the index of the coefficient to be disregarded in order to construct a coarser approximation minimizing the norm of the residual error.
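For reference, the baseline the paper optimizes is standard forward orthogonal matching pursuit: grow a support greedily and re-project the signal orthogonally onto the selected atoms at each step. The sketch below is that standard forward variant, shown only to fix notation; the paper's approach instead works backwards, removing coefficients from a decomposition. The dictionary and sparsity level are illustrative:

```python
import numpy as np

def omp(D, b, k):
    """Plain forward orthogonal matching pursuit: pick the atom most
    correlated with the residual, then recompute the orthogonal
    projection of b onto all selected atoms."""
    residual = b.copy()
    support = []
    x = np.zeros(D.shape[1])
    for _ in range(k):
        j = int(np.argmax(np.abs(D.T @ residual)))   # best-matching atom
        support.append(j)
        coef, *_ = np.linalg.lstsq(D[:, support], b, rcond=None)
        residual = b - D[:, support] @ coef          # orthogonal residual
    x[support] = coef
    return x

# Toy 20x40 dictionary with unit-norm atoms and a 3-sparse signal
rng = np.random.default_rng(4)
D = rng.standard_normal((20, 40))
D /= np.linalg.norm(D, axis=0)
x_true = np.zeros(40)
x_true[[5, 17, 30]] = [1.0, -0.7, 0.4]
b = D @ x_true

x_hat = omp(D, b, k=3)
```

The orthogonal projection at each step is what distinguishes OMP from plain matching pursuit: the residual is always orthogonal to every atom already selected, so no atom is picked twice while the residual is nonzero.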
Adaptive wavelet restoration of noisy video sequences
 Proc. Int. Conf. Image Processing
, 2004
Abstract

Cited by 15 (2 self)
In this paper, we present a novel algorithm for the restoration of noisy video sequences. A video sequence is first transformed into an optimal 3-D wavelet domain using basis functions adapted to the contents of the sequence. Assuming that all the major spatiotemporal frequency phenomena present in the sequence produce high-amplitude transform coefficients, a modified form of the BayesShrink thresholding method is used to suppress the noise. In order to reduce the effects of the Gibbs phenomenon in the restored sequence, translation dependence is removed by averaging the restored instances of the shifted sequence. The algorithm yields promising results in terms of both objective and subjective quality of the restored sequence.
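Averaging restored instances of the shifted input is the cycle-spinning idea. The sketch below applies it in 1-D with a toy decimated Haar denoiser standing in for the paper's 3-D wavelet transform and modified BayesShrink step; signal, threshold, and shift count are all illustrative:

```python
import numpy as np

def soft(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def haar_denoise(x, t):
    """One-level decimated Haar denoise (illustrative stand-in for the
    paper's adapted 3-D transform and modified BayesShrink)."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    d = soft((x[0::2] - x[1::2]) / np.sqrt(2.0), t)
    y = np.empty_like(x)
    y[0::2] = (a + d) / np.sqrt(2.0)
    y[1::2] = (a - d) / np.sqrt(2.0)
    return y

def cycle_spin(x, t, shifts):
    """Average denoised copies of circularly shifted input to suppress
    the Gibbs artifacts a decimated, translation-dependent transform
    leaves behind."""
    acc = np.zeros_like(x)
    for s in range(shifts):
        acc += np.roll(haar_denoise(np.roll(x, s), t), -s)
    return acc / shifts

# Illustrative step-edge signal plus Gaussian noise
rng = np.random.default_rng(5)
clean = np.ones(256)
clean[128:] = 0.0
noisy = clean + 0.1 * rng.standard_normal(256)
restored = cycle_spin(noisy, t=0.15, shifts=8)
```

Each shift places edges differently relative to the decimation grid, so the artifacts decorrelate across shifts and largely cancel in the average, while the underlying signal is reconstructed identically every time.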