Analysis K-SVD: A Dictionary-Learning Algorithm for the Analysis Sparse Model
, 2012
"... The synthesisbased sparse representation model for signals has drawn considerable interest in the past decade. Such a model assumes that the signal of interest can be decomposed as a linear combination of a few atoms from a given dictionary. In this paper we concentrate on an alternative, analysis ..."
Abstract

Cited by 21 (6 self)
 Add to MetaCart
The synthesis-based sparse representation model for signals has drawn considerable interest in the past decade. Such a model assumes that the signal of interest can be decomposed as a linear combination of a few atoms from a given dictionary. In this paper we concentrate on an alternative, analysis-based model, where an analysis operator – hereafter referred to as the analysis dictionary – multiplies the signal, leading to a sparse outcome. Our goal is to learn the analysis dictionary from a set of examples. The approach taken is parallel and similar to the one adopted by the K-SVD algorithm that serves the corresponding problem in the synthesis model. We present the development of the algorithm steps, including tailored pursuit algorithms – the Backward Greedy and the Optimized Backward Greedy algorithms – and a penalty function that defines the objective for the dictionary-update stage. We demonstrate the effectiveness of the proposed dictionary learning in several experiments, treating synthetic data and real images, and showing a successful and meaningful recovery of the analysis dictionary.
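The Backward Greedy pursuit mentioned in this abstract can be sketched roughly as follows. This is a simplified illustration under our own naming, not the paper's exact algorithm (which also includes stopping criteria and an optimized variant): the co-support is grown greedily with the rows of the analysis dictionary least correlated with the current estimate, and the signal is re-projected after each selection.

```python
import numpy as np

def backward_greedy(y, Omega, ell):
    """Simplified Backward Greedy analysis pursuit (sketch).

    Greedily grows the co-support Lambda with the rows of the
    analysis dictionary Omega that are least correlated with the
    current estimate, then projects y onto the subspace orthogonal
    to those rows.
    """
    Lambda = []
    x = y.copy()
    for _ in range(ell):
        corr = np.abs(Omega @ x)
        corr[Lambda] = np.inf                  # exclude rows already chosen
        Lambda.append(int(np.argmin(corr)))
        O = Omega[Lambda, :]
        # project y onto the null space of Omega restricted to Lambda
        x = y - np.linalg.pinv(O) @ (O @ y)
    return x, Lambda
```

On a noiseless signal that is exactly orthogonal to `ell` rows of `Omega`, this sketch identifies those rows and returns the signal unchanged.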
Smoothing and Decomposition for Analysis Sparse Recovery
, 2014
"... We consider algorithms and recovery guarantees for the analysis sparse model in which the signal is sparse with respect to a highly coherent frame. We consider the use of a monotone version of the fast iterative shrinkagethresholding algorithm (MFISTA) to solve the analysis sparse recovery problem ..."
Abstract

Cited by 3 (2 self)
 Add to MetaCart
We consider algorithms and recovery guarantees for the analysis sparse model in which the signal is sparse with respect to a highly coherent frame. We consider the use of a monotone version of the fast iterative shrinkage-thresholding algorithm (MFISTA) to solve the analysis sparse recovery problem. Since the proximal operator in MFISTA does not have a closed-form solution for the analysis model, it cannot be applied directly. Instead, we examine two alternatives based on smoothing and decomposition transformations that relax the original sparse recovery problem, and then implement MFISTA on the relaxed formulation. We refer to these two methods as smoothing-based and decomposition-based MFISTA. We analyze the convergence of both algorithms and establish that smoothing-based MFISTA converges more rapidly when applied to general nonsmooth optimization problems. We then derive a performance bound on the reconstruction error using these techniques. The bound proves that our methods can recover a signal sparse in a redundant tight frame when the measurement matrix satisfies a properly adapted restricted isometry property. Numerical examples demonstrate the performance of our methods and show that smoothing-based MFISTA converges faster than the decomposition-based alternative in real applications, such as MRI image reconstruction.
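A minimal sketch of the smoothing idea: replace the nonsmooth term ‖Ωx‖₁ by its Moreau envelope (a Huber function), which makes the objective differentiable, then run a monotone variant of FISTA on it. The code below is our own illustrative reconstruction under assumed names, not the paper's exact algorithm:

```python
import numpy as np

def smoothed_mfista(b, A, Omega, lam, mu, n_iter=200):
    """Monotone FISTA on a Huber-smoothed l1-analysis objective:
    minimize 0.5*||Ax - b||^2 + lam * sum_i h_mu((Omega x)_i),
    where h_mu is the Huber function (Moreau envelope of |.|)."""
    def huber(t):
        a = np.abs(t)
        return np.where(a <= mu, t * t / (2 * mu), a - mu / 2)

    def f(x):
        r = A @ x - b
        return 0.5 * (r @ r) + lam * huber(Omega @ x).sum()

    def grad(x):
        return A.T @ (A @ x - b) + lam * Omega.T @ np.clip(Omega @ x / mu, -1, 1)

    # Lipschitz constant of the smoothed gradient
    L = np.linalg.norm(A, 2) ** 2 + lam * np.linalg.norm(Omega, 2) ** 2 / mu
    x = np.zeros(A.shape[1])
    w, t = x.copy(), 1.0
    for _ in range(n_iter):
        z = w - grad(w) / L
        x_new = z if f(z) <= f(x) else x       # monotone (MFISTA) step
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
        w = x_new + (t / t_new) * (z - x_new) + ((t - 1) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x
```

The monotone step guarantees the objective never increases across iterations, which is the property distinguishing MFISTA from plain FISTA.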
On MAP and MMSE Estimators for the Cosparse Analysis Model
"... The sparse synthesis model for signals has become very popular in the last decade, leading to improved performance in many signal processing applications. This model assumes that a signal may be described as a linear combination of few columns (atoms) of a given synthesis matrix (dictionary). The ..."
Abstract

Cited by 3 (0 self)
 Add to MetaCart
The sparse synthesis model for signals has become very popular in the last decade, leading to improved performance in many signal processing applications. This model assumes that a signal may be described as a linear combination of a few columns (atoms) of a given synthesis matrix (dictionary). The cosparse analysis model is a recently introduced counterpart, whereby signals are assumed to be orthogonal to many rows of a given analysis dictionary. These rows are called the cosupport. The analysis model has already led to a series of contributions that address the pursuit problem: identifying the cosupport of a corrupted signal in order to restore it. While all the existing work adopts a deterministic point of view towards the design of such pursuit algorithms, this paper introduces a Bayesian estimation point of view, starting with a random generative model for the cosparse analysis signals. This is followed by a derivation of Oracle, Minimum-Mean-Squared-Error (MMSE), and Maximum-A-Posteriori (MAP) based estimators. We present a comparison between the deterministic formulations and these estimators, drawing some connections between the two. We develop practical approximations to the MAP and MMSE estimators, and demonstrate the proposed reconstruction algorithms in several synthetic and real image experiments, showing their potential and applicability.
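The oracle estimator referred to in this abstract can be sketched for the denoising case: if the true cosupport were known, the estimate would be the projection of the noisy observation onto the subspace orthogonal to the corresponding dictionary rows. This is a hypothetical simplification under our own naming:

```python
import numpy as np

def oracle_estimator(y, Omega, Lambda):
    """Oracle denoiser (sketch): given the true cosupport Lambda,
    project the noisy observation y onto the subspace orthogonal
    to the corresponding rows of the analysis dictionary Omega."""
    O = Omega[list(Lambda), :]
    return y - np.linalg.pinv(O) @ (O @ y)
```

Conceptually, an MMSE estimate averages such oracle projections over candidate cosupports weighted by their posterior probability; the paper develops practical approximations to that average.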
OMP with Highly Coherent Dictionaries
"... Abstract—Recovering signals that has a sparse representation from a given set of linear measurements has been a major topic of research in recent years. Most of the work dealing with this subject focus on the reconstruction of the signal’s representation as the means to recover the signal itself. Th ..."
Abstract

Cited by 2 (0 self)
 Add to MetaCart
(Show Context)
Recovering signals that have a sparse representation from a given set of linear measurements has been a major topic of research in recent years. Most of the work dealing with this subject focuses on the reconstruction of the signal's representation as the means to recover the signal itself. This approach forces the dictionary to be of low coherence, with no linear dependencies between its columns. Recently, a series of contributions showed that such dependencies can be allowed by aiming at recovering the signal itself. However, most of these recent works consider the analysis framework, and only a few discuss the synthesis model. This paper studies the synthesis model and introduces a new mutual-coherence definition for signal recovery, showing that a modified version of OMP can recover sparsely represented signals over a dictionary with very high correlations between pairs of columns. We show how the derived results apply to plain OMP.
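Plain OMP, the baseline that the paper modifies, can be sketched as follows (a minimal textbook version, not the paper's modified variant): pick the atom most correlated with the residual, refit the selected coefficients by least squares, and repeat.

```python
import numpy as np

def omp(y, D, k):
    """Plain Orthogonal Matching Pursuit (sketch): greedily select
    the atom most correlated with the residual, then refit all
    selected coefficients by least squares."""
    support, r = [], y.copy()
    coef = np.zeros(0)
    for _ in range(k):
        j = int(np.argmax(np.abs(D.T @ r)))
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        r = y - D[:, support] @ coef
    x = np.zeros(D.shape[1])
    x[support] = coef
    return x
```

The paper's point is that success should be judged by the signal error (how close D x̂ is to the true signal) rather than by exact recovery of the representation; it is that change of criterion that tolerates highly correlated atom pairs.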
CAN WE ALLOW LINEAR DEPENDENCIES IN THE DICTIONARY IN THE SPARSE SYNTHESIS FRAMEWORK?
"... Signal recovery from a given set of linear measurements using a sparsity prior has been a major subject of research in recent years. In this model, the signal is assumed to have a sparse representation under a given dictionary. Most of the work dealing with this subject has focused on the reconstruc ..."
Abstract

Cited by 1 (0 self)
 Add to MetaCart
(Show Context)
Signal recovery from a given set of linear measurements using a sparsity prior has been a major subject of research in recent years. In this model, the signal is assumed to have a sparse representation under a given dictionary. Most of the work dealing with this subject has focused on the reconstruction of the signal's representation as the means for recovering the signal itself. This approach forces the dictionary to be of low coherence, with no linear dependencies between its columns. Recently, a series of contributions that focus on signal recovery using the analysis model found that linear dependencies in the analysis dictionary are in fact permitted and beneficial. In this paper we show theoretically that the same holds for signal recovery in the synthesis case, for the ℓ0-synthesis minimization problem. In addition, we demonstrate empirically the relevance of our conclusions for recovering the signal using an ℓ1-relaxation. Index Terms — Sparse representations, compressed sensing, analysis versus synthesis, inverse problems.
Greedy-Like Algorithms for the Cosparse Analysis Model
"... The cosparse analysis model has been introduced recently as an interesting alternative to the standard sparse synthesis approach. A prominent question brought up by this new construction is the analysis pursuit problem – the need to find a signal belonging to this model, given a set of corrupted mea ..."
Abstract
 Add to MetaCart
(Show Context)
The cosparse analysis model has been introduced recently as an interesting alternative to the standard sparse synthesis approach. A prominent question brought up by this new construction is the analysis pursuit problem – the need to find a signal belonging to this model, given a set of corrupted measurements of it. Several pursuit methods have already been proposed, based on ℓ1-relaxation and on a greedy approach. In this work we pursue this question further and propose a new family of pursuit algorithms for the cosparse analysis model, mimicking the greedy-like methods – compressive sampling matching pursuit (CoSaMP), subspace pursuit (SP), iterative hard thresholding (IHT) and hard thresholding pursuit (HTP). Assuming the availability of a near-optimal projection scheme that finds the nearest cosparse subspace to any vector, we provide performance guarantees for these algorithms. Our theoretical study relies on a restricted isometry property adapted to the context of the cosparse analysis model. We explore empirically the performance of these algorithms by adopting a plain thresholding projection, demonstrating their good performance.
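The analysis counterpart of IHT with the plain thresholding projection mentioned in the abstract can be sketched as follows. This is a hypothetical simplification (names and step-size choice are our own): each iteration takes a gradient step on the data-fidelity term, then projects onto the set of signals orthogonal to the `ell` least-correlated dictionary rows.

```python
import numpy as np

def cosparse_projection(v, Omega, ell):
    """Plain thresholding projection (sketch): take as cosupport the
    ell rows of Omega with smallest correlation to v, and project v
    onto the subspace orthogonal to them."""
    Lam = np.argsort(np.abs(Omega @ v))[:ell]
    O = Omega[Lam, :]
    return v - np.linalg.pinv(O) @ (O @ v)

def analysis_iht(y, A, Omega, ell, n_iter=100):
    """Analysis Iterative Hard Thresholding (sketch): a gradient step
    on 0.5*||y - Ax||^2 followed by the cosparse projection."""
    step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = cosparse_projection(x + step * A.T @ (y - A @ x), Omega, ell)
    return x
```

The thresholding projection is only approximate in general; the paper's guarantees assume access to a near-optimal projection, and its experiments show the plain version above already performs well.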
Robust Sparse Analysis Regularization
, 2012
"... Abstract—This paper investigates the theoretical guarantees of ℓ 1analysis regularization when solving linear inverse problems. Most of previous works in the literature have mainly focused on the sparse synthesis prior where the sparsity is measured as the ℓ 1 norm of the coefficients that synthesi ..."
Abstract
 Add to MetaCart
(Show Context)
This paper investigates the theoretical guarantees of ℓ1-analysis regularization when solving linear inverse problems. Most previous works in the literature have focused on the sparse synthesis prior, where sparsity is measured as the ℓ1 norm of the coefficients that synthesize the signal from a given dictionary. In contrast, the more general analysis regularization minimizes the ℓ1 norm of the correlations between the signal and the atoms in the dictionary, where these correlations define the analysis support. The corresponding variational problem encompasses several well-known regularizations, such as the discrete total variation and the Fused Lasso. Our main contributions consist in deriving sufficient conditions that guarantee exact or partial analysis-support recovery of the true signal in the presence of noise. More precisely, we give a sufficient condition to ensure that a signal is the unique solution of the ℓ1-analysis regularization in the noiseless case. The same condition also guarantees exact analysis-support recovery and ℓ2-robustness of the ℓ1-analysis minimizer vis-à-vis a small enough noise in the measurements. This condition turns out to be sharp for the robustness of the analysis support. To show partial support recovery and ℓ2-robustness to an arbitrary bounded noise, we introduce a stronger sufficient condition. When specialized to the ℓ1-synthesis regularization, our results recover some corresponding recovery and robustness guarantees previously known in the literature. From this perspective, our work is a generalization of these results. We finally illustrate these theoretical findings on several examples to study the robustness of the 1-D total variation and Fused Lasso regularizations. Index Terms — sparsity, analysis regularization, synthesis regularization, inverse problems, ℓ1 minimization, union of subspaces, noise robustness, total variation, wavelets, Fused Lasso.
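The discrete total variation mentioned in the abstract is a concrete instance of ℓ1-analysis regularization, with the analysis operator taken as the finite-difference matrix. Below is a minimal ADMM sketch of 1-D TV denoising (our own illustrative code; the paper itself is theoretical and proposes no algorithm):

```python
import numpy as np

def tv_denoise_admm(y, lam, rho=1.0, n_iter=200):
    """1-D total-variation denoising (sketch):
    minimize 0.5*||x - y||^2 + lam * ||D x||_1,
    where D is the finite-difference operator, solved by ADMM
    with splitting z = D x."""
    n = len(y)
    D = np.diff(np.eye(n), axis=0)             # (n-1) x n difference operator
    x, z, u = y.copy(), D @ y, np.zeros(n - 1)
    M = np.linalg.inv(np.eye(n) + rho * D.T @ D)
    for _ in range(n_iter):
        x = M @ (y + rho * D.T @ (z - u))      # quadratic x-update
        v = D @ x + u
        z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0)  # soft threshold
        u = u + D @ x - z                      # dual update
    return x
```

On a noisy piecewise-constant signal this recovers the flat segments while keeping the jump, which is exactly the behavior the analysis-support recovery results of the paper characterize for the TV prior.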