Results 1–10 of 20
Near-oracle performance of greedy block-sparse estimation techniques from noisy measurements
Signal Process., 2010 [Online]. Available: http://arxiv.org/pdf/1009.0906
"... Abstract—This paper examines the ability of greedy algorithms to estimate a block sparse parameter vector from noisy measurements. In particular, block sparse versions of the orthogonal matching pursuit and thresholding algorithms are analyzed under both adversarial and Gaussian noise models. In the ..."
Abstract

Cited by 18 (2 self)
 Add to MetaCart
(Show Context)
Abstract—This paper examines the ability of greedy algorithms to estimate a block-sparse parameter vector from noisy measurements. In particular, block-sparse versions of the orthogonal matching pursuit and thresholding algorithms are analyzed under both adversarial and Gaussian noise models. In the adversarial setting, it is shown that estimation accuracy comes within a constant factor of the noise power. Under Gaussian noise, the Cramér–Rao bound is derived, and it is shown that the greedy techniques come close to this bound at high signal-to-noise ratio. The guarantees are numerically compared with the actual performance of block and non-block algorithms, identifying situations in which block-sparse techniques improve upon the scalar sparsity approach. Specifically, we show that block-sparse methods are particularly successful when the atoms within each block are nearly orthogonal.
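To make the block-sparse greedy idea concrete, here is a minimal sketch of block orthogonal matching pursuit in NumPy. It is an illustration of the general technique the abstract analyzes, not the paper's exact algorithm or notation; the function name `block_omp` and the assumption of equal-sized, contiguous blocks are ours.

```python
import numpy as np

def block_omp(A, y, block_size, n_blocks_to_pick):
    """Block-OMP sketch: greedily select the block whose atoms correlate most
    strongly with the current residual, then re-fit all selected blocks by
    least squares. Assumes contiguous blocks of equal size."""
    m, n = A.shape
    blocks = [np.arange(b * block_size, (b + 1) * block_size)
              for b in range(n // block_size)]
    support = []            # indices of blocks selected so far
    residual = y.copy()
    x = np.zeros(n)
    for _ in range(n_blocks_to_pick):
        # score each unselected block by the l2-norm of its correlations
        scores = [np.linalg.norm(A[:, idx].T @ residual)
                  if b not in support else -1.0
                  for b, idx in enumerate(blocks)]
        support.append(int(np.argmax(scores)))
        cols = np.concatenate([blocks[b] for b in support])
        coef, *_ = np.linalg.lstsq(A[:, cols], y, rcond=None)
        x = np.zeros(n)
        x[cols] = coef
        residual = y - A @ x
    return x
```

With noiseless measurements and well-conditioned atoms, one correct block selection followed by the least-squares re-fit recovers the signal exactly; the abstract's point is that this selection step is most reliable when atoms within a block are nearly orthogonal.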
Generalized orthogonal matching pursuit
IEEE Trans. on Signal Processing, 2012
"... ar ..."
(Show Context)
On the Effective Measure of Dimension in the Analysis Cosparse Model
"... Abstract—Many applications have benefited remarkably from lowdimensional models in the recent decade. The fact that many signals, though high dimensional, are intrinsically low dimensional has given the possibility to recover them stably from a relatively small number of their measurements. For exa ..."
Abstract

Cited by 3 (2 self)
 Add to MetaCart
(Show Context)
Abstract—Many applications have benefited remarkably from low-dimensional models in the recent decade. The fact that many signals, though high dimensional, are intrinsically low dimensional has made it possible to recover them stably from a relatively small number of their measurements. For example, in compressed sensing with the standard (synthesis) sparsity prior and in matrix completion, the number of measurements needed is proportional (up to a logarithmic factor) to the signal's manifold dimension. Recently, a new natural low-dimensional signal model has been proposed: the cosparse analysis prior. In the noiseless case, it is possible to recover signals from this model, using a combinatorial search, from a number of measurements proportional to the signal's manifold dimension.
Fast Level Set Estimation from Projection Measurements
"... Estimation of the level set of a function (i.e., regions where the function exceeds some value) is an important problem with applications in digital elevation maps, medical imaging, and astronomy. In many applications, however, the function of interest is acquired through indirect measurements, such ..."
Abstract

Cited by 2 (0 self)
 Add to MetaCart
(Show Context)
Estimation of the level set of a function (i.e., regions where the function exceeds some value) is an important problem with applications in digital elevation maps, medical imaging, and astronomy. In many applications, however, the function of interest is acquired through indirect measurements, such as tomographic projections, coded-aperture measurements, or pseudorandom projections associated with compressed sensing. This paper describes a new methodology and associated theoretical analysis for rapid and accurate estimation of the level set from such projection measurements. The proposed method estimates the level set from projection measurements without an intermediate function reconstruction step, thereby leading to significantly faster computation. In addition, the coherence of the projection operator and McDiarmid's inequality are used to characterize the estimator's performance. Index Terms: Compressed sensing, coherence, level sets, performance bounds, segmentation, thresholding
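The core idea of skipping the reconstruction step can be illustrated with a deliberately crude sketch: threshold a simple backprojection of the measurements instead of first solving a full inverse problem. This is only a toy stand-in for the paper's estimator (whose guarantees rest on the projection operator's coherence and McDiarmid's inequality); the function name and the backprojection surrogate are our assumptions.

```python
import numpy as np

def level_set_from_projections(A, y, threshold):
    """Toy level-set estimator: approximate {i : f_i > threshold} by
    thresholding the backprojection A^T y, rather than reconstructing the
    full function f from y = A f first. Returns a boolean mask."""
    f_proxy = A.T @ y          # crude surrogate for the underlying function
    return f_proxy > threshold
```

The computational point survives even in this toy version: one matrix–vector product and a comparison, versus an iterative reconstruction followed by thresholding.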
Oracle-Order Recovery Performance of Greedy Pursuits With Replacement Against General Perturbations
IEEE Transactions on Signal Processing, 2013
"... Applying the theory of compressive sensing in practice always takes different kinds of perturbations into consideration. In this paper, the recovery performance of greedy pursuits with replacement for sparse recovery is analyzed when both the measurement vector and the sensing matrix are contaminate ..."
Abstract

Cited by 1 (1 self)
 Add to MetaCart
(Show Context)
Applying compressive sensing in practice requires taking various kinds of perturbations into account. In this paper, the recovery performance of greedy pursuits with replacement for sparse recovery is analyzed when both the measurement vector and the sensing matrix are contaminated with additive perturbations. Specifically, greedy pursuits with replacement include three algorithms, compressive sampling matching pursuit (CoSaMP), subspace pursuit (SP), and iterative hard thresholding (IHT), in which the support estimate is re-evaluated and updated in each iteration. Based on the restricted isometry property, a unified form of the error bounds of these recovery algorithms is derived under general perturbations for compressible signals. The results reveal that the recovery performance is stable against both perturbations. In addition, these bounds are compared with that of oracle recovery, i.e., the least squares solution computed with the locations of the largest-magnitude entries known a priori. The comparison shows that the error bounds of these algorithms differ only in their coefficients from the lower bound of oracle recovery for certain signals and perturbations, which reveals that oracle-order recovery performance of greedy pursuits with replacement is guaranteed. Numerical simulations are performed to verify the conclusions.
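Of the three algorithms named above, IHT is the simplest to sketch, and it shows what "with replacement" means: the support is re-estimated from scratch every iteration, so earlier selections can be discarded. This is a minimal textbook-style sketch, not the paper's perturbed-measurement setup; the conservative step-size choice is our assumption.

```python
import numpy as np

def iht(A, y, k, n_iters=200, step=None):
    """Iterative hard thresholding (IHT): a gradient step on ||y - Ax||^2 / 2
    followed by keeping only the k largest-magnitude entries. The support is
    recomputed each iteration ('with replacement')."""
    if step is None:
        # conservative step size: 1 / ||A||_2^2 guarantees a descent-type step
        step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        g = x + step * (A.T @ (y - A @ x))   # gradient step
        keep = np.argsort(np.abs(g))[-k:]    # indices of the k largest entries
        x = np.zeros_like(x)
        x[keep] = g[keep]                    # hard thresholding
    return x
```

CoSaMP and SP differ mainly in how the candidate support is formed and pruned, but all three share this estimate-then-threshold structure, which is what lets the paper treat their error bounds in a unified form.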
Linear Convergence of Stochastic Iterative Greedy Algorithms with Sparse Constraints, 2014
Sampling Matching Pursuit (CoSaMP), Subspace Pursuit
"... This paper provides theoretical guarantees for denoising performance ..."
Greedy-Like Algorithms for the Cosparse Analysis Model
"... The cosparse analysis model has been introduced recently as an interesting alternative to the standard sparse synthesis approach. A prominent question brought up by this new construction is the analysis pursuit problem – the need to find a signal belonging to this model, given a set of corrupted mea ..."
Abstract
 Add to MetaCart
(Show Context)
The cosparse analysis model has been introduced recently as an interesting alternative to the standard sparse synthesis approach. A prominent question brought up by this new construction is the analysis pursuit problem: the need to find a signal belonging to this model, given a set of corrupted measurements of it. Several pursuit methods have already been proposed based on ℓ1 relaxation and a greedy approach. In this work we pursue this question further, and propose a new family of pursuit algorithms for the cosparse analysis model, mimicking the greedy-like methods: compressive sampling matching pursuit (CoSaMP), subspace pursuit (SP), iterative hard thresholding (IHT), and hard thresholding pursuit (HTP). Assuming the availability of a near-optimal projection scheme that finds the nearest cosparse subspace to any vector, we provide performance guarantees for these algorithms. Our theoretical study relies on a restricted isometry property adapted to the context of the cosparse analysis model. We explore empirically the performance of these algorithms by adopting a plain thresholding projection, demonstrating their good performance.
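The "plain thresholding projection" mentioned at the end of the abstract can be sketched in a few lines: estimate the cosupport as the rows of the analysis operator where |Ωz| is smallest, then project onto the null space of those rows. This is an illustrative sketch of that projection step only (the surrounding CoSaMP/SP/IHT/HTP loops are omitted), and the function name is our invention.

```python
import numpy as np

def cosparse_threshold_project(z, Omega, ell):
    """Plain thresholding projection onto the cosparse model: pick the ell
    rows of Omega with the smallest |Omega z| as the estimated cosupport,
    then orthogonally project z onto the null space of those rows."""
    corr = np.abs(Omega @ z)
    cosupport = np.argsort(corr)[:ell]       # rows we force to zero
    Oc = Omega[cosupport]
    # orthogonal projection onto null(Oc): z - Oc^+ (Oc z)
    return z - np.linalg.pinv(Oc) @ (Oc @ z)
```

When Ω is the identity, this reduces to the familiar synthesis-side hard thresholding (zeroing the smallest-magnitude entries), which is a convenient sanity check on the construction.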
Identification of mouse colony-forming endothelial progenitor cells for postnatal neovascularization: a novel insight highlighted by new mouse colony-forming assay