
## Compressed Sensing and Redundant Dictionaries

Citations: 136 (13 self)

### Citations

3542 | Compressed sensing
- Donoho
- 2006
Citation Context: ...s Pursuit, thresholding, Orthogonal Matching Pursuit. 1 Introduction. Recently there has been a growing interest in recovering sparse signals from their projection onto a small number of random vectors [4, 5, 8, 13, 19, 20]. The word most often used in this context is compressed sensing. It originates from the idea that it is not necessary to invest a lot of power into observing the entries of a sparse signal in all coo...

2681 | Atomic decomposition by basis pursuit
- Chen, Donoho, et al.
- 1998
Citation Context: ...∑ᵢ |xᵢ| denotes the ℓ1-norm. This can be done via linear programming in the real case and via cone programming in the complex case. Clearly, one hopes that the solutions of (P0) and (P1) coincide; see [6, 9] for details. Both approaches pose certain requirements on the matrix Ψ in order to ensure recovery success. Recently, Candès, Romberg and Tao [4, 5] observed that successful recovery by BP is guara...
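The excerpt describes solving (P1) by linear programming in the real case. A minimal sketch of that reduction using SciPy, with problem sizes and the matrix Ψ chosen purely for illustration (none of these values come from the paper):

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)

# Illustrative sizes: n measurements, K atoms, S-sparse signal.
n, K, S = 30, 60, 3
Psi = rng.standard_normal((n, K)) / np.sqrt(n)
x_true = np.zeros(K)
x_true[rng.choice(K, S, replace=False)] = rng.standard_normal(S)
y = Psi @ x_true

# (P1): min ||x||_1  s.t.  Psi x = y, rewritten as an LP with
# x = u - v, u >= 0, v >= 0, so that ||x||_1 = sum(u) + sum(v).
c = np.ones(2 * K)                    # objective: sum(u) + sum(v)
A_eq = np.hstack([Psi, -Psi])         # Psi u - Psi v = y
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None))
x_hat = res.x[:K] - res.x[K:]

print(np.max(np.abs(x_hat - x_true)))  # recovery error
```

With a Gaussian matrix and S well below n, basis pursuit recovers the sparse vector exactly up to solver tolerance; the split into positive and negative parts is the standard trick for expressing an ℓ1 objective as a linear program.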

1642 | Matching pursuit with time-frequency dictionaries
- Mallat, Zhang
- 1993
Citation Context: ...severe drawback there have been basically two approaches proposed in the signal recovery community. The first is using greedy algorithms like Thresholding [14] or (Orthogonal) Matching Pursuit (OMP) [16, 21]. Thresholding simply calculates the inner products of the signal with all atoms, finds the ones with largest absolute values and then calculates the orthogonal projection onto the span of the corresp...
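The thresholding scheme the excerpt sketches (correlate with all atoms, keep the S largest in magnitude, project onto their span) fits in a few lines of NumPy; the dimensions and coefficient values below are our own illustrative choices, not the paper's:

```python
import numpy as np

def thresholding(Psi, y, S):
    """Simple thresholding, as described in the excerpt: pick the S atoms
    most correlated with the signal, then orthogonally project onto
    their span via least squares."""
    corr = np.abs(Psi.T @ y)                # inner products with all atoms
    support = np.argsort(corr)[-S:]         # S largest absolute values
    coeffs, *_ = np.linalg.lstsq(Psi[:, support], y, rcond=None)
    x_hat = np.zeros(Psi.shape[1])
    x_hat[support] = coeffs                 # projection onto chosen span
    return x_hat

rng = np.random.default_rng(1)
n, K, S = 128, 256, 2
Psi = rng.standard_normal((n, K)) / np.sqrt(n)
x = np.zeros(K)
x[[5, 17]] = [10.0, -8.0]                   # well-separated coefficients
x_hat = thresholding(Psi, Psi @ x, S)
print(np.allclose(x_hat, x))
```

Thresholding is the cheapest of the greedy methods, but as the excerpt implies it is also the weakest: it only succeeds when the true coefficients dominate the cross-correlations between atoms, which is why the example uses large, well-separated values.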

1476 | Practical signal recovery from random projections
- Candès, Romberg
- 2005

1360 | Stable signal recovery from incomplete and inaccurate measurements
- Candès, Romberg, et al.
- 2005

919 | K-SVD: An algorithm for designing overcomplete dictionaries for sparse representation
- Aharon, Elad, et al.
- 2006

901 | Greed is good: Algorithmic results for sparse approximation
- Tropp

769 | Signal recovery from random measurements via orthogonal matching pursuit
- Tropp, Gilbert
- 2007

620 | A simple proof of the restricted isometry property for random matrices
- Baraniuk, Davenport, et al.

609 | Weak Convergence and Empirical Processes
- van der Vaart, Wellner
- 1996

550 | For most large underdetermined systems of linear equations the minimal ℓ1-norm solution is also the sparsest solution
- Donoho
- 2006

544 | Sparse approximate solutions to linear systems
- Natarajan
- 1995
Citation Context: ...entries of x and ‖·‖2 denotes the standard Euclidean norm. Although there are simple recovery conditions available, the above approach is not reasonable in practice because finding its solution is NP-hard [7, 18]. In order to avoid this severe drawback there have been basically two approaches proposed in the signal recovery community. The first is using greedy algorithms like Thresholding [14] or (Orthogonal)...

453 | Stable recovery of sparse overcomplete representations in the presence of noise
- Donoho, Elad, et al.

328 | Probability in Banach Spaces: Isoperimetry and Processes, volume 23
- Ledoux, Talagrand
- 1991

235 | Database-friendly random projections
- Achlioptas
- 2001
Citation Context: ...trix of the type Ψ = AΦ, where A is an n × d measurement matrix and Φ is a d × K dictionary, we will follow the approach taken in [2], which was inspired by proofs for the Johnson-Lindenstrauss lemma [1]. We will not discuss this connection further but use as starting point concentration of measure for random variables. This describes the phenomenon that in high dimensions the probability mass of cer...
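The concentration-of-measure phenomenon the excerpt invokes can be observed numerically. Assuming i.i.d. Gaussian entries scaled by 1/√n (one common measurement ensemble, not necessarily the exact one used in [2]), the squared norm of a projected unit vector concentrates tightly around 1:

```python
import numpy as np

rng = np.random.default_rng(2)

# For a random n x d matrix A with i.i.d. N(0, 1/n) entries,
# ||A v||^2 concentrates around ||v||^2 = 1 for any fixed unit vector v.
n, d, trials = 200, 1000, 200
v = rng.standard_normal(d)
v /= np.linalg.norm(v)

norms = np.array([
    np.linalg.norm((rng.standard_normal((n, d)) / np.sqrt(n)) @ v) ** 2
    for _ in range(trials)
])
print(norms.mean(), norms.std())  # mean near 1, small spread
```

The variance of ‖Av‖² here is 2/n, so the spread shrinks as the number of measurements grows; this is exactly the concentration inequality that the union-bound arguments later in the paper build on.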

189 | Signal recovery from partial information via orthogonal matching pursuit
- Tropp, Gilbert
- 2005

186 | Adaptive Greedy Approximations
- Davis, Mallat, et al.
- 1997

181 | Probability inequalities for the sum of independent random variables
- Bennett
- 1962

110 | Counting faces of randomly-projected polytopes when the projection radically lowers dimension
- Donoho, Tanner
- 2006
Citation Context: ...not optimal. In the case of a Gaussian ensemble A and an orthonormal basis Φ, recovery conditions for BP with quite small constants were obtained in [20], and precise asymptotic results can be found in [10]. One might raise the objection that the condition S − 1 ≤ 1/(16µ) in Corollary 2.4 is too weak for practical applications. A lower bound on the coherence in terms of the dictionary size is µ > √((K − d)/(d...
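The coherence lower bound quoted here is cut off in the excerpt, but it matches the form of the standard Welch bound µ ≥ √((K − d)/(d(K − 1))). Assuming that is the intended bound, a quick numerical check shows how restrictive a condition of the form S − 1 ≤ 1/(16µ) becomes (the dimensions below are arbitrary examples):

```python
import math

def welch_bound(d, K):
    """Welch lower bound on the coherence of a d x K dictionary (K >= d);
    the truncated inequality in the excerpt appears to be of this form."""
    return math.sqrt((K - d) / (d * (K - 1)))

# Largest sparsity S permitted by a condition of the form S - 1 <= 1/(16 mu)
# when mu sits at the Welch bound:
d, K = 256, 1024
mu = welch_bound(d, K)
S_max = math.floor(1 / (16 * mu)) + 1
print(mu, S_max)
```

Even for a 4x overcomplete dictionary in dimension 256, the best possible coherence only allows a sparsity of 2 under this condition, which illustrates the objection the excerpt raises.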

108 | Sparse reconstruction by convex relaxation: Fourier and gaussian measurements
- Rudelson, Vershynin
- 2006

96 | Uniform uncertainty principle for Bernoulli and subgaussian ensembles. Constructive Approximation 28
- Mendelson, Pajor, et al.
- 2008
Citation Context: ...ose a finite ε1-covering of the unit sphere in R^S, i.e., a set of points Q with ‖q‖ = 1 for all q ∈ Q, such that min_{q∈Q} ‖x − q‖ ≤ ε1 for all ‖x‖ = 1, for some ε1 ∈ (0,1). According to Lemma 2.2 in [17] there exists such a Q with |Q| ≤ (1 + 2/ε1)^S. Applying the measure concentration in (2.1) with ε2 < 1/3 to all the points ΦΛq and taking the union bound we get (1 − ε2)‖ΦΛq‖² ≤ ‖AΦΛq‖² ≤ (1 + ε2)...

71 | Random sampling of sparse trigonometric polynomials II: orthogonal matching pursuit versus basis pursuit
- Kunis, Rauhut
- 2008

29 | The Johnson-Lindenstrauss lemma meets compressed sensing
- Baraniuk, Davenport, et al.
Citation Context: ...r some other distribution showing certain concentration properties, see below) will have small restricted isometry constants δS with 'overwhelming probability' as long as n = O(S log(d/S)) (1.3); see [2, 4, 5, 20] for details. The results for OMP in compressed sensing are weaker than for BP. While it can again be shown that with high probability a signal can be reconstructed from the random measurements Ψx if ...
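The scaling n = O(S log(d/S)) in (1.3) can be illustrated empirically. The sketch below uses an arbitrary constant in place of the O(·), and it only samples random S-sparse vectors rather than verifying the restricted isometry property uniformly over all of them, so it illustrates the claim without proving it:

```python
import numpy as np

rng = np.random.default_rng(3)

# Draw a Gaussian measurement matrix with n ~ S log(d/S) rows and check
# how much ||A x||^2 / ||x||^2 deviates from 1 over random S-sparse x.
d, S = 1000, 10
n = int(6 * S * np.log(d / S))        # hypothetical constant 6
A = rng.standard_normal((n, d)) / np.sqrt(n)

devs = []
for _ in range(200):
    x = np.zeros(d)
    x[rng.choice(d, S, replace=False)] = rng.standard_normal(S)
    ratio = np.linalg.norm(A @ x) ** 2 / np.linalg.norm(x) ** 2
    devs.append(abs(ratio - 1))
print(n, max(devs))                   # deviation stays well below 1
```

Even with only a few hundred rows, the sampled deviations stay small, consistent with a small restricted isometry constant δS at this sparsity level.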

17 | Average case analysis of multichannel thresholding
- Gribonval, Mailhé, et al.
- 2007
Citation Context: ...n is NP-hard [7, 18]. In order to avoid this severe drawback there have been basically two approaches proposed in the signal recovery community. The first is using greedy algorithms like Thresholding [14] or (Orthogonal) Matching Pursuit (OMP) [16, 21]. Thresholding simply calculates the inner products of the signal with all atoms, finds the ones with largest absolute values and then calculates the or...

4 | Fast ℓ1 minimization by iterative thresholding for multidimensional NMR spectroscopy
- Drori
Citation Context: ...st one dramatically increases the class of signals that can be modelled in this way. A more practical example would be a dictionary made up of damped sinusoids, which is used for NMR spectroscopy; see [12]. Before we can go into further explanations about the scope of this paper it is necessary to provide some background information. The basic problem in compressed sensing is to determine the minimal n...
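A toy dictionary of damped sinusoids of the kind this excerpt mentions can be built directly; the signal length, frequency grid, and damping rates below are illustrative choices of ours and are not taken from [12]:

```python
import numpy as np

# Build a d x K dictionary whose atoms are normalized damped sinusoids
# exp(-a t) cos(2 pi f t) over a grid of frequencies f and damping rates a.
N = 256
t = np.arange(N)
freqs = np.linspace(0.01, 0.45, 40)    # normalized frequencies (cycles/sample)
damps = [0.005, 0.02, 0.05]            # damping rates

atoms = []
for f in freqs:
    for a in damps:
        atom = np.exp(-a * t) * np.cos(2 * np.pi * f * t)
        atoms.append(atom / np.linalg.norm(atom))
Phi = np.column_stack(atoms)           # d x K dictionary
print(Phi.shape)                       # (256, 120)
```

Because such atoms overlap heavily in frequency, the resulting dictionary is coherent and redundant, which is precisely the setting (Ψ = AΦ with a redundant Φ) that the paper studies.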