Results 1–10 of 45
On the conditions used to prove oracle results for the Lasso
 Electron. J. Stat.
Cited by 103 (5 self)
Abstract: Oracle inequalities and variable selection properties for the Lasso in linear models have been established under a variety of different assumptions on the design matrix. We show in this paper how the different conditions and concepts relate to each other. The restricted eigenvalue condition [2] or the slightly weaker compatibility condition [18] is sufficient for oracle results. We argue that both conditions allow for a fairly general class of design matrices. Hence, optimality of the Lasso for prediction and estimation holds in more general situations than it appears from coherence [5, 4] or restricted isometry [10] assumptions.
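To make the estimator these conditions concern concrete, here is a minimal coordinate-descent Lasso sketch in numpy. The toy problem, variable names, and regularization level are illustrative choices of ours, not taken from the paper:

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator, the proximal map of the l1 penalty."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_sweeps=200):
    """Coordinate descent for 0.5 * ||y - X b||^2 + lam * ||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)       # x_j^T x_j for each column
    r = y - X @ b                       # current residual
    for _ in range(n_sweeps):
        for j in range(p):
            r += X[:, j] * b[j]         # remove column j's contribution
            b[j] = soft_threshold(X[:, j] @ r, lam) / col_sq[j]
            r -= X[:, j] * b[j]         # add it back with the new coefficient
    return b

# Toy design: sparse ground truth, noiseless observations.
rng = np.random.default_rng(0)
n, p = 100, 20
X = rng.standard_normal((n, p))
b_true = np.zeros(p)
b_true[[2, 7, 11]] = [2.0, -1.5, 3.0]
y = X @ b_true
b_hat = lasso_cd(X, y, lam=0.1)
```

With a small penalty and a well-conditioned design, the estimate is close to the truth up to the usual Lasso shrinkage bias.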
Shifting Inequality and Recovery of Sparse Signals
 IEEE Transactions on Signal Processing
Cited by 64 (12 self)
Abstract—In this paper, we present a concise and coherent analysis of the constrained ℓ1 minimization method for stable recovery of high-dimensional sparse signals in both the noiseless and noisy cases. The analysis is surprisingly simple and elementary, yet leads to strong results. In particular, it is shown that the sparse recovery problem can be solved via ℓ1 minimization under weaker conditions than what is known in the literature. A key technical tool is an elementary inequality, called the Shifting Inequality, which, for a given nonnegative decreasing sequence, bounds the ℓ2 norm of a subsequence in terms of the ℓ1 norm of another subsequence by shifting the elements to the upper end. Index Terms—ℓ1 minimization, restricted isometry property, shifting inequality, sparse recovery.
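The constrained ℓ1 program analyzed here (noiseless case: minimize ||x||₁ subject to Ax = y) can be solved as a linear program via the standard split x = u − v with u, v ≥ 0. The sketch below uses scipy's LP solver; the dimensions and the test signal are made up for illustration and have nothing to do with the paper's proof machinery:

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, y):
    """Noiseless constrained l1 minimization: min ||x||_1 s.t. A x = y.
    LP reformulation: x = u - v with u, v >= 0, objective sum(u) + sum(v)."""
    n, p = A.shape
    c = np.ones(2 * p)
    A_eq = np.hstack([A, -A])           # A (u - v) = y
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None), method="highs")
    u, v = res.x[:p], res.x[p:]
    return u - v

rng = np.random.default_rng(1)
n, p, k = 60, 100, 4
A = rng.standard_normal((n, p)) / np.sqrt(n)
x_true = np.zeros(p)
x_true[rng.choice(p, size=k, replace=False)] = [1.5, -2.0, 1.0, 2.5]
y = A @ x_true
x_hat = basis_pursuit(A, y)
```

For a Gaussian design with n well above the sparsity level, ℓ1 minimization recovers the sparse vector exactly, which is the kind of guarantee the paper sharpens.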
Orthogonal matching pursuit for sparse signal recovery with noise
 IEEE Trans. Inf. Theory
, 2011
Cited by 56 (1 self)
We consider the orthogonal matching pursuit (OMP) algorithm for the recovery of a high-dimensional sparse signal based on a small number of noisy linear measurements. OMP is an iterative greedy algorithm that selects at each step the column most correlated with the current residual. In this paper, we present a fully data-driven OMP algorithm with explicit stopping rules. It is shown that under conditions on the mutual incoherence and the minimum magnitude of the nonzero components of the signal, the support of the signal can be recovered exactly by OMP with high probability. We also consider the problem of identifying significant components in the case where some of the nonzero components are possibly small. It is shown that in this case OMP will still select all the significant components before possibly selecting incorrect ones. Moreover, with modified stopping rules, OMP can ensure that no zero components are selected.
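The greedy loop described above is short enough to sketch directly. This version stops after a fixed number of steps rather than using the paper's data-driven stopping rules; the problem sizes are arbitrary:

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: at each step pick the column most
    correlated with the residual, then re-fit by least squares on the
    chosen support and update the residual."""
    n, p = A.shape
    support = []
    r = y.copy()
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ r)))
        support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        r = y - A[:, support] @ coef
    x = np.zeros(p)
    x[support] = coef
    return x

rng = np.random.default_rng(2)
n, p, k = 200, 50, 3
A = rng.standard_normal((n, p))
A /= np.linalg.norm(A, axis=0)          # unit-norm columns
x_true = np.zeros(p)
x_true[[4, 19, 33]] = [2.0, -1.0, 1.5]
y = A @ x_true                          # noiseless for the sketch
x_hat = omp(A, y, k)
```

In the noiseless case, once the correct support is identified the least-squares re-fit makes the recovery exact.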
Coherence-Based Performance Guarantees for Estimating a Sparse Vector Under Random Noise
Cited by 43 (15 self)
We consider the problem of estimating a deterministic sparse vector x0 from underdetermined measurements Ax0 + w, where w represents white Gaussian noise and A is a given deterministic dictionary. We analyze the performance of three sparse estimation algorithms: basis pursuit denoising (BPDN), orthogonal matching pursuit (OMP), and thresholding. These algorithms are shown to achieve near-oracle performance with high probability, assuming that x0 is sufficiently sparse. Our results are non-asymptotic and are based only on the coherence of A, so that they are applicable to arbitrary dictionaries. Differences in the precise conditions required for the performance guarantees of each algorithm are manifested in the observed performance at high and low signal-to-noise ratios. This provides insight into the advantages and drawbacks of ℓ1 relaxation techniques such as BPDN as opposed to greedy approaches such as OMP and thresholding.
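Two of the ingredients above are easy to write down: the coherence of a dictionary, and the thresholding estimator (pick the k columns most correlated with the observation, then least squares on that support). A numpy sketch, with a made-up noiseless toy problem rather than the paper's noisy setting:

```python
import numpy as np

def coherence(A):
    """Mutual coherence: largest |inner product| between distinct
    unit-norm columns of A."""
    G = A.T @ A
    np.fill_diagonal(G, 0.0)
    return np.max(np.abs(G))

def thresholding_estimate(A, y, k):
    """Thresholding estimator: keep the k columns most correlated with y,
    then solve least squares restricted to that support."""
    support = np.argsort(np.abs(A.T @ y))[-k:]
    coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

rng = np.random.default_rng(3)
n, p, k = 300, 40, 2
A = rng.standard_normal((n, p))
A /= np.linalg.norm(A, axis=0)          # unit-norm columns
x_true = np.zeros(p)
x_true[[5, 17]] = [1.0, -1.0]
y = A @ x_true
mu = coherence(A)
x_hat = thresholding_estimate(A, y, k)
```

Thresholding is the cheapest of the three algorithms but needs the strongest conditions; with a very sparse signal and a low-coherence random dictionary it succeeds, as here.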
Limiting laws of coherence of random matrices with applications to testing covariance structure and construction of compressed sensing matrices
 Ann. Stat.
, 2011
Cited by 30 (10 self)
Testing covariance structure is of significant interest in many areas of statistical analysis, and the construction of compressed sensing matrices is an important problem in signal processing. Motivated by these applications, we study in this paper the limiting laws of the coherence of an n×p random matrix in the high-dimensional setting where p can be much larger than n. Both the law of large numbers and the limiting distribution are derived. We then consider testing the bandedness of the covariance matrix of a high-dimensional Gaussian distribution, which includes testing for independence as a special case. The limiting laws of the coherence of the data matrix play a critical role in the construction of the test. We also apply the asymptotic results to the construction of compressed sensing matrices.
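The quantity studied, the coherence of a random data matrix, is the largest absolute Pearson correlation between columns, and it is cheap to simulate. The comparison against the 2·sqrt(log p / n) scaling below is our own illustration of the kind of law of large numbers the abstract refers to; the precise statements and constants are in the paper:

```python
import numpy as np

def sample_coherence(n, p, rng):
    """Coherence of an n x p matrix with iid standard normal entries:
    max absolute Pearson correlation over distinct column pairs."""
    X = rng.standard_normal((n, p))
    R = np.corrcoef(X, rowvar=False)    # p x p correlation matrix
    np.fill_diagonal(R, 0.0)
    return np.max(np.abs(R))

rng = np.random.default_rng(4)
n, p = 200, 2000                        # p much larger than n
L = sample_coherence(n, p, rng)
rate = 2.0 * np.sqrt(np.log(p) / n)    # illustrative comparison scale
```

Even at these moderate sizes the sampled coherence sits near the comparison scale, far from zero, which is why coherence-based recovery conditions become restrictive when p >> n.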
Recovery of sparsely corrupted signals
 IEEE Trans. Inf. Theory
, 2012
Cited by 25 (8 self)
Abstract—We investigate the recovery of signals exhibiting a sparse representation in a general (i.e., possibly redundant or incomplete) dictionary that are corrupted by additive noise admitting a sparse representation in another general dictionary. This setup covers a wide range of applications, such as image inpainting, super-resolution, signal separation, and recovery of signals impaired by, e.g., clipping, impulse noise, or narrowband interference. We present deterministic recovery guarantees based on a novel uncertainty relation for pairs of general dictionaries, and we provide corresponding practicable recovery algorithms. The recovery guarantees we find depend on the signal and noise sparsity levels, on the coherence parameters of the involved dictionaries, and on the amount of prior knowledge about the signal and noise support sets. Index Terms—Uncertainty relations, signal restoration, signal separation, coherence-based recovery guarantees, ℓ1-norm minimization, greedy algorithms.
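In the easiest regime the abstract mentions, full prior knowledge of both support sets, recovery reduces to least squares on the stacked restricted dictionaries. A sketch with made-up dimensions and supports (this is the oracle baseline, not one of the paper's algorithms for unknown supports):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 100
A = rng.standard_normal((n, 80))        # signal dictionary
B = rng.standard_normal((n, 80))        # noise/corruption dictionary
A /= np.linalg.norm(A, axis=0)
B /= np.linalg.norm(B, axis=0)

S = [3, 10, 41]                         # known signal support
T = [7, 22]                             # known noise support
x = np.zeros(80); x[S] = [1.0, -2.0, 0.5]
e = np.zeros(80); e[T] = [3.0, -1.0]
y = A @ x + B @ e                       # sparsely corrupted observation

# With both supports known, solve the overdetermined system [A_S  B_T] c = y.
M = np.hstack([A[:, S], B[:, T]])
c, *_ = np.linalg.lstsq(M, y, rcond=None)
x_hat = np.zeros(80); x_hat[S] = c[:len(S)]
e_hat = np.zeros(80); e_hat[T] = c[len(S):]
```

Since the stacked restricted dictionary has full column rank almost surely, both the signal and the corruption are recovered exactly; the paper's contribution is guarantees when the supports are partially or wholly unknown.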
Sharp RIP bound for sparse signal and low-rank matrix recovery
 Appl. Comput. Harmon. Anal.
, 2013
RIP-Based Near-Oracle Performance Guarantees for SP, CoSaMP, and IHT
Cited by 20 (4 self)
Abstract—This correspondence presents an average-case denoising performance analysis for the SP, CoSaMP, and IHT algorithms. The analysis considers the recovery of a noisy signal, under the assumptions that it is corrupted by additive random zero-mean white Gaussian noise and has a sparse representation with respect to a known dictionary D. The proposed analysis is based on the RIP, establishing a near-oracle performance guarantee for each of these algorithms. Beyond bounds for the reconstruction error that hold with high probability, this work also provides a bound for the average error. Index Terms—Additive white noise, compressed sensing, Gaussian noise, signal denoising, signal reconstruction, signal representations.
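Of the three algorithms analyzed, IHT is the simplest to sketch: a gradient step followed by hard thresholding to the k largest entries. The step size, dimensions, and debiasing step below are our own illustrative choices, not the paper's:

```python
import numpy as np

def hard_threshold(x, k):
    """Keep the k largest-magnitude entries of x, zero out the rest."""
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]
    out[idx] = x[idx]
    return out

def iht(A, y, k, n_iter=300):
    """Iterative hard thresholding: x <- H_k(x + mu * A^T (y - A x))."""
    mu = 1.0 / np.linalg.norm(A, 2) ** 2    # conservative step size
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = hard_threshold(x + mu * A.T @ (y - A @ x), k)
    return x

rng = np.random.default_rng(6)
n, p, k = 400, 100, 3
A = rng.standard_normal((n, p)) / np.sqrt(n)
x_true = np.zeros(p)
x_true[[9, 40, 77]] = [1.0, -2.0, 1.5]
y = A @ x_true                              # noiseless for the sketch
x_iht = iht(A, y, k)
# Debias: least squares on the detected support.
support = np.flatnonzero(x_iht)
x_hat = np.zeros(p)
x_hat[support], *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
```

Once IHT locks onto the correct support in the noiseless case, the least-squares debias makes the recovery exact; the paper's interest is the analogous near-oracle error when Gaussian noise is present.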
Phase transition in limiting distributions of coherence of high-dimensional random matrices
, 2012
Cited by 19 (5 self)
The coherence of a random matrix, which is defined to be the largest magnitude of the Pearson correlation coefficients between the columns of the random matrix, is an important quantity for a wide range of applications including high-dimensional statistics and signal processing. Inspired by these applications, this paper studies the limiting laws of the coherence of n × p random matrices for a full range of the dimension p, with a special focus on the ultra high-dimensional setting. Assuming the columns of the random matrix are independent random vectors with a common spherical distribution, we give a complete characterization of the behavior of the limiting distributions of the coherence. More specifically, the limiting distributions of the coherence are derived separately for three regimes of (1/n) log p, ranging from (1/n) log p → 0 to (1/n) log p → ∞. The results show that the limiting behavior of the coherence differs significantly in different regimes and exhibits interesting phase transition phenomena as the dimension p grows as a function of n. Applications to statistics and compressed sensing in the ultra high-dimensional setting are also discussed.
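The regime dependence is easy to see empirically: holding n fixed and growing p pushes the sampled coherence upward. This small simulation only illustrates that monotone growth; the actual phase-transition thresholds and limiting distributions are the subject of the paper:

```python
import numpy as np

def sample_coherence(n, p, rng):
    """Max absolute Pearson correlation between distinct columns of an
    n x p matrix with iid standard normal entries."""
    X = rng.standard_normal((n, p))
    R = np.corrcoef(X, rowvar=False)
    np.fill_diagonal(R, 0.0)
    return np.max(np.abs(R))

rng = np.random.default_rng(7)
n = 100
dims = (50, 500, 2000)                  # moving toward p >> n
cohs = [sample_coherence(n, p, rng) for p in dims]
```

At fixed n, each jump in p adds many more column pairs over which the maximum is taken, so the coherence keeps climbing toward 1 as p enters the ultra high-dimensional regime.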