Results 1–10 of 17
Various thresholds for ℓ1-optimization in compressed sensing, 2009
Abstract

Cited by 33 (17 self)
Recently, [14, 28] theoretically analyzed the success of a polynomial-time ℓ1-optimization algorithm in solving an underdetermined system of linear equations. In a large-dimensional and statistical context, [14, 28] proved that if the number of equations (measurements, in compressed sensing terminology) in the system is proportional to the length of the unknown vector, then there is a sparsity (number of nonzero elements of the unknown vector), also proportional to the length of the unknown vector, such that ℓ1-optimization succeeds in solving the system. In this paper, we provide an alternative performance analysis of ℓ1-optimization and obtain proportionality constants that in certain cases match or improve on the best currently known ones from [28, 29].
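The ℓ1-optimization program this abstract refers to can be cast as a linear program. A minimal sketch, not taken from the paper — the Gaussian measurement matrix, problem sizes, and the split-variable LP reformulation `x = u - v` are all illustrative assumptions:

```python
import numpy as np
from scipy.optimize import linprog

def l1_min(A, b):
    """Solve min ||x||_1 s.t. A x = b as an LP, writing x = u - v with u, v >= 0,
    so that ||x||_1 = sum(u) + sum(v) at the optimum."""
    m, n = A.shape
    c = np.ones(2 * n)                      # objective: sum(u) + sum(v)
    A_eq = np.hstack([A, -A])               # A u - A v = b
    res = linprog(c, A_eq=A_eq, b_eq=b, bounds=[(0, None)] * (2 * n))
    return res.x[:n] - res.x[n:]

rng = np.random.default_rng(0)
n, m, k = 50, 25, 3                         # vector length, measurements, sparsity
x0 = np.zeros(n)
x0[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n))             # illustrative Gaussian measurements
b = A @ x0                                  # underdetermined system: m < n

x_hat = l1_min(A, b)
print("max recovery error:", np.max(np.abs(x_hat - x0)))
```

With the sparsity well below the recovery threshold, as here, the ℓ1 solution coincides with the sparse generator up to solver tolerance; near the proportionality thresholds the abstract discusses, recovery starts to fail.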
Thresholded basis pursuit: Support recovery for sparse and approximately sparse signals, 2009
Abstract

Cited by 6 (3 self)
In this paper we present a linear programming solution for support recovery. Support recovery involves the estimation of the sign pattern of a sparse signal from a set of randomly projected noisy measurements. Our solution amounts to solving min ‖Z‖1 s.t. Y = GZ and quantizing/thresholding the resulting solution Z. We show that this scheme is guaranteed to perfectly reconstruct a discrete signal, or to control the element-wise reconstruction error for a continuous signal, for specific values of sparsity. We show that the sign pattern of X can be recovered with SNR = O(log n) and m = O(k log(n/k)) measurements, where k is the sparsity level and satisfies 0 < k ≤ αn for some nonzero constant α. Our proof technique is based on perturbation of the noiseless ℓ1 problem. Consequently, the maximum achievable sparsity level in the noisy problem is comparable to that of the noiseless problem. Our result offers a sharp characterization in that neither the SNR nor the sparsity ratio can be significantly improved. In contrast, previous results based on LASSO and max-correlation techniques either assume significantly larger SNR or sublinear sparsity. Our results have implications for approximately sparse problems: we show that the k largest coefficients of a non-sparse signal X can be recovered from m = O(k log(n/k)) random projections for certain classes of signals.
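The two-step scheme the abstract describes — solve min ‖Z‖1 s.t. Y = GZ on the noisy measurements, then threshold — can be sketched as follows. This is only an illustration under assumed parameters not taken from the paper: the Gaussian G, unit-magnitude signs, noise level, and threshold τ are all arbitrary choices:

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(G, y):
    """min ||z||_1 s.t. G z = y, via the split z = u - v with u, v >= 0."""
    m, n = G.shape
    c = np.ones(2 * n)
    res = linprog(c, A_eq=np.hstack([G, -G]), b_eq=y,
                  bounds=[(0, None)] * (2 * n))
    return res.x[:n] - res.x[n:]

rng = np.random.default_rng(1)
n, m, k = 50, 25, 3
x0 = np.zeros(n)
support = rng.choice(n, k, replace=False)
x0[support] = rng.choice([-1.0, 1.0], size=k)   # illustrative unit-magnitude entries
G = rng.standard_normal((m, n))
y = G @ x0 + 0.01 * rng.standard_normal(m)      # small additive measurement noise

z_hat = basis_pursuit(G, y)                     # step 1: equality-constrained BP on noisy y
tau = 0.25                                      # step 2: illustrative threshold
sign_hat = np.sign(z_hat) * (np.abs(z_hat) > tau)
print("sign pattern recovered:", np.array_equal(sign_hat, np.sign(x0)))
```

Note the equality constraint uses the noisy Y directly, as in the abstract; the perturbation argument cited there is what guarantees the ℓ1 solution stays close enough to the true signal for thresholding to recover the signs at sufficiently high SNR.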
Linear underdetermined systems with sparse solutions: Redirecting a challenge? (available at arXiv)