Results 1–9 of 9
Compressive Phase Retrieval via Generalized Approximate Message Passing
Abstract

Cited by 30 (8 self)
Abstract—In this paper, we propose a novel approach to compressive phase retrieval based on loopy belief propagation and, in particular, on the generalized approximate message passing (GAMP) algorithm. Numerical results show that the proposed PR-GAMP algorithm has excellent phase-transition behavior, noise robustness, and runtime. In particular, for successful recovery of synthetic Bernoulli-circular-Gaussian signals, PR-GAMP requires ≈ 4 times the number of measurements as a phase-oracle version of GAMP and, at moderate to large SNR, the NMSE of PR-GAMP is only ≈ 3 dB worse than that of phase-oracle GAMP. A comparison to the recently proposed convex-relaxation approach known as “CPRL” reveals PR-GAMP’s superior phase transition and orders-of-magnitude faster runtimes, especially as the problem dimensions increase. When applied to the recovery of a 65k-pixel grayscale image from 32k randomly masked magnitude measurements, numerical results show a median PR-GAMP runtime of only 13.4 seconds.
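The measurement model behind the abstract above is y = |Ax| for a sparse x. As a point of reference only — this is NOT the PR-GAMP algorithm — the sketch below alternates a naive phase guess with a soft-thresholded least-squares (ISTA) step on the same model; the function names, step size, and threshold schedule are our own illustrative choices.

```python
import numpy as np

def soft_threshold(v, t):
    """Elementwise soft-thresholding (promotes sparsity)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def alternating_phase_retrieval(y, A, lam=0.05, iters=200, seed=0):
    """Toy baseline for compressive phase retrieval: observe y = |A x|
    for sparse real x, then alternate between (i) guessing the missing
    signs from the current estimate and (ii) a sparsity-promoting
    least-squares (ISTA) step. This is NOT PR-GAMP -- just a simple
    reference point for the same measurement model."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = rng.standard_normal(n) * 0.1            # small random initialization
    step = 1.0 / np.linalg.norm(A, 2) ** 2      # ISTA step size (1/L)
    for _ in range(iters):
        z = A @ x
        phase = np.sign(z)                      # (i) sign guess from current estimate
        phase[phase == 0] = 1.0
        r = y * phase                           # sign-completed measurements
        x = soft_threshold(x + step * A.T @ (r - A @ x), step * lam)  # (ii) ISTA step
    return x
```

Recovery is at best up to a global sign, as in any real-valued phase-retrieval formulation.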
Message passing approaches to compressive inference under structured signal priors
, 2013
Abstract

Cited by 3 (3 self)
Across numerous disciplines, the ability to generate high-dimensional datasets is driving an enormous demand for increasingly efficient ways of both capturing and processing this data. A promising recent trend for addressing these needs has developed from the recognition that, despite living in high-dimensional ambient spaces, many datasets have vastly smaller intrinsic dimensionality. When capturing (sampling) such datasets, exploiting this realization permits one to dramatically reduce the number of samples that must be acquired without losing the salient features of the data. When processing such datasets, the reduced intrinsic dimensionality can be leveraged to allow reliable inferences to be made in scenarios where it is infeasible to collect the amount of data that would be required for inference using classical techniques. To date, most approaches for taking advantage of the low intrinsic dimensionality inherent in many datasets have focused on identifying succinct (i.e., sparse) representations of the data, seeking to represent the data using only a handful of “significant” elements from an appropriately chosen dictionary. While powerful in ...
Binary linear classification and feature selection via generalized approximate message passing
 in Information Sciences and Systems (CISS), 2014 48th Annual Conference on
, 2014
Abstract

Cited by 3 (1 self)
Abstract—For the problem of binary linear classification and feature selection, we propose algorithmic approaches to classifier design based on the generalized approximate message passing (GAMP) algorithm, recently proposed in the context of compressive sensing. We are particularly motivated by problems where the number of features greatly exceeds the number of training examples, but where only a few features suffice for accurate classification. We show that sum-product GAMP can be used to (approximately) minimize the classification error rate and max-sum GAMP can be used to minimize a wide variety of regularized loss functions. Furthermore, we describe an expectation-maximization (EM)-based scheme to learn the associated model parameters online, as an alternative to cross-validation, and we show that GAMP’s state-evolution framework can be used to accurately predict the misclassification rate. Finally, we present a detailed numerical study to confirm the accuracy, speed, and flexibility afforded by our GAMP-based approaches to binary linear classification and feature selection. Index Terms—Belief propagation, classification, feature selection, message passing, one-bit compressed sensing.
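The abstract says max-sum GAMP can minimize "a wide variety of regularized loss functions." To make one member of that family concrete, the sketch below minimizes the L1-regularized logistic loss by plain proximal gradient descent rather than GAMP; the function name and parameter values are illustrative assumptions, not the paper's.

```python
import numpy as np

def l1_logistic_prox_grad(X, b, lam=0.1, step=0.005, iters=1000):
    """Minimize the L1-regularized logistic loss
        sum_i log(1 + exp(-b_i * x_i^T w)) + lam * ||w||_1,
    one example of the regularized loss family the abstract says
    max-sum GAMP handles. Here we use plain proximal gradient descent
    instead of GAMP, just to make the objective concrete. b_i in {-1,+1}."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(iters):
        margins = np.clip(b * (X @ w), -50.0, 50.0)   # clip to avoid exp overflow
        # gradient of the logistic loss: -sum_i b_i x_i * sigmoid(-margin_i)
        g = -(X.T @ (b * (1.0 / (1.0 + np.exp(margins)))))
        w = w - step * g
        # proximal step for the L1 penalty = soft-thresholding
        w = np.sign(w) * np.maximum(np.abs(w) - step * lam, 0.0)
    return w
```

The proximal step is what a max-sum message-passing scheme would realize through its scalar max operations; the smooth gradient step stands in for the linear message updates.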
Approximate Message Passing-based Compressed Sensing Reconstruction with Generalized Elastic Net
Abstract
In this paper, we study the compressed sensing reconstruction problem with a generalized elastic net prior (GENP), where a sparse signal is sampled via a noisy underdetermined linear observation system, and an additional initial estimate of the signal (the GENP) is available during the reconstruction. We first incorporate the GENP into the LASSO and the approximate message passing (AMP) frameworks, denoted by GENP-LASSO and GENP-AMP respectively. We then focus on GENP-AMP and investigate its parameter selection, state evolution, and noise-sensitivity analysis. A practical parameterless version of GENP-AMP is also developed, which does not need to know the sparsity of the unknown signal or the variance of the GENP. Simulation results with 1-D data and two different imaging applications are presented to demonstrate the efficiency of the proposed schemes. Keywords: Compressed sensing, approximate message passing, elastic net prior, state evolution, phase transition.
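The GENP idea above can be made concrete as a LASSO objective with an extra quadratic pull toward the side-information estimate x0. The sketch below solves such an objective with ISTA rather than AMP; the exact weighting of the prior term is our assumption, not necessarily the paper's.

```python
import numpy as np

def genp_lasso_ista(y, A, x0, lam=0.1, gamma=0.5, iters=300):
    """Sketch of a GENP-LASSO-style objective:
        min_x  0.5*||y - A x||^2 + lam*||x||_1 + 0.5*gamma*||x - x0||^2,
    i.e. ordinary LASSO plus a quadratic pull toward the side
    information x0 (the generalized elastic net prior). Solved here by
    ISTA rather than AMP; the weighting of the prior term is assumed."""
    m, n = A.shape
    x = x0.copy()
    L = np.linalg.norm(A, 2) ** 2 + gamma        # Lipschitz constant of the smooth part
    for _ in range(iters):
        grad = A.T @ (A @ x - y) + gamma * (x - x0)
        v = x - grad / L                         # gradient step on the smooth terms
        x = np.sign(v) * np.maximum(np.abs(v) - lam / L, 0.0)  # prox of L1
    return x
```

With gamma = 0 this reduces to ordinary ISTA for the LASSO; the quadratic anchor is what lets the estimate beat both the measurements-only and prior-only baselines.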
Two-Dimensional Pattern-Coupled Sparse Bayesian Learning via Generalized Approximate Message Passing
Abstract
Abstract—We consider the problem of recovering two-dimensional (2-D) block-sparse signals with unknown cluster patterns. Two-dimensional block-sparse patterns arise naturally in many practical applications such as foreground detection and inverse synthetic aperture radar imaging. To exploit the block-sparse structure, we introduce a 2-D pattern-coupled hierarchical Gaussian prior model to characterize the statistical pattern dependencies among neighboring coefficients. Unlike the conventional hierarchical Gaussian prior model, where each coefficient is associated independently with a unique hyperparameter, the pattern-coupled prior for each coefficient involves not only its own hyperparameter but also its immediate neighboring hyperparameters. Thus the sparsity patterns of neighboring coefficients are related to each other, and the hierarchical model has the potential to encourage 2-D structured-sparse solutions. An expectation-maximization (EM) strategy is employed to obtain the maximum a posteriori (MAP) estimate of the hyperparameters, along with the posterior distribution of the sparse signal. In addition, the generalized approximate message passing (GAMP) algorithm is embedded into the EM framework to efficiently compute an approximation of the posterior distribution of the hidden variables, which results in a significant reduction in computational complexity. Numerical results are provided to illustrate the effectiveness of the proposed algorithm. Index Terms—Pattern-coupled sparse Bayesian learning, block-sparse structure, expectation-maximization (EM), generalized approximate message passing (GAMP).
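The pattern-coupled prior described above ties each coefficient's hyperparameter to those of its immediate neighbors. A minimal sketch of that coupling on a 2-D precision map follows; the combination rule (own value plus a weighted sum of the 4-neighborhood) is illustrative only, and the paper's exact rule may differ.

```python
import numpy as np

def coupled_precisions(alpha, beta=0.5):
    """Sketch of the 2-D pattern-coupled idea: the effective
    hyperparameter (precision) of each coefficient mixes its own alpha
    with those of its 4 immediate neighbors, so neighboring sparsity
    patterns reinforce each other. Edge cells reuse their own values
    via edge padding. Illustrative combination rule only."""
    a = np.asarray(alpha, dtype=float)
    padded = np.pad(a, 1, mode='edge')
    neighbors = (padded[:-2, 1:-1] + padded[2:, 1:-1] +    # up + down
                 padded[1:-1, :-2] + padded[1:-1, 2:])     # left + right
    return a + beta * neighbors
```

In an EM-GAMP loop, such a coupled map would replace the per-coefficient hyperparameters in the M-step, which is how neighboring coefficients come to share sparsity information.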
EmpiricalBayes Approaches to Recovery of Structured Sparse Signals via . . .
, 2015
Abstract
In recent years, there have been massive increases in both the dimensionality and sample sizes of data, due to ever-increasing consumer demand coupled with relatively inexpensive sensing technologies. These high-dimensional datasets bring challenges, such as complexity, along with numerous opportunities. Though many signals of interest live in a high-dimensional ambient space, they often have a much smaller inherent dimensionality which, if leveraged, leads to improved recoveries. For example, the notion of sparsity is a requisite in the compressive sensing (CS) field, which allows for accurate signal reconstruction from sub-Nyquist sampled measurements given certain conditions. When recovering a sparse signal from noisy compressive linear measurements, the distribution of the signal’s nonzero coefficients can have a profound effect on recovery mean-squared error (MSE). If this distribution is a priori known, then one could use computationally efficient approximate message passing (AMP) techniques that yield approximate minimum-MSE (MMSE) estimates or critical points to the maxi ...
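The "computationally efficient AMP techniques" mentioned above iterate a matched-filter step, a scalar denoiser, and an Onsager-corrected residual. Below is a textbook soft-thresholding AMP sketch; the threshold schedule and constants are common simple choices, not necessarily those used in the dissertation.

```python
import numpy as np

def amp_lasso(y, A, lam=1.5, iters=30):
    """Textbook AMP iteration for sparse recovery with a soft-threshold
    denoiser. Assumes A has roughly unit-norm columns (e.g. i.i.d.
    Gaussian entries scaled by 1/sqrt(m)). The threshold is lam times
    the empirical residual level -- a common simple schedule."""
    m, n = A.shape
    x = np.zeros(n)
    z = y.copy()
    for _ in range(iters):
        tau = np.sqrt(np.mean(z ** 2))                   # empirical effective-noise level
        r = x + A.T @ z                                  # pseudo-data (matched filter)
        x_new = np.sign(r) * np.maximum(np.abs(r) - lam * tau, 0.0)  # soft threshold
        onsager = (z / m) * np.count_nonzero(x_new)      # Onsager correction term
        z = y - A @ x_new + onsager                      # corrected residual
        x = x_new
    return x
```

The Onsager term is what distinguishes AMP from plain iterative thresholding: it keeps the pseudo-data r behaving like the true signal plus Gaussian noise, which is exactly the property that state-evolution analyses exploit.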
Approximate Message Passing-based Compressed Sensing Reconstruction with Generalized Elastic Net Prior (submitted to IEEE Trans. on Signal Processing)
Abstract
In this paper, we study the compressed sensing reconstruction problem with a generalized elastic net prior (GENP), where a sparse signal is sampled via a noisy underdetermined linear observation system, and an additional initial estimate of the signal (the GENP) is available during the reconstruction. We first incorporate the GENP into the LASSO and the approximate message passing (AMP) frameworks, denoted by GENP-LASSO and GENP-AMP respectively. We then investigate the parameter selection, state evolution, and noise-sensitivity analysis of GENP-AMP. We show that, thanks to the GENP, there is no phase transition boundary in the proposed frameworks, i.e., the reconstruction error is bounded in the entire plane. The error is also smaller than those of the standard AMP and scalar denoising. A practical parameterless version of GENP-AMP is also developed, which does not need to know the sparsity of the unknown signal or the variance of the GENP. Simulation results are presented to verify the efficiency of the proposed schemes. Index Terms—Compressed sensing, approximate message passing, elastic net prior, state evolution, phase transition.
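The state-evolution analysis the abstract refers to tracks a single effective-noise variance across AMP iterations. Below is a hedged scalar state-evolution recursion for plain soft-threshold AMP on a Bernoulli-Gaussian signal, evaluated by Monte Carlo; the GENP term itself is omitted, so this is only the baseline recursion, not the paper's GENP-AMP analysis.

```python
import numpy as np

def state_evolution(delta, eps, lam, sigma2, iters=60, mc=200000, seed=0):
    """Scalar state-evolution recursion for soft-threshold AMP on a
    Bernoulli-Gaussian signal: tracks the effective noise variance
    tau^2 across iterations via a 1-D Monte Carlo average.
    delta = m/n (measurement rate), eps = sparsity rate,
    lam = threshold multiplier, sigma2 = measurement noise variance.
    Standard baseline recursion; GENP's side-information term omitted."""
    rng = np.random.default_rng(seed)
    # Bernoulli-Gaussian samples of the signal coefficients
    x = rng.standard_normal(mc) * (rng.random(mc) < eps)
    tau2 = sigma2 + np.mean(x ** 2) / delta              # initial effective variance
    for _ in range(iters):
        noise = rng.standard_normal(mc)
        r = x + np.sqrt(tau2) * noise                    # scalar pseudo-data model
        t = lam * np.sqrt(tau2)
        xhat = np.sign(r) * np.maximum(np.abs(r) - t, 0.0)   # soft-threshold denoiser
        mse = np.mean((xhat - x) ** 2)
        tau2 = sigma2 + mse / delta                      # state-evolution update
    return tau2
```

A fixed point near sigma2 indicates successful recovery; the paper's claim that GENP removes the phase-transition boundary corresponds to this recursion staying bounded for every (delta, eps) pair once the side-information term is added.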
Compressive Phase Retrieval via Generalized Approximate Message Passing