Results 1–10 of 350
A Singular Value Thresholding Algorithm for Matrix Completion
, 2008
Cited by 539 (20 self)
Abstract: This paper introduces a novel algorithm to approximate the matrix with minimum nuclear norm among all matrices obeying a set of convex constraints. This problem may be understood as the convex relaxation of a rank minimization problem, and arises in many important applications, as in the task of recovering a large matrix from a small subset of its entries (the famous Netflix problem). Off-the-shelf algorithms such as interior point methods are not directly amenable to large problems of this kind, with over a million unknown entries. This paper develops a simple first-order and easy-to-implement algorithm that is extremely efficient at addressing problems in which the optimal solution has low rank. The algorithm is iterative and produces a sequence of matrices {X^k, Y^k}; at each step, it mainly performs a soft-thresholding operation on the singular values of the matrix Y^k. There are two remarkable features making this attractive for low-rank matrix completion problems. The first is that the soft-thresholding operation is applied to a sparse matrix; the second is that the rank of the iterates {X^k} is empirically nondecreasing. Both these facts allow the algorithm to make use of very minimal storage space and keep the computational cost of each iteration low.
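The iteration described in the abstract can be illustrated in a few lines of NumPy. This is a minimal sketch of singular value thresholding for matrix completion; the parameter choices (`tau`, `delta`, the iteration count) and the toy data are illustrative assumptions of ours, not the paper's recommended settings:

```python
import numpy as np

def svt_step(Y, tau):
    """Soft-threshold the singular values of Y at level tau."""
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def svt_complete(M, mask, tau=5.0, delta=1.2, iters=200):
    """Iterate  X_k = shrink(Y_{k-1}, tau),
                Y_k = Y_{k-1} + delta * P_Omega(M - X_k),
    where P_Omega keeps only the observed entries, so Y stays sparse."""
    Y = np.zeros_like(M)
    for _ in range(iters):
        X = svt_step(Y, tau)
        Y = Y + delta * mask * (M - X)
    return X

# Toy problem: a rank-1 matrix with 60% of its entries observed.
rng = np.random.default_rng(0)
M = np.outer(rng.standard_normal(20), rng.standard_normal(20))
mask = rng.random(M.shape) < 0.6
X = svt_complete(M, mask)
```

Each iteration touches a full SVD here; the efficiency claims in the abstract rely on `Y` being sparse and on computing only the leading singular values, which this sketch does not attempt.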
NESTA: A Fast and Accurate First-Order Method for Sparse Recovery
, 2009
Cited by 177 (2 self)
Abstract: Accurate signal recovery or image reconstruction from indirect and possibly undersampled data is a topic of considerable interest; for example, the literature in the recent field of compressed sensing is already quite immense. Inspired by recent breakthroughs in the development of novel first-order methods in convex optimization, most notably Nesterov's smoothing technique, this paper introduces a fast and accurate algorithm for solving common recovery problems in signal processing. In the spirit of Nesterov's work, one of the key ideas of this algorithm is a subtle averaging of sequences of iterates, which has been shown to improve the convergence properties of standard gradient-descent algorithms. This paper demonstrates that this approach is ideally suited for solving large-scale compressed sensing reconstruction problems, as 1) it is computationally efficient, 2) it is accurate and returns solutions with several correct digits, 3) it is flexible and amenable to many kinds of reconstruction problems, and 4) it is robust in the sense that its excellent performance across a wide range of problems does not depend on the fine tuning of several parameters. Comprehensive numerical experiments on realistic signals exhibiting a large dynamic range show that this algorithm compares favorably with recently proposed state-of-the-art methods. We also apply the algorithm to other problems for which there are fewer alternatives, such as total-variation minimization.
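NESTA itself solves a constrained problem with a continuation scheme, but its two main ingredients, Nesterov smoothing of the ℓ1 norm and accelerated gradient steps, can be sketched on an unconstrained surrogate. Everything below (`lam`, `mu`, the toy problem) is an illustrative assumption; this is not the constrained NESTA algorithm as published:

```python
import numpy as np

def huber_grad(x, mu):
    """Gradient of the mu-smoothed l1 norm (Nesterov/Huber smoothing):
       f_mu(x) = sum_i ( x_i^2/(2*mu) if |x_i| <= mu else |x_i| - mu/2 )."""
    return np.clip(x / mu, -1.0, 1.0)

def smoothed_l1_recovery(A, b, lam=0.1, mu=1e-2, iters=500):
    """Accelerated (Nesterov) gradient descent on the smoothed surrogate
         min_x  lam * f_mu(x) + 0.5*||A x - b||^2 ."""
    L = np.linalg.norm(A, 2) ** 2 + lam / mu   # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1]); yk = x.copy(); t = 1.0
    for _ in range(iters):
        grad = A.T @ (A @ yk - b) + lam * huber_grad(yk, mu)
        x_new = yk - grad / L
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        yk = x_new + ((t - 1.0) / t_new) * (x_new - x)  # momentum/averaging step
        x, t = x_new, t_new
    return x

# Toy problem: 3-sparse signal, 30 random measurements of dimension 60.
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 60)) / np.sqrt(30)
x0 = np.zeros(60); x0[[4, 20, 41]] = [1.5, -2.0, 1.0]
x_hat = smoothed_l1_recovery(A, A @ x0)
```

The momentum update on `yk` is the "subtle averaging of sequences of iterates" the abstract refers to; smaller `mu` means a tighter approximation of the ℓ1 norm at the cost of a larger Lipschitz constant.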
Bregmanized Nonlocal Regularization for Deconvolution and Sparse Reconstruction
Cited by 88 (9 self)
Abstract: We propose two algorithms based on Bregman iteration and an operator splitting technique for nonlocal TV regularization problems. The convergence of the algorithms is analyzed, and applications to deconvolution and sparse reconstruction are presented.
A primal-dual splitting method for convex optimization involving Lipschitzian, proximable and linear composite terms
, 2013
Cited by 60 (10 self)
Abstract: We propose a new first-order splitting algorithm for solving jointly the primal and dual formulations of large-scale convex minimization problems involving the sum of a smooth function with Lipschitzian gradient, a nonsmooth proximable function, and linear composite functions. This is a full splitting approach, in the sense that the gradient and the linear operators involved are applied explicitly without any inversion, while the nonsmooth functions are processed individually via their proximity operators. This work brings together and notably extends several classical splitting schemes, like the forward–backward and Douglas–Rachford methods, as well as the recent primal–dual method of Chambolle and Pock designed for problems with linear composite terms.
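A minimal sketch of a primal-dual splitting of this kind, applied to 1-D total-variation denoising: the smooth term enters only through its gradient, the nonsmooth term through the prox of its conjugate, and the linear operator only through matrix-vector products. The step sizes, difference operator, and toy signal are illustrative choices of ours, not necessarily the paper's exact scheme:

```python
import numpy as np

def primal_dual_tv(b, lam=1.0, iters=300):
    """min_x 0.5*||x - b||^2 + lam*||D x||_1 by primal-dual splitting:
       f(x) = 0.5*||x - b||^2 handled by its gradient (no inversion),
       h = lam*|.|_1 handled via the prox of its conjugate, a clip onto
       [-lam, lam], and D (forward differences) applied explicitly."""
    n = b.size
    D = lambda x: np.diff(x)                                        # R^n -> R^(n-1)
    Dt = lambda y: np.concatenate(([-y[0]], -np.diff(y), [y[-1]]))  # adjoint of D
    tau, sigma = 0.2, 1.0        # satisfies 1/tau - sigma*||D||^2 >= 1/2 (||D||^2 <= 4)
    x = np.zeros(n); y = np.zeros(n - 1)
    for _ in range(iters):
        x_new = x - tau * ((x - b) + Dt(y))                  # explicit gradient step
        y = np.clip(y + sigma * D(2 * x_new - x), -lam, lam)  # dual prox step
        x = x_new
    return x
```

No linear system is ever inverted, which is the "full splitting" property the abstract emphasizes.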
Geometric Applications of the Split Bregman Method: Segmentation and Surface Reconstruction
, 2009
Cited by 58 (7 self)
Abstract: Variational models for image segmentation have many applications, but can be slow to compute. Recently, globally convex segmentation models have been introduced which are very reliable, but contain TV regularizers, making them difficult to compute. The previously introduced Split Bregman method is a technique for fast minimization of L1-regularized functionals, and has been applied to denoising and compressed sensing problems. By applying the Split Bregman concept to image segmentation problems, we build fast solvers which can outperform more conventional schemes, such as duality-based methods and graph cuts. We also consider the related problem of surface reconstruction from unorganized data points, which is used for constructing level set representations in three dimensions.
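The Split Bregman idea can be sketched on the simplest case, 1-D anisotropic TV denoising; the segmentation functionals in the paper are more involved, and the parameters `lam` and `mu` here are illustrative:

```python
import numpy as np

def shrink(v, t):
    """Elementwise soft-thresholding (the d-subproblem's closed form)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def split_bregman_tv(f, lam=2.0, mu=5.0, iters=100):
    """min_u (lam/2)*||u - f||^2 + ||D u||_1 via Split Bregman:
       split d = D u, then alternate
         u-step: solve (lam*I + mu*D^T D) u = lam*f + mu*D^T (d - b)
         d-step: d = shrink(D u + b, 1/mu)
         b-step: b = b + D u - d      (Bregman update)"""
    n = f.size
    D = np.diff(np.eye(n), axis=0)          # (n-1) x n forward differences
    A = lam * np.eye(n) + mu * D.T @ D
    d = np.zeros(n - 1); bb = np.zeros(n - 1)
    u = f.copy()
    for _ in range(iters):
        u = np.linalg.solve(A, lam * f + mu * D.T @ (d - bb))
        Du = D @ u
        d = shrink(Du + bb, 1.0 / mu)
        bb = bb + Du - d
    return u

# Toy problem: a step edge; TV keeps it nearly piecewise constant.
f = np.concatenate([np.zeros(8), 4.0 * np.ones(8)])
u = split_bregman_tv(f)
```

The speed comes from the structure: the L1 part reduces to elementwise shrinkage and the coupled part to a well-conditioned linear solve (a fast transform or Gauss–Seidel sweep in the image setting).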
Restoration of Poissonian images using alternating direction optimization
 IEEE Trans. Image Process
, 2010
Cited by 54 (5 self)
Abstract: Much research has been devoted to the problem of restoring Poissonian images, namely for medical and astronomical applications. However, the restoration of these images using state-of-the-art regularizers (such as those based upon multiscale representations or total variation) is still an active research area, since the associated optimization problems are quite challenging. In this paper, we propose an approach to deconvolving Poissonian images, which is based upon an alternating direction optimization method. The standard regularization [or maximum a posteriori (MAP)] restoration criterion, which combines the Poisson log-likelihood with a (nonsmooth) convex regularizer (log-prior), leads to hard optimization problems: the log-likelihood is nonquadratic and nonseparable, the regularizer is nonsmooth, and there is a nonnegativity constraint. Using standard convex analysis tools, we present sufficient conditions for existence and uniqueness of solutions of these optimization problems, for several types of regularizers: total-variation, frame-based analysis, and frame-based synthesis. We attack these problems with an instance of the alternating direction method of multipliers (ADMM), which belongs to the family of augmented Lagrangian algorithms. We study sufficient conditions for convergence and show that these are satisfied, either under total-variation or frame-based (analysis and synthesis) regularization. The resulting algorithms are shown to outperform alternative state-of-the-art methods, both in terms of speed and restoration accuracy.
Index Terms: Alternating direction methods, augmented Lagrangian, convex optimization, image deconvolution, image restoration, Poisson images.
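A key ingredient of such ADMM splittings is that once an auxiliary variable is introduced for the blurred image, the Poisson log-likelihood term becomes separable across pixels and its proximity operator has a closed form. A sketch of that elementwise prox (notation ours):

```python
import numpy as np

def prox_poisson_nll(v, y, gamma):
    """Elementwise prox of the Poisson negative log-likelihood f(x) = x - y*log(x):
         argmin_{x > 0}  x - y*log(x) + (1/(2*gamma))*(x - v)^2 .
       Setting the derivative to zero gives x^2 + (gamma - v)*x - gamma*y = 0,
       and the nonnegative root is returned."""
    t = v - gamma
    return 0.5 * (t + np.sqrt(t * t + 4.0 * gamma * y))
```

Inside an ADMM loop this is applied once per pixel per iteration, which is how the nonquadratic likelihood, the nonsmooth regularizer, and the nonnegativity constraint can each be handled in their own subproblem.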
Augmented Lagrangian method, dual methods, and split Bregman iteration for ROF, vectorial TV, and high order models
 SIAM Journal on Imaging Sciences
Cited by 51 (8 self)
Abstract: The Rudin–Osher–Fatemi (ROF) model [L. Rudin, S. Osher, and E. Fatemi, Physica D, 60 (1992), pp. 259–268] based on total variation (TV) minimization has proven to be very useful. A lot of effort has been devoted to obtaining fast numerical schemes and overcoming the nondifferentiability of the model. Methods considered to be particularly efficient for the ROF model include the dual methods of Chan–Golub–Mulet (CGM) [T. F. Chan, G. H. Golub, and P. Mulet, SIAM J. Sci. Comput., 20 (1999), pp. 1964–1977] and Chambolle [A. Chambolle, J. Math. Imaging ...].
Fast algorithms for nonconvex compressive sensing: MRI reconstruction from very few data
 Int. Symp. Biomedical Imaging
, 2009
Cited by 48 (2 self)
Abstract: Compressive sensing is the reconstruction of sparse images or signals from very few samples, by means of solving a tractable optimization problem. In the context of MRI, this can allow reconstruction from many fewer k-space samples, thereby reducing scanning time. Previous work has shown that nonconvex optimization reduces still further the number of samples required for reconstruction, while still being tractable. In this work, we extend recent Fourier-based algorithms for convex optimization to the nonconvex setting, and obtain methods that combine the reconstruction abilities of previous nonconvex approaches with the computational speed of state-of-the-art convex methods.
Index Terms: Magnetic resonance imaging, image reconstruction, compressive sensing, nonconvex optimization.
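One simple way to push a thresholding-based solver toward an ℓp (p < 1) penalty is to make the threshold depend on the coefficient magnitude. The operator below is an illustrative generalized shrinkage, not necessarily the exact operator used in the paper; p = 1 recovers ordinary soft-thresholding:

```python
import numpy as np

def shrink_p(t, tau, p=0.5, eps=1e-8):
    """Soft-thresholding with a magnitude-dependent threshold tau*|t|^(p-1):
       small coefficients are suppressed more aggressively, and large ones
       are biased less, than with the constant l1 threshold tau."""
    mag = np.abs(t)
    thresh = tau * np.power(mag + eps, p - 1.0)
    return np.sign(t) * np.maximum(mag - thresh, 0.0)
```

Dropping such an operator into an iterative thresholding loop is one route to the behavior the abstract describes: sparser solutions from fewer samples, at the cost of convexity.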
Signal Restoration with Overcomplete Wavelet Transforms: Comparison of Analysis and Synthesis Priors
Cited by 47 (5 self)
Abstract: The variational approach to signal restoration calls for the minimization of a cost function that is the sum of a data fidelity term and a regularization term, the latter term constituting a 'prior'. A synthesis prior represents the sought signal as a weighted sum of 'atoms'. On the other hand, an analysis prior models the coefficients obtained by applying the forward transform to the signal. For orthonormal transforms, the synthesis prior and analysis prior are equivalent; however, for overcomplete transforms the two formulations are different. We compare analysis and synthesis ℓ1-norm regularization with overcomplete transforms for denoising and deconvolution.
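The distinction can be written compactly. With y the data, H the observation operator, and Φ the (possibly overcomplete) dictionary or transform (notation ours):

```latex
% Synthesis prior: estimate coefficients w, then synthesize the signal
\hat{x}_{\mathrm{syn}} = \Phi\hat{w},\qquad
\hat{w} \in \arg\min_{w}\ \tfrac{1}{2}\|y - H\Phi w\|_2^2 + \lambda\|w\|_1 .

% Analysis prior: penalize the forward transform of the signal itself
\hat{x}_{\mathrm{ana}} \in \arg\min_{x}\ \tfrac{1}{2}\|y - Hx\|_2^2 + \lambda\|\Phi^{*}x\|_1 .
```

When Φ is orthonormal, substituting x = Φw turns one problem into the other, which is why the two priors coincide only in the orthonormal case.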
Alternating direction algorithms for ℓ1-problems in compressive sensing
, 2009
Cited by 46 (5 self)
Abstract: In this paper, we propose and study the use of alternating direction algorithms for several ℓ1-norm minimization problems arising from sparse solution recovery in compressive sensing, including the basis pursuit problem and the basis-pursuit denoising problems of both unconstrained and constrained forms, as well as others. We present and investigate two classes of algorithms derived from either the primal or the dual forms of the ℓ1-problems. The construction of the algorithms consists of two main steps: (1) reformulate an ℓ1-problem into one having partially separable objective functions by adding new variables and constraints; and (2) apply an exact or inexact alternating direction method to the resulting problem. The derived alternating direction algorithms can be regarded as first-order primal-dual algorithms, because both primal and dual variables are updated at each and every iteration. Convergence properties of these algorithms are established or restated when they already exist. Extensive numerical results in comparison with several state-of-the-art algorithms are given to demonstrate that the proposed algorithms are efficient, stable, and robust. Moreover, we present numerical results to emphasize two practically important but perhaps overlooked points: one is that algorithm speed should always be evaluated relative to appropriate solution accuracy; another is that whenever erroneous measurements possibly exist, the ℓ1-norm fidelity should be the fidelity of choice in compressive sensing.
Key words: Sparse solution recovery, compressive sensing, ℓ1-minimization, primal, dual, alternating direction method.
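Steps (1) and (2) of the construction can be sketched for the unconstrained basis-pursuit denoising form; the particular splitting, parameter values, and toy data below are illustrative, not the paper's tuned algorithms:

```python
import numpy as np

def soft(v, t):
    """Elementwise soft-thresholding (shrinkage)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_l1(A, b, mu=10.0, rho=1.0, iters=300):
    """ADMM for  min_x ||x||_1 + (mu/2)*||A x - b||^2 :
       step (1): add z with the constraint x = z, making the objective
                 partially separable;
       step (2): alternate
         x-step: solve (mu*A^T A + rho*I) x = mu*A^T b + rho*(z - u)
         z-step: z = soft(x + u, 1/rho)     (elementwise shrinkage)
         u-step: u = u + x - z              (scaled dual update)"""
    n = A.shape[1]
    Q = mu * A.T @ A + rho * np.eye(n)
    Atb = mu * A.T @ b
    z = np.zeros(n); u = np.zeros(n)
    for _ in range(iters):
        x = np.linalg.solve(Q, Atb + rho * (z - u))
        z = soft(x + u, 1.0 / rho)
        u = u + x - z
    return z

# Toy problem: 3-sparse signal, 30 random measurements of dimension 60.
rng = np.random.default_rng(1)
A = rng.standard_normal((30, 60)) / np.sqrt(30)
x0 = np.zeros(60); x0[[5, 17, 40]] = [2.0, -3.0, 1.5]
x_hat = admm_l1(A, A @ x0)
```

Both the primal variable x (and z) and the dual variable u are updated every iteration, which is the sense in which the abstract calls these first-order primal-dual algorithms.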