Results 1–10 of 53
Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers, 2010.
An augmented Lagrangian approach to the constrained optimization formulation of imaging inverse problems. IEEE Trans. Image Process., 2011. Cited by 92 (9 self).
Abstract—We propose a new fast algorithm for solving one of the standard approaches to ill-posed linear inverse problems (IPLIP), where a (possibly nonsmooth) regularizer is minimized under the constraint that the solution explains the observations sufficiently well. Although the regularizer and constraint are usually convex, several particular features of these problems (huge dimensionality, nonsmoothness) preclude the use of off-the-shelf optimization tools and have stimulated a considerable amount of research. In this paper, we propose a new efficient algorithm to handle one class of constrained problems (often known as basis pursuit denoising) tailored to image recovery applications. The proposed algorithm, which belongs to the family of augmented Lagrangian methods, can be used to deal with a variety of imaging IPLIP, including deconvolution and reconstruction from compressive observations (such as MRI), using either total-variation or wavelet-based (or, more generally, frame-based) regularization. The proposed algorithm is an instance of the so-called alternating direction method of multipliers, for which sufficient conditions for convergence are known; we show that these conditions are satisfied by the proposed algorithm. Experiments on a set of image restoration and reconstruction benchmark problems show that the proposed algorithm is a strong contender for the state of the art. Index Terms—Convex optimization, frames, image reconstruction, image restoration, inpainting, total variation.
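The alternating direction method of multipliers referenced above alternates a quadratic data-fit step, a proximal step on the regularizer, and a dual update. As a hedged illustration only (this is the standard penalized ℓ1 variant, not the paper's constrained algorithm; all names are illustrative):

```python
import numpy as np

def soft_threshold(v, t):
    """Proximity operator of t*||.||_1 (elementwise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_lasso(A, b, lam, rho=1.0, n_iter=500):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 via ADMM.

    Splitting: minimize f(x) + g(z) subject to x = z, with
    f(x) = 0.5*||Ax - b||^2 and g(z) = lam*||z||_1.
    """
    m, n = A.shape
    x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)  # u: scaled dual
    M = A.T @ A + rho * np.eye(n)   # factor once, reused every x-update
    Atb = A.T @ b
    for _ in range(n_iter):
        x = np.linalg.solve(M, Atb + rho * (z - u))   # quadratic subproblem
        z = soft_threshold(x + u, lam / rho)          # prox of the l1 term
        u = u + x - z                                 # dual ascent step
    return z
```

With `A` the identity, the iterates converge to the closed-form solution `soft_threshold(b, lam)`, which gives a quick sanity check of the splitting.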
Generalized forward-backward splitting, 2011.
"... This paper introduces the generalized forwardbackward splitting algorithm for minimizing convex functions of the form F + ∑ n i=1 Gi, where F has a Lipschitzcontinuous gradient and the Gi’s are simple in the sense that their Moreau proximity operators are easy to compute. While the forwardbackwar ..."
Abstract

Cited by 48 (9 self)
 Add to MetaCart
This paper introduces the generalized forward-backward splitting algorithm for minimizing convex functions of the form F + ∑_{i=1}^n G_i, where F has a Lipschitz-continuous gradient and the G_i's are simple in the sense that their Moreau proximity operators are easy to compute. While the forward-backward algorithm cannot deal with more than n = 1 nonsmooth function, our method generalizes it to the case of arbitrary n. Our method makes explicit use of the regularity of F in the forward step, and the proximity operators of the G_i's are applied in parallel in the backward step. This allows the generalized forward-backward algorithm to efficiently address an important class of convex problems. We prove its convergence in infinite dimension, and its robustness to errors in the computation of the proximity operators and of the gradient of F. Examples on inverse problems in imaging demonstrate the advantage of the proposed method in comparison to other splitting algorithms.
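The iteration described above keeps one auxiliary variable per nonsmooth term, applies the proximity operators in parallel, and averages. A minimal sketch under stated assumptions (unit relaxation, equal weights, step size below 2/L; function names are illustrative):

```python
import numpy as np

def gfb(grad_F, proxes, x0, gamma=0.9, n_iter=500):
    """Generalized forward-backward: minimize F + sum_i G_i.

    grad_F : gradient of the smooth term F (assumed L-Lipschitz, gamma < 2/L)
    proxes : list of functions p(v, t) computing prox_{t*G_i}(v)
    """
    n = len(proxes)
    w = 1.0 / n                       # equal weights summing to one
    x = x0.copy()
    z = [x0.copy() for _ in proxes]   # one auxiliary variable per G_i
    for _ in range(n_iter):
        g = grad_F(x)
        for i, prox in enumerate(proxes):
            # forward (gradient) step on F, backward (prox) step on G_i;
            # the n prox evaluations are independent and parallelizable
            z[i] = z[i] + prox(2 * x - z[i] - gamma * g, gamma / w) - x
        x = w * sum(z)                # average the auxiliary variables
    return x
```

As a toy check, minimizing 0.5‖x − b‖² + λ‖x‖₁ over the nonnegative orthant (two nonsmooth terms) has the closed-form solution max(b − λ, 0) componentwise.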
Online Alternating Direction Method. In ICML, 2012.
"... Online optimization has emerged as powerful tool in large scale optimization. In this paper, we introduce efficient online algorithms based on the alternating directions method (ADM). We introduce a new proof technique for ADM in the batch setting, which yields the O(1/T) convergence rate of ADM and ..."
Abstract

Cited by 37 (9 self)
 Add to MetaCart
(Show Context)
Online optimization has emerged as a powerful tool in large-scale optimization. In this paper, we introduce efficient online algorithms based on the alternating direction method (ADM). We introduce a new proof technique for ADM in the batch setting, which yields the O(1/T) convergence rate of ADM and forms the basis of regret analysis in the online setting. We consider two scenarios in the online setting, based on whether or not the solution needs to lie in the feasible set. In both settings, we establish regret bounds for both the objective function and the constraint violation, for general and strongly convex functions. Preliminary results are presented to illustrate the performance of the proposed algorithms.
A splitting-based iterative algorithm for accelerated statistical X-ray CT reconstruction. IEEE Transactions on Medical Imaging, 2012.
"... Abstract—Statistical image reconstruction using penalized weighted leastsquares (PWLS) criteria can improve imagequality in Xray CT. However, the huge dynamic range of the statistical weights leads to a highly shiftvariant inverse problem making it difficult to precondition and accelerate existi ..."
Abstract

Cited by 27 (8 self)
 Add to MetaCart
(Show Context)
Abstract—Statistical image reconstruction using penalized weighted least-squares (PWLS) criteria can improve image quality in X-ray CT. However, the huge dynamic range of the statistical weights leads to a highly shift-variant inverse problem, making it difficult to precondition and accelerate existing iterative algorithms that attack the statistical model directly. We propose to alleviate the problem by using a variable-splitting scheme that separates the shift-variant and (“nearly”) invariant components of the statistical data model and also decouples the regularization term. This leads to an equivalent constrained problem that we tackle using the classical method-of-multipliers framework with alternating minimization. The specific form of our splitting yields an alternating direction method of multipliers (ADMM) algorithm with an inner step involving a “nearly” shift-invariant linear system that is suitable for FFT-based preconditioning using cone-type filters. The proposed method can efficiently handle a variety of convex regularization criteria, including smooth edge-preserving regularizers and nonsmooth sparsity-promoting ones based on the ℓ1-norm and total variation. Numerical experiments with synthetic and real in vivo human data illustrate that cone-filter preconditioners accelerate the proposed ADMM, resulting in fast convergence compared to conventional (nonlinear conjugate gradient, ordered subsets) and state-of-the-art (MFISTA, split-Bregman) algorithms that are applicable to CT.
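The “nearly” shift-invariant inner system mentioned above is what makes FFT methods attractive: a circulant (periodic convolution) operator is diagonalized by the DFT, so a system of the form (HᵀH + ρI)x = r costs one FFT pair. A hedged sketch of that building block only; the paper's actual system and cone filters are more involved:

```python
import numpy as np

def solve_circulant_plus_identity(h, r, rho):
    """Solve (H^T H + rho*I) x = r where H is circulant convolution
    with first column h, using the FFT diagonalization of H:
    eigenvalues of H^T H are |fft(h)|^2."""
    eig = np.abs(np.fft.fft(h)) ** 2
    return np.real(np.fft.ifft(np.fft.fft(r) / (eig + rho)))
```

For a delta kernel (H = I) this reduces to x = r/(1 + ρ), and for a general kernel the result can be checked against the explicit circulant matrix.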
Parallel proximal algorithm for image restoration using hybrid regularization. IEEE Transactions on Image Processing, 2011.
"... Regularization approaches have demonstrated their effectiveness for solving illposed problems. However, in the context of variational restoration methods, a challenging question remains, namely how to find a good regularizer. While total variation introduces staircase effects, wavelet domain regula ..."
Abstract

Cited by 25 (8 self)
 Add to MetaCart
(Show Context)
Regularization approaches have demonstrated their effectiveness for solving ill-posed problems. However, in the context of variational restoration methods, a challenging question remains, namely how to find a good regularizer. While total variation introduces staircase effects, wavelet-domain regularization brings other artefacts, e.g., ringing. However, a trade-off can be made by introducing a hybrid regularization including several terms not necessarily acting in the same domain (e.g., spatial and wavelet-transform domains). While this approach was shown to provide good results for solving deconvolution problems in the presence of additive Gaussian noise, an important issue is to efficiently deal with this hybrid regularization for more general noise models. To solve this problem, we adopt a convex optimization framework where the criterion to be minimized is split into a sum of more than two terms. For spatial-domain regularization, isotropic and anisotropic total-variation definitions using various gradient filters are considered. An accelerated version of the Parallel Proximal Algorithm is proposed to perform the minimization. Some difficulties in the computation of the proximity operators involved in this algorithm are also addressed in this paper. Numerical experiments performed in the context of Poisson data recovery show the good behaviour of the algorithm as well as promising results concerning the use of hybrid regularization techniques.
A fast algorithm for the constrained formulation of compressive image reconstruction and other linear inverse problems, 2009. Available at http://arxiv.org/abs/0909.3947v1
"... Illposed linear inverse problems (ILIP), such as restoration and reconstruction, are a core topic of signal/image processing. A standard approach to deal with ILIP uses a constrained optimization problem, where a regularization function is minimized under the constraint that the solution explains t ..."
Abstract

Cited by 14 (4 self)
 Add to MetaCart
(Show Context)
Ill-posed linear inverse problems (ILIP), such as restoration and reconstruction, are a core topic of signal/image processing. A standard approach to deal with ILIP uses a constrained optimization problem, where a regularization function is minimized under the constraint that the solution explains the observations sufficiently well. The regularizer and constraint are usually convex; however, several particular features of these problems (huge dimensionality, nonsmoothness) preclude the use of off-the-shelf optimization tools and have stimulated much research. In this paper, we propose a new efficient algorithm to handle one class of constrained problems (known as basis pursuit denoising) tailored to image recovery applications.
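One building block common to such constrained formulations: once the data-fit constraint is isolated by variable splitting, "the solution explains the observations sufficiently well" becomes a Euclidean ball around the observations, whose projection is closed-form. A minimal hedged sketch (names illustrative, not the paper's full algorithm):

```python
import numpy as np

def project_ball(v, center, eps):
    """Euclidean projection onto the ball {u : ||u - center||_2 <= eps}.
    Points inside the ball are left unchanged; points outside are
    pulled radially back to the boundary."""
    d = v - center
    nrm = np.linalg.norm(d)
    if nrm <= eps:
        return v.copy()
    return center + (eps / nrm) * d
```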
Minimization and parameter estimation for seminorm regularization models with I-divergence constraints, 2012.
"... In this papers we analyze the minimization of seminorms ‖L · ‖ on R n under the constraint of a bounded Idivergence D(b,H·) for rather general linear operators H and L. The Idivergence is also known as KullbackLeibler divergence and appears in many models in imaging science, in particular when d ..."
Abstract

Cited by 13 (2 self)
 Add to MetaCart
(Show Context)
In this paper we analyze the minimization of seminorms ‖L·‖ on ℝⁿ under the constraint of a bounded I-divergence D(b, H·) for rather general linear operators H and L. The I-divergence is also known as the Kullback-Leibler divergence and appears in many models in imaging science, in particular when dealing with Poisson data. Often H represents, e.g., a linear blur operator and L is some discrete derivative or frame analysis operator. We prove relations between the parameters of I-divergence constrained and penalized problems without assuming the uniqueness of their minimizers. To solve the I-divergence constrained problem we apply first-order primal-dual algorithms which reduce the problem to the solution of certain proximal minimization problems in each iteration step. One of these proximal problems is an I-divergence constrained least-squares problem which can be solved via Morozov's discrepancy principle by a Newton method. Interestingly, the algorithm produces not only a sequence of vectors which converges to a minimizer of the constrained problem, but also a sequence of parameters which converges to a regularization parameter such that the corresponding penalized problem has the same solution as our constrained one. We demonstrate the performance of various algorithms for different image restoration tasks, both for images corrupted by Poisson noise and by multiplicative Gamma noise.
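The parameter selection above rests on the discrepancy being monotone in the regularization parameter. As a simplified, hedged illustration with a quadratic discrepancy in place of the I-divergence (the paper itself uses a Newton method on the I-divergence constraint), here is a root search on a diagonalized Tikhonov problem:

```python
import numpy as np

def discrepancy_parameter(s, b, tau, lo=1e-8, hi=1e8, n_iter=100):
    """Find alpha such that ||H x_alpha - b||^2 = tau for the Tikhonov
    solutions x_alpha = s*b/(s^2 + alpha) of a diagonalized problem
    with singular values s (Morozov's discrepancy principle).
    The discrepancy is increasing in alpha, so bisection suffices."""
    def disc(alpha):
        x = s * b / (s ** 2 + alpha)
        return np.sum((s * x - b) ** 2)
    for _ in range(n_iter):
        mid = np.sqrt(lo * hi)        # geometric bisection over many scales
        if disc(mid) < tau:
            lo = mid                  # too little regularization: raise alpha
        else:
            hi = mid
    return np.sqrt(lo * hi)
```

For s = b = (1, 1) the discrepancy is 2(α/(1+α))², so τ = 0.5 gives α = 1 exactly, which makes a convenient check.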
Bregman Alternating Direction Method of Multipliers
"... The mirror descent algorithm (MDA) generalizes gradient descent by using a Bregman divergence to replace squared Euclidean distance. In this paper, we similarly generalize the alternating direction method of multipliers (ADMM) to Bregman ADMM (BADMM), which allows the choice of different Bregman div ..."
Abstract

Cited by 10 (1 self)
 Add to MetaCart
The mirror descent algorithm (MDA) generalizes gradient descent by using a Bregman divergence in place of the squared Euclidean distance. In this paper, we similarly generalize the alternating direction method of multipliers (ADMM) to Bregman ADMM (BADMM), which allows the choice of different Bregman divergences to exploit the structure of problems. BADMM provides a unified framework for ADMM and its variants, including generalized ADMM, inexact ADMM, and Bethe ADMM. We establish the global convergence and the O(1/T) iteration complexity of BADMM. In some cases, BADMM can be faster than ADMM by a factor of O(n/log(n)). In solving the linear program of the mass transportation problem, BADMM leads to massive parallelism and can easily run on a GPU; it is several times faster than the highly optimized commercial software Gurobi.
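The mirror-descent idea that BADMM builds on is easy to see in isolation: with the KL divergence as the Bregman distance, the gradient step becomes a multiplicative update that keeps the iterates on the probability simplex. A hedged sketch of entropic mirror descent (not BADMM itself; names illustrative):

```python
import numpy as np

def mirror_descent_simplex(grad, x0, eta=0.1, n_iter=200):
    """Entropic mirror descent: gradient descent with the KL Bregman
    divergence. The update x <- x * exp(-eta * grad) followed by
    renormalization is the Bregman projection onto the simplex."""
    x = x0.copy()
    for _ in range(n_iter):
        x = x * np.exp(-eta * grad(x))   # multiplicative update
        x = x / x.sum()                  # project back onto the simplex
    return x
```

Minimizing a linear objective ⟨c, x⟩ over the simplex, the iterates concentrate on the coordinate with the smallest cost, which gives a quick check.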
Relaxing tight frame condition in parallel proximal methods for signal restoration. IEEE Transactions on Signal Processing, 2012.