Results 1–10 of 53
Proximal Splitting Methods in Signal Processing
Abstract
Cited by 266 (31 self)
The proximity operator of a convex function is a natural extension of the notion of a projection operator onto a convex set. This tool, which plays a central role in the analysis and the numerical solution of convex optimization problems, has recently been introduced in the arena of inverse problems and, especially, in signal processing, where it has become increasingly important. In this paper, we review the basic properties of proximity operators which are relevant to signal processing and present optimization methods based on these operators. These proximal splitting methods are shown to capture and extend several well-known algorithms in a unifying framework. Applications of proximal methods in signal recovery and synthesis are discussed.
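As a minimal illustration of the point above: the proximity operator of the ℓ1-norm is elementwise soft thresholding, while the proximity operator of the indicator function of a convex set reduces exactly to the Euclidean projection onto that set, which is the sense in which prox generalizes projection. The function names and NumPy setting below are our own illustrative choices, not code from the paper:

```python
import numpy as np

def prox_l1(x, t):
    """Proximity operator of t * ||.||_1: elementwise soft thresholding."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def prox_box(x, lo, hi):
    """Proximity operator of the indicator function of the box [lo, hi]:
    this is just the Euclidean projection onto the set, illustrating
    that prox extends the projection operator."""
    return np.clip(x, lo, hi)
```

For example, `prox_l1` shrinks every component toward zero by `t` and sets small components exactly to zero, which is why it appears throughout sparse signal processing.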
Computational methods for sparse solution of linear inverse problems
, 2009
Abstract
Cited by 167 (0 self)
The goal of sparse approximation problems is to represent a target signal approximately as a linear combination of a few elementary signals drawn from a fixed collection. This paper surveys the major practical algorithms for sparse approximation. Specific attention is paid to computational issues, to the circumstances in which individual methods tend to perform well, and to the theoretical guarantees available. Many fundamental questions in electrical engineering, statistics, and applied mathematics can be posed as sparse approximation problems, making these algorithms versatile and relevant to a wealth of applications.
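One of the simplest greedy algorithms in this family is matching pursuit, sketched below under the assumption that the dictionary columns are unit-norm; the function name and toy setting are illustrative, not taken from the survey:

```python
import numpy as np

def matching_pursuit(D, y, n_iter):
    """Greedy sparse approximation: at each step, pick the dictionary
    atom (column of D, assumed unit-norm) most correlated with the
    current residual, and add its contribution to the coefficients."""
    x = np.zeros(D.shape[1])
    r = y.astype(float).copy()
    for _ in range(n_iter):
        corr = D.T @ r                  # correlations with the residual
        k = np.argmax(np.abs(corr))     # best-matching atom
        x[k] += corr[k]
        r -= corr[k] * D[:, k]          # update the residual
    return x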
Minimization of Nonsmooth, Nonconvex Functionals by Iterative Thresholding
, 2009
Abstract
Cited by 128 (2 self)
The consecutive numbering of the publications is determined by their chronological order. The aim of this preprint series is to make new research rapidly available for scientific discussion. Therefore, the responsibility for the contents lies solely with the authors. The publications will be distributed by the authors.
Generalized forward-backward splitting
, 2011
Abstract
Cited by 48 (9 self)
This paper introduces the generalized forward-backward splitting algorithm for minimizing convex functions of the form F + ∑_{i=1}^{n} G_i, where F has a Lipschitz-continuous gradient and the G_i's are simple in the sense that their Moreau proximity operators are easy to compute. While the forward-backward algorithm cannot deal with more than n = 1 nonsmooth function, our method generalizes it to the case of arbitrary n. Our method makes explicit use of the regularity of F in the forward step, and the proximity operators of the G_i's are applied in parallel in the backward step. This allows the generalized forward-backward algorithm to efficiently address an important class of convex problems. We prove its convergence in infinite dimension, and its robustness to errors in the computation of the proximity operators and of the gradient of F. Examples on inverse problems in imaging demonstrate the advantage of the proposed methods in comparison to other splitting algorithms.
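The structure of the iteration (an explicit gradient step on F, the proximity operators of the G_i applied in parallel on auxiliary variables, then an average) can be sketched as follows. This is a minimal version assuming equal weights 1/n and relaxation parameter 1; the toy problem, step size, and function names are our own illustrative choices:

```python
import numpy as np

def soft(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def gfb(grad_f, proxs, x0, gamma, n_iter=500):
    """Generalized forward-backward sketch (equal weights 1/n,
    relaxation 1): forward gradient step on F, then the prox of each
    G_i applied in parallel on an auxiliary variable z_i, averaged."""
    n = len(proxs)
    x = x0.copy()
    z = [x0.copy() for _ in range(n)]
    for _ in range(n_iter):
        g = grad_f(x)
        for i in range(n):
            # prox of (n * gamma) * G_i because each weight is 1/n
            z[i] = z[i] + proxs[i](2.0 * x - z[i] - gamma * g, n * gamma) - x
        x = sum(z) / n
    return x

# Toy problem with n = 2 nonsmooth terms:
# min 0.5||x - a||^2 + 0.1||x||_1 + indicator(x >= 0)
a = np.array([1.0, -1.0])
x = gfb(lambda x: x - a,
        [lambda v, t: soft(v, 0.1 * t),     # prox of t * (0.1 ||.||_1)
         lambda v, t: np.maximum(v, 0.0)],  # projection; t is irrelevant
        np.zeros(2), gamma=0.5)
```

Note that the plain forward-backward algorithm could not handle this problem directly, since it involves two nonsmooth terms.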
Signal Restoration with Overcomplete Wavelet Transforms: Comparison of Analysis and Synthesis Priors
Abstract
Cited by 47 (5 self)
The variational approach to signal restoration calls for the minimization of a cost function that is the sum of a data fidelity term and a regularization term, the latter term constituting a ‘prior’. A synthesis prior represents the sought signal as a weighted sum of ‘atoms’. On the other hand, an analysis prior models the coefficients obtained by applying the forward transform to the signal. For orthonormal transforms, the synthesis prior and analysis prior are equivalent; however, for overcomplete transforms the two formulations are different. We compare analysis and synthesis ℓ1-norm regularization with overcomplete transforms for denoising and deconvolution.
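A small sketch of the two formulations for the denoising case makes the stated equivalence concrete: for an orthonormal transform A, both problems share the closed-form solution x = A·soft(Aᵀy, λ). In the overcomplete case no such shared closed form exists and an iterative solver is needed; everything below is an illustrative assumption, not the paper's code:

```python
import numpy as np

def soft(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def synthesis_denoise(y, A, lam):
    """Synthesis prior: min_w 0.5||y - A w||^2 + lam ||w||_1, x = A w.
    For orthonormal A the minimizer is w* = soft(A.T y, lam)."""
    return A @ soft(A.T @ y, lam)

def analysis_denoise(y, A, lam):
    """Analysis prior: min_x 0.5||y - x||^2 + lam ||A.T x||_1.
    Substituting c = A.T x (valid for orthonormal A) yields the same
    closed form, illustrating the equivalence stated in the abstract."""
    return A @ soft(A.T @ y, lam)
```

For an overcomplete (tall) A the substitution c = Aᵀx is no longer a bijection, which is exactly where the two priors begin to differ.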
Convergence rates and source conditions for Tikhonov regularization with sparsity constraints
, 2008
Abstract
Cited by 44 (15 self)
This paper addresses regularization by sparsity constraints by means of weighted ℓp penalties for 0 ≤ p ≤ 2. For 1 ≤ p ≤ 2, special attention is paid to convergence rates in norm and to source conditions. As the main results it is proven that one gets a convergence rate of √δ in the 2-norm for 1 < p ≤ 2, and in the 1-norm for p = 1, as soon as the unknown solution is sparse. The case p = 1 needs a special technique in which not only Bregman distances but also a so-called Bregman-Taylor distance has to be employed. For p < 1 only preliminary results are shown. These results indicate that, in contrast to the case p ≥ 1, the regularizing properties depend on the interplay of the operator and the basis of sparsity. A counterexample for p = 0 shows that regularization need not occur.
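The p = 1 instance of the Tikhonov functional discussed above, min 0.5‖Kx − y‖² + α‖x‖₁, is the one most commonly solved in practice, e.g. by iterative soft thresholding. The sketch below is a generic ISTA implementation under our own choice of step size, not an algorithm from this paper (which is about convergence rates, not solvers):

```python
import numpy as np

def ista(K, y, alpha, n_iter=500):
    """Iterative soft thresholding for the p = 1 penalized problem
    min_x 0.5||K x - y||^2 + alpha ||x||_1, with the standard step
    size 1/||K||^2 (spectral norm)."""
    step = 1.0 / np.linalg.norm(K, 2) ** 2
    x = np.zeros(K.shape[1])
    for _ in range(n_iter):
        v = x - step * (K.T @ (K @ x - y))               # gradient step
        x = np.sign(v) * np.maximum(np.abs(v) - step * alpha, 0.0)
    return x
```

The minimizers produced this way are sparse for suitable α, which is the regime in which the √δ convergence rates above apply.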
Nested iterative algorithms for convex constrained image recovery problems
 IEEE Journal of Selected Topics in Signal Processing
, 2007
Abstract
Cited by 30 (8 self)
The objective of this paper is to develop methods for solving image recovery problems subject to constraints on the solution. More precisely, we will be interested in problems which can be formulated as the minimization, over a closed convex constraint set, of the sum of two convex functions f and g, where f may be nonsmooth and g is differentiable with a Lipschitz-continuous gradient. To reach this goal, we derive two types of algorithms that combine forward-backward and Douglas-Rachford iterations. The weak convergence of the proposed algorithms is proved. In the case when the Lipschitz-continuity property of the gradient of g is not satisfied, we also show that, under some assumptions, it remains possible to apply these methods to the considered optimization problem by making use of a quadratic extension technique. The effectiveness of the algorithms is demonstrated for two wavelet-based image restoration problems involving a signal-dependent Gaussian noise and a Poisson noise, respectively.
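One of the two building blocks combined in this paper is the Douglas-Rachford iteration, sketched here in its plain form (relaxation parameter 1, γ = 1); the toy constrained-denoising problem and function names are our own illustrative choices, not the paper's nested scheme:

```python
import numpy as np

def douglas_rachford(prox_f, prox_g, z0, n_iter=100):
    """Plain Douglas-Rachford iteration for min f + g:
    x = prox_g(z); z <- z + prox_f(2x - z) - x; return prox_g(z)."""
    z = z0.copy()
    for _ in range(n_iter):
        x = prox_g(z)
        z = z + prox_f(2.0 * x - z) - x
    return prox_g(z)

# Toy constrained denoising: f = indicator of the nonnegative orthant
# (prox_f is the projection), g = 0.5||x - y||^2 with gamma = 1, whose
# prox is the simple average prox_g(v) = (v + y) / 2.
y = np.array([1.0, -2.0])
x = douglas_rachford(lambda v: np.maximum(v, 0.0),
                     lambda v: (v + y) / 2.0,
                     np.zeros(2))
```

Unlike forward-backward, Douglas-Rachford never differentiates g, which is why combining the two schemes lets the paper handle a nonsmooth f, a smooth g, and a convex constraint set simultaneously.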