Results 1–10 of 55
A Douglas–Rachford type primal–dual method for solving inclusions with mixtures of composite and parallel-sum type monotone operators
, 2013
"... ..."
A Direct Algorithm for 1D Total Variation Denoising
"... Abstract—A very fast noniterative algorithm is proposed for denoising or smoothing onedimensional discrete signals, by solving the total variation regularized leastsquares problem or the related fused lasso problem. A C code implementation is available on the web page of the author. Index Terms—To ..."
Abstract

Cited by 16 (0 self)
Abstract—A very fast non-iterative algorithm is proposed for denoising or smoothing one-dimensional discrete signals, by solving the total variation regularized least-squares problem or the related fused lasso problem. A C code implementation is available on the web page of the author. Index Terms—Total variation, denoising, nonlinear smoothing, fused lasso, regularized least-squares, nonparametric regression, convex nonsmooth optimization, taut string. (hal-00675043, version 4, 11 Aug 2013)
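The paper's direct taut-string algorithm is intricate; as a rough point of comparison, the same TV-regularized least-squares objective can be minimized with a simple projected-gradient iteration on the dual problem. The sketch below is illustrative only (it is not the paper's non-iterative method, and the function name is made up here):

```python
def tv_denoise(y, lam, iters=2000, step=0.25):
    """Minimize 0.5*||x - y||^2 + lam * sum_i |x[i+1] - x[i]|
    by projected gradient on the dual; step <= 0.5 suffices since the
    dual Hessian D D^T has norm at most 4."""
    n = len(y)
    u = [0.0] * (n - 1)  # one dual variable per first difference

    def primal(u):
        # x = y - D^T u, where (D^T u)[i] = u[i-1] - u[i] (zero at the ends)
        return [y[i] - (u[i - 1] if i > 0 else 0.0)
                     + (u[i] if i < n - 1 else 0.0) for i in range(n)]

    for _ in range(iters):
        x = primal(u)
        # gradient ascent on the differences, then clip to [-lam, lam]
        u = [max(-lam, min(lam, u[i] + step * (x[i + 1] - x[i])))
             for i in range(n - 1)]
    return primal(u)
```

With a very large `lam` the output flattens to the mean of the signal, and with `lam = 0` it returns the input unchanged, both of which match the exact TV solution.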
On the convergence rate improvement of a primal–dual splitting algorithm for solving monotone inclusion problems
, 2013
"... We present two modified versions of the primaldual splitting algorithm relying on forwardbackward splitting proposed in [21] for solving monotone inclusion problems. Under strong monotonicity assumptions for some of the operators involved we obtain for the sequences of iterates that approach the ..."
Abstract

Cited by 13 (6 self)
We present two modified versions of the primal–dual splitting algorithm relying on forward–backward splitting proposed in [21] for solving monotone inclusion problems. Under strong monotonicity assumptions for some of the operators involved, we obtain for the sequences of iterates that approach the solution orders of convergence of O(1/n) and O(ω^n), for ω ∈ (0, 1), respectively. The investigated primal–dual algorithms are fully decomposable, in the sense that the operators are processed individually at each iteration. We also discuss the modified algorithms in the context of convex optimization problems and present numerical experiments in image processing and support vector machines classification.
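The O(ω^n) regime above arises from strong monotonicity. A minimal sketch (not the paper's modified primal–dual schemes, just the underlying forward–backward step on a toy strongly convex problem) shows the linear contraction:

```python
def soft_threshold(v, t):
    # proximal operator of t*|.|
    return v - t if v > t else v + t if v < -t else 0.0

def forward_backward(b, lam, step=0.5, iters=100):
    """Solve min_x 0.5*(x - b)**2 + lam*|x|.
    The smooth term is 1-strongly convex, so the iterates contract
    linearly toward the closed-form solution soft_threshold(b, lam)."""
    x = 0.0
    for _ in range(iters):
        # forward (gradient) step on the smooth term, backward (prox) step on |.|
        x = soft_threshold(x - step * (x - b), step * lam)
    return x
```

With `step = 0.5` each iteration halves the error, i.e. ω = 1/2 in the O(ω^n) notation.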
An inertial forward–backward–forward primal–dual splitting algorithm for solving monotone inclusion problems
, 2015
"... We introduce and investigate the convergence properties of an inertial forwardbackwardforward splitting algorithm for approaching the set of zeros of the sum of a maximally monotone operator and a singlevalued monotone and Lipschitzian operator. By making use of the product space approach, we ex ..."
Abstract

Cited by 8 (1 self)
We introduce and investigate the convergence properties of an inertial forward–backward–forward splitting algorithm for approaching the set of zeros of the sum of a maximally monotone operator and a single-valued monotone and Lipschitzian operator. By making use of the product space approach, we extend it to the solving of inclusion problems involving mixtures of linearly composed and parallel-sum type monotone operators. We obtain in this way an inertial forward–backward–forward primal–dual splitting algorithm whose main characteristic is that, in the iterative scheme, all operators are accessed separately, either via forward or via backward evaluations. We also present the variational case, in which one is interested in solving a primal–dual pair of convex optimization problems with complexly structured objectives, and illustrate it by numerical experiments in image processing.
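The forward–backward–forward pattern with inertia can be pictured on a scalar toy inclusion 0 ∈ A(x) + B(x) with A = λ∂|·| and B(x) = x − b; this is a hedged sketch with made-up parameter values, not the paper's general product-space scheme:

```python
def resolvent_l1(v, t):
    # resolvent of t * d|.|, i.e. soft-thresholding with threshold t
    return v - t if v > t else v + t if v < -t else 0.0

def inertial_fbf(b, lam, gamma=0.5, alpha=0.2, iters=500):
    """Zero of A + B, with A = lam*d|.| (maximally monotone, accessed via
    its resolvent) and B(x) = x - b (single-valued, monotone, 1-Lipschitz,
    accessed via forward evaluations only)."""
    x_prev = x = 0.0
    for _ in range(iters):
        w = x + alpha * (x - x_prev)                    # inertial extrapolation
        Bw = w - b                                      # first forward step
        p = resolvent_l1(w - gamma * Bw, gamma * lam)   # backward (resolvent) step
        Bp = p - b                                      # second forward step
        x_prev, x = x, p + gamma * (Bw - Bp)            # Tseng-style correction
    return x
```

The zero of A + B here is the soft-thresholded value `resolvent_l1(b, lam)`, which the iterates approach linearly for the chosen step size γ < 1/L.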
Simultaneous low-pass filtering and total variation denoising
 IEEE TRANS. SIGNAL PROCESS
, 2014
"... This paper seeks to combine linear timeinvariant (LTI) filtering and sparsitybased denoising in a principled way in order to effectively filter (denoise) a wider class of signals. LTI filtering is most suitable for signals restricted to a known frequency band, while sparsitybased denoising is su ..."
Abstract

Cited by 6 (4 self)
This paper seeks to combine linear time-invariant (LTI) filtering and sparsity-based denoising in a principled way in order to effectively filter (denoise) a wider class of signals. LTI filtering is most suitable for signals restricted to a known frequency band, while sparsity-based denoising is suitable for signals admitting a sparse representation with respect to a known transform. However, some signals cannot be accurately categorized as either band-limited or sparse. This paper addresses the problem of filtering noisy data for the particular case where the underlying signal comprises a low-frequency component and a sparse or sparse-derivative component. A convex optimization approach is presented and two algorithms derived: one based on majorization–minimization (MM), and the other based on the alternating direction method of multipliers (ADMM). It is shown that a particular choice of discrete-time filter, namely zero-phase non-causal recursive filters for finite-length data formulated in terms of banded matrices, makes the algorithms effective and computationally efficient. The efficiency stems from the use of fast algorithms for solving banded systems of linear equations. The method is illustrated using data from a physiological-measurement technique (i.e., near infrared spectroscopic time series imaging) that in many cases yields data that is well approximated as the sum of low-frequency, sparse or sparse-derivative, and noise components.
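The claimed efficiency rests on fast banded solvers. The simplest instance is the tridiagonal case, solvable in O(n) by the classical Thomas algorithm; the sketch below is a standard textbook routine, not code from the paper:

```python
def solve_tridiag(a, b, c, d):
    """Solve a tridiagonal system via the Thomas algorithm (O(n)).
    a: sub-diagonal (a[0] unused), b: main diagonal, c: super-diagonal
    (c[-1] unused), d: right-hand side. Assumes no pivoting is needed."""
    n = len(b)
    cp = [0.0] * n
    dp = [0.0] * n
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):                      # forward elimination
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):             # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x
```

Production code would instead call a library banded solver (e.g. LAPACK's tridiagonal/banded routines), which handle pivoting and wider bands.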
Solving systems of monotone inclusions via primal–dual splitting techniques
, 2013
"... In this paper we propose an algorithm for solving systems of coupled monotone inclusions in Hilbert spaces. The operators arising in each of the inclusions of the system are processed in each iteration separately, namely, the singlevalued are evaluated explicitly (forward steps), while the setva ..."
Abstract

Cited by 5 (0 self)
In this paper we propose an algorithm for solving systems of coupled monotone inclusions in Hilbert spaces. The operators arising in each of the inclusions of the system are processed in each iteration separately: the single-valued ones are evaluated explicitly (forward steps), while the set-valued ones are evaluated via their resolvents (backward steps). In addition, most of the steps in the iterative scheme can be executed simultaneously, thus making the method applicable to a variety of convex minimization problems. The numerical performance of the proposed splitting algorithm is emphasized through applications in average consensus on colored networks and image classification via support vector machines.
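The average-consensus application can be pictured with a plain synchronous averaging iteration on a graph; this is a standard illustration of the problem being solved, not the paper's splitting scheme:

```python
def consensus(x0, neighbors, alpha=0.25, iters=100):
    """Synchronous average consensus: each node nudges its value toward
    its neighbors'. For a connected graph and small enough alpha the
    values converge to the global average of the initial data."""
    x = list(x0)
    for _ in range(iters):
        x = [xi + alpha * sum(x[j] - xi for j in neighbors[i])
             for i, xi in enumerate(x)]
    return x
```

On a 4-node ring the update matrix has eigenvalues in [0, 1), so all node values converge geometrically to the mean of the initial values.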
Epigraphical projection and proximal tools for solving constrained convex optimization problems
 Part I, pp. x+24, 2012, submitted. Preprint: http://arxiv.org/pdf/1210.5844
"... We propose a proximal approach to deal with convex optimization problems involving nonlinear constraints. A large family of such constraints, proven to be effective in the solution of inverse problems, can be expressed as the lower level set of a sum of convex functions evaluated over different, but ..."
Abstract

Cited by 5 (2 self)
We propose a proximal approach to deal with convex optimization problems involving nonlinear constraints. A large family of such constraints, proven to be effective in the solution of inverse problems, can be expressed as the lower level set of a sum of convex functions evaluated over different, but possibly overlapping, blocks of the signal. For this class of constraints, the associated projection operator generally does not have a closed form. We circumvent this difficulty by splitting the lower level set into as many epigraphs as functions involved in the sum. A closed half-space constraint is also enforced, in order to limit the sum of the introduced epigraphical variables to the upper bound of the original lower level set. In this paper, we focus on a family of constraints involving linear transforms of ℓ1,p balls. Our main theoretical contribution is to provide closed-form expressions of the epigraphical projections associated with the Euclidean norm (p = 2) and the sup-norm (p = +∞). The proposed approach is validated in the context of image restoration with missing samples, by making use of TV-like constraints. Experiments show that our method leads to significant improvements in terms of convergence speed over existing algorithms for solving similar constrained problems.
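For p = 2 the epigraph of the Euclidean norm is the second-order cone, whose projection has a well-known closed form; the sketch below implements that classical formula (it does not reproduce the paper's more general block-wise construction):

```python
import math

def proj_epi_l2(v, s):
    """Project the point (v, s) onto {(x, t) : ||x||_2 <= t},
    the epigraph of the Euclidean norm (second-order cone)."""
    nv = math.sqrt(sum(vi * vi for vi in v))
    if nv <= s:
        return list(v), s            # already inside the epigraph
    if nv <= -s:
        return [0.0] * len(v), 0.0   # projects onto the apex of the cone
    a = 0.5 * (1.0 + s / nv)         # shrink factor from the closed form
    return [a * vi for vi in v], a * nv
```

The boundary case is easy to check: projecting ([3, 4], 0) lands exactly on the cone, with ||x|| = t = 2.5.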
A forward–backward view of some primal–dual optimization methods in image recovery
 IN PROC. INT. CONF. IMAGE PROCESS
, 2014
"... A wide array of image recovery problems can be abstracted into the problem of minimizing a sum of composite convex functions in a Hilbert space. To solve such problems, primaldual proximal approaches have been developed which provide efficient solutions to largescale optimization problems. The obj ..."
Abstract

Cited by 5 (3 self)
A wide array of image recovery problems can be abstracted into the problem of minimizing a sum of composite convex functions in a Hilbert space. To solve such problems, primal–dual proximal approaches have been developed which provide efficient solutions to large-scale optimization problems. The objective of this paper is to show that a number of existing algorithms can be derived from a general form of the forward–backward algorithm applied in a suitable product space. Our approach also allows us to develop useful extensions of existing algorithms by introducing a variable metric. An illustration to image restoration is provided.
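One concrete member of this family of primal–dual proximal schemes is the Chambolle–Pock-type iteration; the scalar toy instance below illustrates the alternating prox/extrapolation pattern (whether it matches the paper's exact product-space parameterization is not claimed):

```python
def primal_dual(b, lam, tau=0.9, sigma=0.9, iters=500):
    """Solve min_x 0.5*(x - b)**2 + lam*|x|, written as f(x) + g(Lx)
    with f = 0.5*(.-b)^2, g = lam*|.|, L = identity.
    Step sizes must satisfy tau*sigma*||L||^2 < 1."""
    x = x_bar = u = 0.0
    for _ in range(iters):
        # dual prox: prox of the conjugate of lam*|.| is projection onto [-lam, lam]
        u = max(-lam, min(lam, u + sigma * x_bar))
        # primal prox of tau*f in closed form
        x_new = (x - tau * u + tau * b) / (1.0 + tau)
        x_bar = 2.0 * x_new - x          # over-relaxed extrapolation
        x = x_new
    return x
```

The minimizer has the closed form of soft-thresholding `b` at level `lam`, e.g. `primal_dual(3.0, 1.0)` tends to 2.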
Convergence rates with inexact nonexpansive operators
"... In this paper, we present a convergence rate analysis for the inexact Krasnosel’skĭı– Mann iteration built from nonexpansive operators. Our results include two main parts: we first establish global pointwise and ergodic iteration–complexity bounds, and then, under a metric subregularity assumption, ..."
Abstract

Cited by 5 (1 self)
In this paper, we present a convergence rate analysis for the inexact Krasnosel’skĭı–Mann iteration built from nonexpansive operators. Our results include two main parts: we first establish global pointwise and ergodic iteration-complexity bounds, and then, under a metric subregularity assumption, we establish local linear convergence for the distance of the iterates to the set of fixed points. The obtained iteration-complexity result can be applied to analyze the convergence rate of various monotone operator splitting methods in the literature, including the Forward–Backward, the Generalized Forward–Backward, Douglas–Rachford, ADMM and Primal–Dual splitting methods. For these methods, we also develop easily verifiable termination criteria for finding an approximate solution, which can be seen as a generalization of the termination criterion for the classical gradient descent method. We finally develop a parallel analysis for the nonstationary Krasnosel’skĭı–Mann iteration. The usefulness of our results is illustrated by applying them to a large class of structured monotone inclusion and convex optimization problems. Experiments on some large-scale inverse problems in signal and image processing are shown.
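The Krasnosel'skĭı–Mann iteration and the fixed-point-residual termination criterion it admits can be sketched in a few lines; the operator and tolerances below are illustrative choices, not taken from the paper:

```python
def km_iterate(T, x0, theta=0.5, tol=1e-8, max_iter=10000):
    """Krasnosel'skii-Mann iteration x <- (1-theta)*x + theta*T(x) for a
    nonexpansive T, stopped once the fixed-point residual |x - T(x)|
    falls below tol (a verifiable termination criterion of the kind
    the abstract describes). Returns the iterate and iteration count."""
    x = x0
    for k in range(max_iter):
        Tx = T(x)
        if abs(x - Tx) < tol:        # fixed-point residual as stopping rule
            return x, k
        x = (1.0 - theta) * x + theta * Tx
    return x, max_iter
```

For example, the gradient-descent map of a smooth convex function with a suitable step size is nonexpansive, and the residual test then generalizes the usual small-gradient stopping rule.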