Results 1–10 of 49
A first-order primal-dual algorithm for convex problems with applications to imaging
, 2010
"... In this paper we study a firstorder primaldual algorithm for convex optimization problems with known saddlepoint structure. We prove convergence to a saddlepoint with rate O(1/N) in finite dimensions, which is optimal for the complete class of nonsmooth problems we are considering in this paper ..."
Abstract

Cited by 435 (20 self)
 Add to MetaCart
In this paper we study a first-order primal-dual algorithm for convex optimization problems with known saddle-point structure. We prove convergence to a saddle-point with rate O(1/N) in finite dimensions, which is optimal for the complete class of nonsmooth problems we are considering in this paper. We further show accelerations of the proposed algorithm to yield optimal rates on easier problems. In particular we show that we can achieve O(1/N^2) convergence on problems where the primal or the dual objective is uniformly convex, and we can show linear convergence, i.e. O(1/e^N), on problems where both are uniformly convex. The wide applicability of the proposed algorithm is demonstrated on several imaging problems such as image denoising, image deconvolution, image inpainting, motion estimation and image segmentation.
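The primal-dual iteration summarized above can be sketched on the classic ROF (TV-denoising) problem, one of the imaging applications the abstract mentions. This is a minimal illustration under standard assumptions (forward-difference gradient with Neumann boundary, step sizes satisfying tau*sigma*||K||^2 <= 1 with ||K||^2 <= 8); the function names and toy setup are ours, not the paper's.

```python
import numpy as np

def grad(u):
    # forward differences with Neumann boundary conditions
    gx = np.zeros_like(u); gy = np.zeros_like(u)
    gx[:, :-1] = u[:, 1:] - u[:, :-1]
    gy[:-1, :] = u[1:, :] - u[:-1, :]
    return gx, gy

def div(px, py):
    # negative adjoint of grad (standard discrete divergence)
    dx = np.zeros_like(px); dy = np.zeros_like(py)
    dx[:, 0] = px[:, 0]; dx[:, 1:-1] = px[:, 1:-1] - px[:, :-2]; dx[:, -1] = -px[:, -2]
    dy[0, :] = py[0, :]; dy[1:-1, :] = py[1:-1, :] - py[:-2, :]; dy[-1, :] = -py[-2, :]
    return dx + dy

def tv_denoise_pdhg(f, lam=0.1, n_iter=200):
    # primal-dual iteration for min_u (1/2)||u - f||^2 + lam * TV(u)
    tau = sigma = 1.0 / np.sqrt(8.0)          # tau * sigma * ||grad||^2 <= 1
    u = f.copy(); u_bar = f.copy()
    px = np.zeros_like(f); py = np.zeros_like(f)
    for _ in range(n_iter):
        # dual ascent step, then projection onto the ball ||p|| <= lam
        gx, gy = grad(u_bar)
        px += sigma * gx; py += sigma * gy
        norm = np.maximum(1.0, np.sqrt(px**2 + py**2) / lam)
        px /= norm; py /= norm
        # primal step: prox of (1/2)||. - f||^2, then over-relaxation (theta = 1)
        u_old = u
        u = (u + tau * (div(px, py) + f)) / (1.0 + tau)
        u_bar = 2.0 * u - u_old
    return u
```

A constant image is a fixed point of the iteration (its TV is zero), which gives a quick sanity check.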
An introduction to total variation for image analysis
 in Theoretical Foundations and Numerical Methods for Sparse Recovery, De Gruyter
, 2010
"... These notes address various theoretical and practical topics related to Total Variationbased image reconstruction. They focuse first on some theoretical results on functions which minimize the total variation, and in a second part, describe a few standard and less standard algorithms to minimize th ..."
Abstract

Cited by 44 (3 self)
 Add to MetaCart
(Show Context)
These notes address various theoretical and practical topics related to Total Variation-based image reconstruction. They focus first on some theoretical results on functions which minimize the total variation, and, in a second part, describe a few standard and less standard algorithms to minimize the total variation in a finite-differences setting, with a series of applications ranging from simple denoising to stereo and deconvolution, and even more exotic uses such as the minimization of minimal partition problems.
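As a concrete instance of the finite-differences setting mentioned above, the isotropic discrete total variation of an image can be computed with forward differences. This sketch uses Neumann (replicate) boundary handling, one common convention among several; the function name is ours.

```python
import numpy as np

def total_variation(u):
    # isotropic discrete TV: sum over pixels of the Euclidean norm of
    # the forward-difference gradient, with Neumann boundary conditions
    gx = np.zeros_like(u, dtype=float)
    gy = np.zeros_like(u, dtype=float)
    gx[:, :-1] = u[:, 1:] - u[:, :-1]
    gy[:-1, :] = u[1:, :] - u[:-1, :]
    return np.sqrt(gx**2 + gy**2).sum()
```

A constant image has zero TV, and a unit step across one edge contributes exactly the length of that edge.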
GLOBAL SOLUTIONS OF VARIATIONAL MODELS WITH CONVEX REGULARIZATION
"... Abstract. We propose an algorithmic framework to compute global solutions of variational models with convex regularity terms that permit quite arbitrary data terms. While the minimization of variational problems with convex data and regularity terms is straight forward (using for example gradient de ..."
Abstract

Cited by 32 (9 self)
 Add to MetaCart
(Show Context)
Abstract. We propose an algorithmic framework to compute global solutions of variational models with convex regularity terms that permit quite arbitrary data terms. While the minimization of variational problems with convex data and regularity terms is straightforward (using for example gradient descent), this is no longer trivial for functionals with nonconvex data terms. Using the theoretical framework of calibrations, the original variational problem can be written as the maximum flux of a particular vector field going through the boundary of the subgraph of the unknown function. Upon relaxation this formulation turns the problem into a convex problem, albeit in a higher-dimensional space. In order to solve this problem, we propose a fast primal-dual algorithm which significantly outperforms existing algorithms. In experimental results we show the application of our method to outlier filtering of range images and disparity estimation in stereo images using a variety of convex regularity terms. Key words. Variational methods, calibrations, total variation, convex optimization. AMS subject classifications. 49M20, 49M29, 65K15, 68U10.
Convergence analysis of primal-dual algorithms for total variation image restoration
, 2010
"... Abstract. Recently, some attractive primaldual algorithms have been proposed for solving a saddlepoint problem, with particular applications in the area of total variation (TV) image restoration. This paper focuses on the convergence analysis of existing primaldual algorithms and shows that the i ..."
Abstract

Cited by 29 (2 self)
 Add to MetaCart
(Show Context)
Abstract. Recently, some attractive primal-dual algorithms have been proposed for solving a saddle-point problem, with particular applications in the area of total variation (TV) image restoration. This paper focuses on the convergence analysis of existing primal-dual algorithms and shows that the involved parameters of those primal-dual algorithms (including the step sizes) can be significantly enlarged if some simple correction steps are supplemented. As a result, we present some primal-dual-based contraction methods for solving the saddle-point problem. These contraction methods are of the prediction-correction fashion in the sense that the predictor is generated by a primal-dual method and it is corrected by some simple correction step at each iteration. In addition, based on the context of contraction-type methods, we provide a novel theoretical framework for analyzing the convergence of primal-dual algorithms which simplifies existing convergence analysis substantially.
Continuous Multiclass Labeling Approaches and Algorithms
 SIAM J. Imag. Sci
, 2011
"... We study convex relaxations of the image labeling problem on a continuous domain with regularizers based on metric interaction potentials. The generic framework ensures existence of minimizers and covers a wide range of relaxations of the originally combinatorial problem. We focus on two specific r ..."
Abstract

Cited by 27 (4 self)
 Add to MetaCart
(Show Context)
We study convex relaxations of the image labeling problem on a continuous domain with regularizers based on metric interaction potentials. The generic framework ensures existence of minimizers and covers a wide range of relaxations of the originally combinatorial problem. We focus on two specific relaxations that differ in flexibility and simplicity – one can be used to tightly relax any metric interaction potential, while the other one only covers Euclidean metrics but requires less computational effort. For solving the nonsmooth discretized problem, we propose a globally convergent Douglas-Rachford scheme, and show that a sequence of dual iterates can be recovered in order to provide a posteriori optimality bounds. In a quantitative comparison to two other first-order methods, the approach shows competitive performance on synthetic and real-world images. By combining the method with an improved binarization technique for nonstandard potentials, we were able to routinely recover discrete solutions within 1%–5% of the global optimum for the combinatorial image labeling problem.

1 Problem Formulation

The multiclass image labeling problem consists in finding, for each pixel x in the image domain Ω ⊆ R^d, a label ℓ(x) ∈ {1, ..., l} which assigns one of l class labels to x so that the labeling function ℓ adheres to some local data fidelity as well as nonlocal spatial coherency constraints. This problem class occurs in many applications, such as segmentation, multiview reconstruction, stitching, and inpainting [PCF06]. We consider the variational formulation

$$\inf_{\ell : \Omega \to \{1,\dots,l\}} f(\ell), \qquad f(\ell) := \underbrace{\int_\Omega s(x, \ell(x))\,dx}_{\text{data term}} + \underbrace{J(\ell)}_{\text{regularizer}}.$$
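Convex relaxations of this kind typically replace the discrete label set with the unit simplex at each pixel, so Euclidean projection onto the simplex becomes a core primitive of any first-order solver. The following sort-and-threshold projection is a standard technique, shown here as an illustration rather than as the authors' specific scheme.

```python
import numpy as np

def project_simplex(v):
    # Euclidean projection of v onto {x : x >= 0, sum(x) = 1}
    # via the classical sort-and-threshold method
    u = np.sort(v)[::-1]                      # sort in decreasing order
    css = np.cumsum(u)
    k = np.arange(1, len(v) + 1)
    rho = np.nonzero(u + (1.0 - css) / k > 0)[0][-1]
    theta = (css[rho] - 1.0) / (rho + 1.0)    # shift that makes the result feasible
    return np.maximum(v - theta, 0.0)
```

A point already on the simplex is left unchanged, and any input comes back nonnegative with unit sum.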
Fast and exact primaldual iterations for variational problems in computer vision
 In Computer Vision–ECCV 2010
, 2010
"... Abstract. The saddle point framework provides a convenient way to formulate many convex variational problems that occur in computer vision. The framework unifies a broad range of data and regularization terms, and is particularly suited for nonsmooth problems such as Total Variationbased approach ..."
Abstract

Cited by 9 (1 self)
 Add to MetaCart
(Show Context)
Abstract. The saddle point framework provides a convenient way to formulate many convex variational problems that occur in computer vision. The framework unifies a broad range of data and regularization terms, and is particularly suited for nonsmooth problems such as Total Variation-based approaches to image labeling. However, for many interesting problems the constraint sets involved are difficult to handle numerically. State-of-the-art methods rely on using nested iterative projections, which induces both theoretical and practical convergence issues. We present a dual multiple-constraint Douglas-Rachford splitting approach that is globally convergent, avoids inner iterative loops, enforces the constraints exactly, and requires only basic operations that can be easily parallelized. The method outperforms existing methods by a factor of 4–20 while considerably increasing the numerical robustness.
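A Douglas-Rachford iteration of the kind referenced above has the generic two-prox form below. The toy feasibility problem (finding a point in the intersection of two intervals, using the fact that the prox of an indicator function is the projection onto the set) is our own illustration, not the paper's multiple-constraint construction.

```python
import numpy as np

def douglas_rachford(prox_f, prox_g, z0, n_iter=100):
    # generic Douglas-Rachford splitting for min f(x) + g(x):
    #   x_k     = prox_f(z_k)
    #   z_{k+1} = z_k + prox_g(2 x_k - z_k) - x_k
    z = z0
    for _ in range(n_iter):
        x = prox_f(z)
        z = z + prox_g(2.0 * x - z) - x
    return prox_f(z)   # the "shadow" sequence prox_f(z_k) converges to a solution
```

For example, with prox_f and prox_g the projections onto [0, 2] and [1, 3], the iteration converges to a point in the intersection [1, 2] from any starting point.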
A convex approach for variational super-resolution
 In Proceedings German Association for Pattern Recognition (DAGM), volume 6376 of LNCS
, 2010
"... Abstract. We propose a convex variational framework to compute high resolution images from a low resolution video. The image formation process is analyzed to provide to a well designed model for warping, blurring, downsampling and regularization. We provide a comprehensive investigation of the singl ..."
Abstract

Cited by 8 (2 self)
 Add to MetaCart
(Show Context)
Abstract. We propose a convex variational framework to compute high-resolution images from a low-resolution video. The image formation process is analyzed to provide a well-designed model for warping, blurring, downsampling and regularization. We provide a comprehensive investigation of the single model components. The super-resolution problem is modeled as a minimization problem in a unified convex framework, which is solved by a fast primal-dual algorithm. A comprehensive evaluation of the influence of different kinds of noise is carried out. The proposed algorithm shows excellent recovery of information for various real and synthetic datasets.
COMBINATORIAL CONTINUOUS MAXIMUM FLOW
, 2011
"... Maximum flow (and minimumcut) algorithms have had a strong impact on computer vision. In particular, graph cuts algorithms provide a mechanism for the discrete optimization of an energy functional which has been used in a variety of applications such as image segmentation, stereo, image stitching an ..."
Abstract

Cited by 7 (2 self)
 Add to MetaCart
(Show Context)
Maximum-flow (and minimum-cut) algorithms have had a strong impact on computer vision. In particular, graph cuts algorithms provide a mechanism for the discrete optimization of an energy functional which has been used in a variety of applications such as image segmentation, stereo, image stitching and texture synthesis. Algorithms based on the classical formulation of max-flow defined on a graph are known to exhibit metrication artefacts in the solution. Therefore, a recent trend has been to instead employ a spatially continuous maximum flow (or the dual min-cut problem) in these same applications to produce solutions with no metrication errors. However, known fast continuous max-flow algorithms have no stopping criteria or have not been proved to converge. In this work, we revisit the continuous max-flow problem and show that the analogous discrete formulation is different from the classical max-flow problem. We then apply an appropriate combinatorial optimization technique to this combinatorial continuous max-flow (CCMF) problem to find a null-divergence solution that exhibits no metrication artefacts and may be solved exactly by a fast, efficient algorithm with provable convergence. Finally, by exhibiting the dual problem of our CCMF formulation, we clarify the fact, already proved by Nozawa in the continuous setting, that the max-flow and the total variation problems are not always equivalent.
A TWO-STAGE IMAGE SEGMENTATION METHOD USING A CONVEX VARIANT OF THE MUMFORD-SHAH MODEL AND THRESHOLDING
"... Abstract. The MumfordShah model is one of the most important image segmentation models, and has been studied extensively in the last twenty years. In this paper, we propose a twostage segmentation method based on the MumfordShah model. The first stage of our method is to find a smooth solution g ..."
Abstract

Cited by 4 (2 self)
 Add to MetaCart
(Show Context)
Abstract. The Mumford-Shah model is one of the most important image segmentation models, and has been studied extensively in the last twenty years. In this paper, we propose a two-stage segmentation method based on the Mumford-Shah model. The first stage of our method is to find a smooth solution g to a convex variant of the Mumford-Shah model. Once g is obtained, then in the second stage the segmentation is done by thresholding g into different phases. The thresholds can be given by the users or can be obtained automatically using any clustering method. Because of the convexity of the model, g can be solved efficiently by techniques like the split-Bregman algorithm or the Chambolle-Pock method. We prove that our method is convergent and that the solution g is always unique. In our method, there is no need to specify the number of segments K (K ≥ 2) before finding g. We can obtain any K-phase segmentation by choosing (K − 1) thresholds after g is found in the first stage; and in the second stage there is no need to recompute g if the thresholds are changed to reveal different segmentation features in the image. Experimental results show that our two-stage method performs better than many standard two-phase or multiphase segmentation methods for very general images, including anti-mass, tubular, MRI, noisy, and blurry images. Key words. Image segmentation, Mumford-Shah model, split-Bregman, total variation. AMS subject classifications. 52A41, 65D15, 68W40, 90C25, 90C90
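The second (thresholding) stage described above is simple to sketch: given the smooth solution g, a K-phase segmentation follows from K − 1 thresholds, and changing the thresholds never requires recomputing g. The function name below is ours, and the thresholds are supplied by hand rather than by a clustering method.

```python
import numpy as np

def threshold_segmentation(g, thresholds):
    # partition the smooth solution g into K = len(thresholds) + 1 phases
    # by assigning each pixel the index of the interval its value falls in
    return np.digitize(g, np.sort(np.asarray(thresholds, dtype=float)))
```

The same g yields a two-phase or three-phase segmentation depending only on how many thresholds are passed.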
An inertial forward-backward algorithm for monotone inclusions
 J. Math. Imaging Vis
, 2014
"... In this paper, we propose a new accelerated forward backward splitting algorithm to compute a zero of the sum of two monotone operators, with one of the two operators being cocoercive. The algorithm is inspired by the accelerated gradient method of Nesterov, but can be applied to a much larger clas ..."
Abstract

Cited by 3 (1 self)
 Add to MetaCart
(Show Context)
In this paper, we propose a new accelerated forward-backward splitting algorithm to compute a zero of the sum of two monotone operators, with one of the two operators being cocoercive. The algorithm is inspired by the accelerated gradient method of Nesterov, but can be applied to a much larger class of problems including convex-concave saddle point problems and general monotone inclusions. We prove convergence of the algorithm in a Hilbert space setting and show that several recently proposed first-order methods can be obtained as special cases of the general algorithm. Numerical results show that the proposed algorithm converges faster than existing methods, while keeping the computational cost of each iteration basically unchanged.
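In the simplest (convex optimization) instance of such an inclusion, the scheme reduces to a forward-backward step taken at an extrapolated point. The sketch below, applied to a tiny l1-regularized least-squares problem with a fixed inertia parameter, is our illustration of that special case, not the paper's general operator setting.

```python
import numpy as np

def inertial_forward_backward(grad_f, prox_g, x0, step, alpha=0.3, n_iter=200):
    # y_k     = x_k + alpha * (x_k - x_{k-1})      (inertial extrapolation)
    # x_{k+1} = prox_g(y_k - step * grad_f(y_k))   (forward-backward step)
    x_prev = np.array(x0, dtype=float)
    x = x_prev.copy()
    for _ in range(n_iter):
        y = x + alpha * (x - x_prev)
        x_prev = x
        x = prox_g(y - step * grad_f(y))
    return x

def soft_threshold(v, t):
    # prox of t * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)
```

With f(x) = (1/2)||x − b||^2 and g = lam * ||.||_1, grad_f(x) = x − b and the prox is soft-thresholding at step * lam, so the minimizer is soft_threshold(b, lam).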