### An inertial forward-backward algorithm for the minimization of the sum of two nonconvex functions

2015

Abstract
Abstract. We propose a forward-backward proximal-type algorithm with inertial/memory effects for minimizing the sum of a nonsmooth function and a smooth one in the nonconvex setting. Every sequence of iterates generated by the algorithm converges to a critical point of the objective function provided an appropriate regularization of the objective satisfies the Kurdyka–Łojasiewicz inequality, which is for instance fulfilled for semi-algebraic functions. We illustrate the theoretical results with two numerical experiments: the first concerns the ability to recover locally optimal solutions of nonconvex optimization problems, while the second concerns the restoration of a noisy blurred image.
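The abstract does not spell out the iteration, but a minimal sketch of a one-step inertial forward-backward scheme could look as follows. The soft-thresholding prox (for an ℓ1 nonsmooth part) and the convex quadratic smooth term are illustrative assumptions chosen only so the answer is checkable in closed form, not the paper's test problems:

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal map of t*||.||_1, used here as an illustrative nonsmooth part.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def inertial_forward_backward(grad_f, prox_g, x0, alpha=0.1, beta=0.3, iters=2000):
    """Sketch of an inertial forward-backward step:
       x_{k+1} = prox_{alpha*g}(x_k - alpha*grad_f(x_k) + beta*(x_k - x_{k-1}))."""
    x_prev = x0.copy()
    x = x0.copy()
    for _ in range(iters):
        y = x - alpha * grad_f(x) + beta * (x - x_prev)  # gradient + inertial term
        x_prev, x = x, prox_g(y, alpha)                   # backward (proximal) step
    return x

# Toy problem: min 0.5*||x - c||^2 + ||x||_1; its minimizer is soft_threshold(c, 1).
c = np.array([2.0, -0.5, 1.0])
x = inertial_forward_backward(lambda x: x - c, soft_threshold, np.zeros(3))
```

With beta = 0 this reduces to the plain forward-backward (proximal gradient) method; the inertial term reuses the previous iterate as memory.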

### On damped second-order gradient systems

2014

Abstract
On damped second-order gradient systems. Pascal Bégout, Jérôme Bolte and Mohamed Ali Jendoubi. Using small deformations of the total energy, as introduced in [28], we establish that damped second-order gradient systems, u''(t) + γu'(t) + ∇G(u(t)) = 0, may be viewed as quasi-gradient systems. In order to study the asymptotic behavior of these systems, we prove that any (non-trivial) desingularizing function ϕ appearing in the KL inequality satisfies ϕ(s) ≳ √s whenever the original function is definable and C². Variants of this result are given. These facts are used in turn to prove that a desingularizing function of the potential G also desingularizes the total energy and its deformed versions. Our approach brings forward several results of independent interest: we provide an asymptotic alternative for quasi-gradient systems: either a trajectory converges, or its norm tends to infinity. The convergence rates are also analyzed by an original method based on a one-dimensional worst-case gradient system. We conclude by establishing the convergence of damped second-order systems in various cases, including the definable case. The real-analytic case is recovered, and some results concerning convex functions are also derived.
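As a numerical illustration (not from the paper), the damped system u'' + γu' + ∇G(u) = 0 can be integrated with a simple semi-implicit Euler scheme; for a real-analytic (hence definable) double-well potential the trajectory settles at a critical point, in line with the convergence results above. The potential and all parameters below are illustrative choices:

```python
def simulate_damped_system(grad_G, u0, gamma=1.0, dt=0.01, steps=20000):
    # Semi-implicit Euler for u'' + gamma*u' + grad_G(u) = 0:
    # update the velocity first, then the position with the new velocity.
    u, v = u0, 0.0
    for _ in range(steps):
        v += dt * (-gamma * v - grad_G(u))
        u += dt * v
    return u

# Double-well potential G(u) = (u^2 - 1)^2 / 4, with gradient u^3 - u.
grad_G = lambda u: u**3 - u
u_final = simulate_damped_system(grad_G, u0=0.5)
```

Starting at u = 0.5 with zero velocity, the total energy is below the barrier at u = 0 and decays along the flow, so the trajectory converges to the nearby minimizer u = 1.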

### The final publication is available at link.springer.com. Discrete Green’s Functions for Harmonic and Biharmonic Inpainting with Sparse Atoms

Abstract
Abstract. Recent research has shown that inpainting with the Laplace or biharmonic operator has a high potential for image compression if the stored data is optimised and sufficiently sparse. The goal of our paper is to connect these linear inpainting methods to sparsity concepts. To understand these relations, we explore the theory of Green’s functions. In contrast to most work in the mathematical literature, we derive our Green’s functions in a discrete setting and on a rectangular image domain with homogeneous Neumann boundary conditions. These discrete Green’s functions can be interpreted as columns of the Moore–Penrose inverse of the discretised differential operator. More importantly, they serve as atoms in a dictionary that allows a sparse representation of the inpainting solution. Apart from offering novel theoretical insights, this representation is also simple to implement and computationally efficient if the inpainting data is sparse.
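The construction can be sketched in one dimension (the paper works on 2-D rectangular domains; the 1-D setting below is a simplified illustration). The discrete Neumann Laplacian is singular with the constants as its nullspace, and the columns of its Moore–Penrose inverse play the role of the Green's-function atoms:

```python
import numpy as np

def neumann_laplacian(n):
    # 1-D discrete Laplacian with homogeneous Neumann boundary conditions:
    # interior stencil [1, -2, 1], boundary rows reduced to [-1, 1] / [1, -1].
    L = np.zeros((n, n))
    for i in range(n):
        L[i, i] = -2.0
        if i > 0:
            L[i, i - 1] = 1.0
        if i < n - 1:
            L[i, i + 1] = 1.0
    L[0, 0] = -1.0
    L[-1, -1] = -1.0
    return L

n = 8
L = neumann_laplacian(n)
G = np.linalg.pinv(L)  # columns of G are the discrete Green's functions (atoms)
```

Since the nullspace of L consists of constants, applying L to the k-th atom gives the unit impulse e_k minus its mean, i.e. L G = I - (1/n)·11ᵀ, which is the discrete analogue of the Green's-function property on a Neumann domain.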

### Minimization of Non-smooth, Non-convex Functionals by Iterative Thresholding

2014

Abstract
Abstract. Convergence analysis is carried out for a forward-backward splitting/generalized gradient projection method for the minimization of a special class of non-smooth and genuinely non-convex problems in infinite-dimensional Hilbert spaces. The functionals under consideration are the sum of a smooth, possibly non-convex functional and a non-smooth, necessarily non-convex functional. For separable constraints in the sequence space, we show that the generalized gradient projection method amounts to a discontinuous iterative thresholding procedure, which can easily be implemented. In this case we prove strong subsequential convergence and, moreover, show that the limit satisfies strengthened necessary conditions for a global minimizer, i.e., it avoids a certain set of non-global minimizers. Finally, the method is applied to problems arising in the recovery of sparse data, where strong convergence of the whole sequence is shown and numerical tests are presented.
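In the separable case the method reduces to an iterative thresholding procedure. The sketch below uses the ℓ⁰ penalty, whose proximal map is the discontinuous hard-thresholding operator; the paper treats a more general class of penalties, so this is an illustrative instance only, with a trivial operator chosen to make the answer checkable:

```python
import numpy as np

def hard_threshold(v, thresh):
    # Proximal map of alpha*lam*||.||_0: keep entries strictly above the
    # threshold sqrt(2*lam*alpha), zero the rest (note the jump at |v| = thresh).
    return np.where(np.abs(v) > thresh, v, 0.0)

def iterative_hard_thresholding(A, b, lam=0.5, alpha=1.0, iters=100):
    # Forward (gradient) step on 0.5*||A x - b||^2, then discontinuous threshold.
    x = np.zeros(A.shape[1])
    thresh = np.sqrt(2.0 * lam * alpha)
    for _ in range(iters):
        x = hard_threshold(x - alpha * A.T @ (A @ x - b), thresh)
    return x

A = np.eye(3)                     # toy operator so the limit is explicit
b = np.array([2.0, 0.1, -1.5])
x = iterative_hard_thresholding(A, b)
```

With A = I and alpha = 1 the iteration converges in one step to the hard-thresholded data, which illustrates why the limit avoids certain non-global minimizers: entries below the jump are forced exactly to zero.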

### Bilevel Optimization with Nonsmooth Lower Level Problems

Abstract
Abstract. We consider a bilevel optimization approach for parameter learning in nonsmooth variational models. Existing approaches solve this problem by applying implicit differentiation to a sufficiently smooth approximation of the nondifferentiable lower-level problem. We propose an alternative method based on differentiating the iterations of a nonlinear primal–dual algorithm. Our method computes exact (sub)gradients and can also be applied in the nonsmooth setting. We show preliminary results for the case of multi-label image segmentation.
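The idea of differentiating the iterations themselves (rather than smoothing the lower-level problem) can be shown on a scalar toy problem. The sketch propagates the derivative with respect to the regularization weight lam through unrolled proximal-gradient steps; the 1-D problem and the forward-mode bookkeeping are illustrative assumptions, not the primal–dual algorithm of the paper:

```python
import numpy as np

def unrolled_solution_and_grad(y, lam, step=0.5, iters=60):
    """Lower-level problem: min_x 0.5*(x - y)^2 + lam*|x|.
    Unroll proximal gradient and carry dx = d x / d lam through each step."""
    x, dx = 0.0, 0.0
    for _ in range(iters):
        v = x - step * (x - y)            # gradient step
        dv = (1.0 - step) * dx            # its derivative w.r.t. lam
        t = step * lam
        if abs(v) > t:                    # soft threshold, differentiated branchwise
            x = np.sign(v) * (abs(v) - t)
            dx = dv - step * np.sign(v)   # d/dlam of the threshold shift
        else:
            x, dx = 0.0, 0.0
    return x, dx

x_star, dx_dlam = unrolled_solution_and_grad(y=2.0, lam=0.5)
```

For y = 2 and lam = 0.5 the exact lower-level solution is the soft threshold S_lam(y) = 1.5, with derivative d/dlam = -sign(y) = -1 on the active branch; the unrolled derivative converges to this exact (sub)gradient, matching the paper's point that no smoothing of the lower level is needed.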

### Higher-order MRFs based image super resolution: MMSE or MAP?

Abstract
The Fields of Experts (FoE) framework has proved a highly effective image prior model for many classic image restoration problems. Generally, two options are available to incorporate the learned FoE prior in the inference procedure: (1) the sampling-based minimum mean square error (MMSE) estimate, and (2) the energy minimization-based maximum a posteriori (MAP) estimate. It is well known that the sampling-based MMSE estimate is very time consuming, whereas MAP inference has the remarkable advantage of high computational efficiency. In a recent paper, the FoE prior model was exploited for the single image super resolution (SR) task using MMSE inference, based on the seemingly correct conclusion that MAP inference for the FoE prior based model, which leads to a non-convex optimization problem, is prone to getting stuck in bad local minima. However, in this letter, we demonstrate that the simpler inference criterion, the MAP estimate, works equally well compared to the complicated MMSE estimate with exactly the same prior model. Moreover, with our discriminatively trained FoE prior model, MAP inference can even lead to further improvements. Consequently, we argue that for the higher-order natural image prior based SR problem, it is not necessary to employ the time-consuming MMSE estimation. Index Terms—Bayesian minimum mean square error, maximum a posteriori, Fields of Experts, single image super resolution.
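The contrast between the two inference options can be made concrete in a toy scalar Gaussian model, where, unlike the non-convex FoE case, MAP and MMSE provably coincide. This is a sanity check on the two criteria, not a reproduction of the paper's experiments; all numbers are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy scalar model: x ~ N(0, tau^2), observation y = x + noise, noise ~ N(0, sigma^2).
tau, sigma, y = 1.0, 0.5, 0.8

# MAP estimate: maximize the posterior, i.e. minimize a quadratic energy
# (closed form for this Gaussian model).
x_map = y * tau**2 / (tau**2 + sigma**2)

# MMSE estimate: posterior mean, approximated here by sampling the posterior,
# mimicking the sampling-based inference route.
post_var = 1.0 / (1.0 / tau**2 + 1.0 / sigma**2)
post_mean = post_var * y / sigma**2
samples = rng.normal(post_mean, np.sqrt(post_var), 200_000)
x_mmse = samples.mean()
```

In the Gaussian case the posterior mode equals the posterior mean, so the cheap MAP estimate and the expensive sampling-based MMSE estimate agree; the letter's question is whether this practical equivalence survives for the non-convex FoE energy.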

### A Multi-step Inertial Forward-Backward Splitting Method for Non-convex Optimization

Abstract
Abstract. We propose a multi-step inertial forward-backward splitting algorithm for minimizing the sum of two not necessarily convex functions, one of which is proper and lower semi-continuous while the other is differentiable with a Lipschitz continuous gradient. We first prove global convergence of the algorithm with the help of the Kurdyka–Łojasiewicz property. Then, when the non-smooth part is also partly smooth relative to a smooth submanifold, we establish finite identification of the latter and provide a sharp local linear convergence analysis. The proposed method is illustrated on several problems arising from statistics and machine learning.
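A minimal sketch of the multi-step inertial extrapolation could read as follows; the two memory coefficients, the soft-thresholding prox, and the convex quadratic smooth part are all illustrative assumptions rather than the paper's exact scheme:

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal map of t*||.||_1 (illustrative nonsmooth part).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def multistep_inertial_fb(grad_f, prox_g, x0, step=0.1, betas=(0.3, 0.1), iters=3000):
    # hist[i] holds x_{k-i}; the extrapolation sums several inertial terms,
    # one per entry of betas, before the forward-backward step.
    hist = [x0.copy() for _ in range(len(betas) + 1)]
    for _ in range(iters):
        y = hist[0].copy()
        for i, b in enumerate(betas):
            y += b * (hist[i] - hist[i + 1])
        x_new = prox_g(y - step * grad_f(y), step)
        hist = [x_new] + hist[:-1]
    return hist[0]

# Toy problem: min 0.5*||x - c||^2 + ||x||_1, minimizer soft_threshold(c, 1).
c = np.array([2.0, -0.5, 1.0])
x = multistep_inertial_fb(lambda x: x - c, soft_threshold, np.zeros(3))
```

With a single memory term this collapses to the one-step inertial method; the extra terms reuse a longer history of iterates in the extrapolation point.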