Results 1 - 5 of 5
Minimization and parameter estimation for seminorm regularization models with I-divergence constraints, 2012
Cited by 13 (2 self)
In this paper we analyze the minimization of seminorms ‖L·‖ on R^n under the constraint of a bounded I-divergence D(b, H·) for rather general linear operators H and L. The I-divergence is also known as the Kullback-Leibler divergence and appears in many models in imaging science, in particular when dealing with Poisson data. Often H represents, e.g., a linear blur operator and L is some discrete derivative or frame analysis operator. We prove relations between the parameters of I-divergence constrained and penalized problems without assuming the uniqueness of their minimizers. To solve the I-divergence constrained problem we apply first-order primal-dual algorithms which reduce the problem to the solution of certain proximal minimization problems in each iteration step. One of these proximal problems is an I-divergence constrained least squares problem which can be solved by a Newton method based on Morozov's discrepancy principle. Interestingly, the algorithm produces not only a sequence of vectors which converges to a minimizer of the constrained problem but also a sequence of parameters which converges to a regularization parameter such that the corresponding penalized problem has the same solution as our constrained one. We demonstrate the performance of various algorithms for different image restoration tasks, both for images corrupted by Poisson noise and by multiplicative Gamma noise.
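As a rough illustration of the data term used in this abstract (a minimal sketch, not the paper's implementation; the function name and the `eps` safeguard are my own choices), the I-divergence D(b, Hx) for Poisson data can be computed as:

```python
import numpy as np

def i_divergence(b, hx, eps=1e-12):
    """Generalized Kullback-Leibler (I-)divergence D(b, y) with y = Hx:

        D(b, y) = sum_i [ b_i * log(b_i / y_i) - b_i + y_i ],

    using the convention 0 * log 0 = 0. `eps` guards the logarithm
    against zero entries of y.
    """
    b = np.asarray(b, dtype=float)
    hx = np.asarray(hx, dtype=float)
    mask = b > 0  # terms with b_i = 0 contribute only hx_i - b_i
    d = np.sum(hx - b)
    d += np.sum(b[mask] * np.log(b[mask] / np.maximum(hx[mask], eps)))
    return d
```

The divergence is zero exactly when b = Hx and positive otherwise, which is what makes the constraint D(b, Hx) ≤ τ a meaningful noise-level bound for Poisson data.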
Homogeneous Penalizers and Constraints in Convex Image Restoration, 2012
Cited by 4 (1 self)
Recently, convex optimization models have been successfully applied to various problems in image analysis and restoration. In this paper, we are interested in relations between convex constrained optimization problems of the form argmin{Φ(x) subject to Ψ(x) ≤ τ} and their penalized counterparts argmin{Φ(x) + λΨ(x)}. We recall general results on the topic with the help of an epigraphical projection. Then we deal with the special setting Ψ := ‖L·‖ with L ∈ R^{m,n} and Φ := ϕ(H·), where H ∈ R^{n,n} and ϕ: R^n → R ∪ {+∞} meet certain requirements which are often fulfilled in image processing models. In this case we prove, by incorporating the dual problems, that there exists a bijective function such that the solutions of the constrained problem coincide with those of the penalized problem if and only if τ and λ are in the graph of this function. We illustrate the relation between τ and λ for various problems arising in image processing. In particular, we point out the relation to the Pareto frontier for joint sparsity problems. We demonstrate the performance of the constrained model in restoration tasks of images corrupted by Poisson noise, with the I-divergence as data fitting term ϕ, and in inpainting models with a constrained nuclear norm. Such models can be useful if we have a priori knowledge on the image rather than on the noise level.
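The constrained/penalized correspondence described in this abstract can be seen in closed form in the simplest one-dimensional case (a sketch of my own, not taken from the paper: Φ(x) = ½(x − a)², Ψ(x) = |x|, where the penalized problem is solved by soft thresholding and the constrained one by clipping):

```python
import numpy as np

def penalized_solution(a, lam):
    # argmin_x 0.5*(x - a)**2 + lam*|x|  ->  soft thresholding
    return np.sign(a) * max(abs(a) - lam, 0.0)

def constrained_solution(a, tau):
    # argmin_x 0.5*(x - a)**2 subject to |x| <= tau  ->  clipping
    return np.sign(a) * min(abs(a), tau)

a = 3.0
tau = 1.2            # active constraint (tau < |a|)
lam = abs(a) - tau   # the matching penalty parameter
assert penalized_solution(a, lam) == constrained_solution(a, tau)
```

Here the bijection between τ and λ (for an active constraint) is simply λ = |a| − τ: each constraint level corresponds to exactly one penalty parameter producing the same minimizer, which is a toy instance of the graph relation the paper establishes in general.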
An Iterative Linear Expansion of Thresholds for ℓ1-Based Image Restoration
Abstract — This paper proposes a novel algorithmic framework to solve image restoration problems under sparsity assumptions. As usual, the reconstructed image is the minimizer of an objective functional that consists of a data fidelity term and an ℓ1 regularization. However, instead of estimating the reconstructed image that minimizes the objective functional directly, we focus on the restoration process that maps the degraded measurements to the reconstruction. Our idea amounts to parameterizing the process as a linear combination of a few elementary thresholding functions (LET) and to solving for the linear weighting coefficients by minimizing the objective functional. It is then possible to update the thresholding functions and to iterate this process (i-LET). The key advantage of such a linear parametrization is that the problem size reduces dramatically: each time we only need to solve an optimization problem over the dimension of the linear coefficients (typically less than 10) instead of the whole image dimension. With the elementary thresholding functions satisfying certain constraints, global convergence of the iterated LET algorithm is guaranteed. Experiments on several test images over a wide range of noise levels and different types of convolution kernels clearly indicate that the proposed framework usually outperforms state-of-the-art algorithms in terms of both CPU time and number of iterations. Index Terms — Image restoration, sparsity, majorization minimization (MM), iterative reweighted least squares (IRLS), thresholding, linear expansion of thresholds (LET).
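The LET idea can be sketched in its simplest setting (a rough illustration with assumptions of my own, not the paper's algorithm: identity degradation operator, soft thresholds as the elementary functions, and a plain IRLS reweighting to handle the ℓ1 term; threshold values and iteration count are arbitrary):

```python
import numpy as np

def soft(y, t):
    """Elementary soft-thresholding function."""
    return np.sign(y) * np.maximum(np.abs(y) - t, 0.0)

def let_denoise(y, lam, thresholds=(0.5, 1.0, 2.0), iters=20, eps=1e-8):
    """Minimize 0.5*||x - y||^2 + lam*||x||_1 over the LET parametrization
    x = Phi @ a, where the columns of Phi are elementary thresholding
    functions applied to y. The ell_1 term is handled by IRLS, so each
    iteration solves only a K x K linear system (K = len(thresholds) + 1)
    instead of a problem of the full image dimension.
    """
    Phi = np.column_stack([y] + [soft(y, t) for t in thresholds])
    a = np.zeros(Phi.shape[1])
    a[0] = 1.0  # start from x = y
    for _ in range(iters):
        x = Phi @ a
        w = 1.0 / (np.abs(x) + eps)  # IRLS weights majorizing lam*||x||_1
        A = Phi.T @ Phi + lam * Phi.T @ (w[:, None] * Phi)
        a = np.linalg.solve(A, Phi.T @ y)
    return Phi @ a
```

The small K × K solve per iteration is the point of the parametrization: the unknowns are the K mixing coefficients, not the image pixels.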
Submitted to IEEE Transactions on Image Processing: An Iterative Linear Expansion of Thresholds for ℓ1-Based Image Restoration
Abstract—This paper proposes a novel algorithmic framework to solve image restoration problems under sparsity assumptions. As usual, the reconstructed image is the minimizer of an objective functional that consists of a data fidelity term and an ℓ1 regularization. However, instead of estimating the reconstructed image that minimizes the objective functional directly, we focus on the restoration process that maps the degraded measurements to the reconstruction. Our idea amounts to parameterizing the process as a linear combination of a few elementary thresholding functions (LET) and solving for the linear weighting coefficients by minimizing the objective functional. It is then possible to update the thresholding functions and to iterate this process (i-LET). The key advantage of such a linear parametrization is that the problem size reduces dramatically: each time we only need to solve an optimization problem over the dimension of the linear coefficients (typically less than 10) instead of the whole image dimension. With the elementary thresholding functions satisfying certain constraints, global convergence of the iterated LET algorithm is guaranteed. Experiments on several test images over a wide range of noise levels and different types of convolution kernels clearly indicate that the proposed framework usually outperforms state-of-the-art algorithms in terms of both CPU time and number of iterations. Index Terms—Image restoration, sparsity, majorization minimization (MM), iterative reweighted least squares (IRLS), thresholding, linear expansion of thresholds (LET).