Results 1–10 of 168
Proximal Splitting Methods in Signal Processing
Cited by 266 (31 self)
The proximity operator of a convex function is a natural extension of the notion of a projection operator onto a convex set. This tool, which plays a central role in the analysis and the numerical solution of convex optimization problems, has recently been introduced in the arena of inverse problems and, especially, in signal processing, where it has become increasingly important. In this paper, we review the basic properties of proximity operators which are relevant to signal processing and present optimization methods based on these operators. These proximal splitting methods are shown to capture and extend several well-known algorithms in a unifying framework. Applications of proximal methods in signal recovery and synthesis are discussed.
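The projection/prox relationship this abstract describes can be illustrated with a minimal sketch (illustrative code, not taken from the paper): the proximity operator of the ℓ1-norm is elementwise soft-thresholding, while the proximity operator of the indicator function of a convex set reduces to the Euclidean projection onto that set.

```python
import numpy as np

def prox_l1(v, t):
    # Proximity operator of t * ||x||_1: elementwise soft-thresholding.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_indicator_box(v, lo, hi):
    # For the indicator function of the box [lo, hi], the proximity
    # operator is exactly the Euclidean projection onto that set,
    # which is the sense in which prox generalizes projection.
    return np.clip(v, lo, hi)
```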
Image deblurring and super-resolution by adaptive sparse domain selection and adaptive regularization
 IEEE Trans. Image Process
, 2011
Cited by 59 (11 self)
Abstract—As a powerful statistical image modeling technique, sparse representation has been successfully used in various image restoration applications. The success of sparse representation owes to the development of the ℓ1-norm optimization techniques and the fact that natural images are intrinsically sparse in some domains. The image restoration quality largely depends on whether the employed sparse domain can represent the underlying image well. Considering that the contents can vary significantly across different images or different patches in a single image, we propose to learn various sets of bases from a pre-collected dataset of example image patches, and then, for a given patch to be processed, one set of bases is adaptively selected to characterize the local sparse domain. We further introduce two adaptive regularization terms into the sparse representation framework. First, a set of autoregressive (AR) models are learned from the dataset of example image patches. The AR models best fitted to a given patch are adaptively selected to regularize the image local structures. Second, the image nonlocal self-similarity is introduced as another regularization term. In addition, the sparsity regularization parameter is adaptively estimated for better image restoration performance. Extensive experiments on image deblurring and super-resolution validate that by using adaptive sparse domain selection and adaptive regularization, the proposed method achieves much better results than many state-of-the-art algorithms in terms of both PSNR and visual perception. Index Terms—Deblurring, image restoration (IR), regularization, sparse representation, super-resolution.
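The adaptive selection of one basis set per patch can be caricatured as follows. The function name and the least-squares criterion here are illustrative assumptions, not the paper's actual selection rule, which operates on learned sub-dictionaries and training patch clusters.

```python
import numpy as np

def select_basis(patch, dictionaries):
    # Hypothetical sketch: pick the pre-learned basis set whose
    # least-squares reconstruction of the patch has the smallest
    # residual, mimicking "one set of bases is adaptively selected".
    errs = []
    for D in dictionaries:
        coef, *_ = np.linalg.lstsq(D, patch, rcond=None)
        errs.append(np.linalg.norm(D @ coef - patch))
    return int(np.argmin(errs))
```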
Deblurring Poissonian images by split Bregman techniques
Cited by 50 (2 self)
The restoration of blurred images corrupted by Poisson noise is an important task in various applications such as astronomical imaging, electron microscopy, single photon emission computed tomography (SPECT) and positron emission tomography (PET). In this paper, we focus on solving this task by minimizing an energy functional consisting of the I-divergence as similarity term and the TV regularization term. Our minimizing algorithm uses alternating split Bregman techniques (alternating direction method of multipliers), which can be reinterpreted as Douglas–Rachford splitting applied to the dual problem. In contrast to recently developed iterative algorithms, our algorithm contains no inner iterations and produces nonnegative images. The high efficiency of our algorithm in comparison to other recently developed algorithms to minimize the same functional is demonstrated by artificial and real-world numerical examples.
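One reason splitting methods suit the I-divergence data term is that its proximity operator has a standard closed form: minimizing t(x − f log x) + ½(x − v)² over x gives the quadratic x² + (t − v)x − tf = 0, whose positive root is the prox. This sketch shows only that ingredient, not the paper's full alternating split Bregman algorithm.

```python
import numpy as np

def prox_kl(v, f, t):
    # Closed-form proximity operator of the I-divergence data term
    # t * (x - f * log x), evaluated elementwise. Taking the positive
    # root of the quadratic keeps the image nonnegative, matching the
    # nonnegativity property the abstract emphasizes.
    return 0.5 * ((v - t) + np.sqrt((v - t) ** 2 + 4.0 * t * f))
```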
Signal Restoration with Overcomplete Wavelet Transforms: Comparison of Analysis and Synthesis Priors
Cited by 47 (5 self)
The variational approach to signal restoration calls for the minimization of a cost function that is the sum of a data fidelity term and a regularization term, the latter term constituting a ‘prior’. A synthesis prior represents the sought signal as a weighted sum of ‘atoms’. On the other hand, an analysis prior models the coefficients obtained by applying the forward transform to the signal. For orthonormal transforms, the synthesis prior and analysis prior are equivalent; however, for overcomplete transforms the two formulations are different. We compare analysis and synthesis ℓ1-norm regularization with overcomplete transforms for denoising and deconvolution.
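The orthonormal case in which the two priors coincide has a simple closed form that can be checked numerically: threshold the transform coefficients and map back. This sketch (names are illustrative) covers only the orthonormal denoising case; for overcomplete transforms, the point of the paper, the two solutions differ and need iterative solvers.

```python
import numpy as np

def soft(v, t):
    # Elementwise soft-thresholding, the prox of t * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def orthonormal_denoise(y, W, lam):
    # For an orthonormal transform W, the synthesis solution
    #   argmin_a 0.5 * ||y - W.T @ a||^2 + lam * ||a||_1
    # and the analysis solution
    #   argmin_x 0.5 * ||y - x||^2 + lam * ||W @ x||_1
    # coincide: soft-threshold W @ y and transform back.
    return W.T @ soft(W @ y, lam)
```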
An inverse power method for nonlinear eigenproblems with applications in 1-spectral clustering and sparse PCA
 In Advances in Neural Information Processing Systems 23 (NIPS 2010)
, 2010
Cited by 30 (8 self)
Many problems in machine learning and statistics can be formulated as (generalized) eigenproblems. In terms of the associated optimization problem, computing linear eigenvectors amounts to finding critical points of a quadratic function subject to quadratic constraints. In this paper we show that a certain class of constrained optimization problems with non-quadratic objective and constraints can be understood as nonlinear eigenproblems. We derive a generalization of the inverse power method which is guaranteed to converge to a nonlinear eigenvector. We apply the inverse power method to 1-spectral clustering and sparse PCA, which can naturally be formulated as nonlinear eigenproblems. In both applications we achieve state-of-the-art results in terms of solution quality and runtime. Moving beyond the standard eigenproblem should also be useful in many other applications, and our inverse power method can be easily adapted to new problems.
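For reference, the classical linear inverse power method that the paper generalizes can be sketched as below; the nonlinear extension replaces the linear solve with an inner convex optimization problem, which this sketch does not attempt.

```python
import numpy as np

def inverse_power_method(A, n_iter=50):
    # Classical linear inverse power iteration: repeatedly solve
    # A v_{k+1} = v_k and normalize. For invertible A this converges to
    # the eigenvector of the smallest-magnitude eigenvalue.
    rng = np.random.default_rng(0)
    v = rng.standard_normal(A.shape[0])
    for _ in range(n_iter):
        v = np.linalg.solve(A, v)
        v /= np.linalg.norm(v)
    return v
```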
A splitting-based iterative algorithm for accelerated statistical X-ray CT reconstruction
 IEEE Trans. Medical Imaging
, 2012
Cited by 27 (8 self)
Abstract—Statistical image reconstruction using penalized weighted least-squares (PWLS) criteria can improve image quality in X-ray CT. However, the huge dynamic range of the statistical weights leads to a highly shift-variant inverse problem, making it difficult to precondition and accelerate existing iterative algorithms that attack the statistical model directly. We propose to alleviate the problem by using a variable-splitting scheme that separates the shift-variant and (“nearly”) invariant components of the statistical data model and also decouples the regularization term. This leads to an equivalent constrained problem that we tackle using the classical method-of-multipliers framework with alternating minimization. The specific form of our splitting yields an alternating direction method of multipliers (ADMM) algorithm with an inner step involving a “nearly” shift-invariant linear system that is suitable for FFT-based preconditioning using cone-type filters. The proposed method can efficiently handle a variety of convex regularization criteria, including smooth edge-preserving regularizers and nonsmooth sparsity-promoting ones based on the ℓ1-norm and total variation. Numerical experiments with synthetic and real in vivo human data illustrate that cone-filter preconditioners accelerate the proposed ADMM, resulting in fast convergence compared to conventional (nonlinear conjugate gradient, ordered subsets) and state-of-the-art (MFISTA, split-Bregman) algorithms that are applicable for CT.
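A generic, unpreconditioned ADMM iteration for a tiny PWLS + ℓ1 problem, sketched here only to show the splitting pattern the abstract refers to. The paper's actual contribution, the specific shift-variant/invariant split with cone-filter preconditioning, is not reproduced by this toy.

```python
import numpy as np

def admm_pwls_l1(A, W, b, lam, rho=1.0, n_iter=200):
    # Generic ADMM sketch for
    #   min_x 0.5 * (Ax - b)' W (Ax - b) + lam * ||x||_1
    # using the splitting x = z: quadratic x-update (linear solve),
    # soft-thresholding z-update, scaled dual ascent on u.
    n = A.shape[1]
    x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)
    M = A.T @ W @ A + rho * np.eye(n)
    AtWb = A.T @ W @ b
    for _ in range(n_iter):
        x = np.linalg.solve(M, AtWb + rho * (z - u))
        z = np.sign(x + u) * np.maximum(np.abs(x + u) - lam / rho, 0.0)
        u = u + x - z
    return z
```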
Nonlocally Centralized Sparse Representation for Image Restoration
, 2011
Cited by 25 (8 self)
The sparse representation models code an image patch as a linear combination of a few atoms chosen from an overcomplete dictionary, and they have shown promising results in various image restoration applications. However, due to the degradation of the observed image (e.g., noisy, blurred, and/or down-sampled), the sparse representations by conventional models may not be accurate enough for a faithful reconstruction of the original image. To improve the performance of sparse representation based image restoration, in this paper the concept of sparse coding noise is introduced, and the goal of image restoration turns to how to suppress the sparse coding noise. To this end, we exploit the image nonlocal self-similarity to obtain good estimates of the sparse coding coefficients of the original image, and then centralize the sparse coding coefficients of the observed image to those estimates. The so-called nonlocally centralized sparse representation (NCSR) model is as simple as the standard sparse representation model, while our extensive experiments on various types of image restoration problems, including denoising, deblurring and super-resolution, validate the generality and state-of-the-art performance of the proposed NCSR algorithm.
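The centralization idea has a simple one-dimensional caricature: with an ℓ1 penalty on the deviation from a nonlocal estimate β, the code is shrunk toward β rather than toward zero. This is a sketch of the shrinkage ingredient only, not the NCSR algorithm itself.

```python
import numpy as np

def centralized_soft(v, beta, lam):
    # Minimizer of 0.5 * ||a - v||^2 + lam * ||a - beta||_1, elementwise:
    # ordinary soft-thresholding applied to the deviation from beta,
    # so codes close to the nonlocal estimate snap onto it exactly.
    d = v - beta
    return beta + np.sign(d) * np.maximum(np.abs(d) - lam, 0.0)
```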
Infimal convolution regularizations with discrete ℓ1-type functionals
 Comm. Math. Sci
, 2011
Cited by 24 (0 self)
Dedicated to Prof. Dr. Lothar Berg on the occasion of his 80th birthday
Efficient MR Image Reconstruction for Compressed MR Imaging
Cited by 24 (16 self)
Abstract. In this paper, we propose an efficient algorithm for MR image reconstruction. The algorithm minimizes a linear combination of three terms corresponding to least-squares data fitting, total variation (TV) and L1-norm regularization. This combination has been shown to be very powerful for MR image reconstruction. First, we decompose the original problem into L1-norm and TV-norm regularization subproblems, respectively. Then, these two subproblems are efficiently solved by existing techniques. Finally, the reconstructed image is obtained from the weighted average of the solutions of the two subproblems in an iterative framework. We compare the proposed algorithm with previous methods in terms of reconstruction accuracy and computational complexity. Numerous experiments demonstrate the superior performance of the proposed algorithm for compressed MR image reconstruction.
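The decompose-solve-average loop the abstract outlines can be caricatured as one iteration of a composite splitting step: a gradient step on the data term, then the two regularization subproblems handled by their own proximal solvers and averaged. All names here are illustrative, and the toy prox operators stand in for the paper's L1 and TV solvers.

```python
import numpy as np

def composite_step(x, grad, step, prox_a, prox_b):
    # One composite-splitting iteration: gradient step on the smooth data
    # term, then average the solutions of two independent proximal
    # subproblems (stand-ins for the L1 and TV subproblems).
    y = x - step * grad(x)
    return 0.5 * (prox_a(y) + prox_b(y))
```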
Total variation regularization for fMRI-based prediction of behavior
 IEEE Trans. Medical Imaging
, 2011
Cited by 23 (9 self)
HAL is a multidisciplinary open access archive for the deposit and dissemination of scientific research documents, whether they are published or not. The documents may come from teaching and research institutions in France or abroad, or from public or private research centers.