Results 1 - 9 of 9
A Tour of Modern Image Filtering: New Insights and Methods, Both Practical and Theoretical
 IEEE Signal Processing Magazine, 2013
Abstract

Cited by 27 (2 self)
Recent developments in computational imaging and restoration have heralded the arrival and convergence of several powerful methods for adaptive processing of multidimensional data. Examples include moving least squares (from graphics); the bilateral filter (BF) and anisotropic diffusion (from computer vision); boosting, kernel, and spectral methods (from machine learning); nonlocal means (NLM) and its variants (from signal processing); Bregman iterations (from applied math); and kernel regression and iterative scaling (from statistics). While these approaches found their inspirations in diverse fields of nascence, they are deeply connected. In this article, I present a practical and accessible framework to understand some of the basic underpinnings of these methods, with the intention of leading the reader to a broad understanding of how they interrelate. I also illustrate connections between these techniques and more classical (empirical) Bayesian approaches. The proposed framework is used to arrive at new insights and methods, both practical and theoretical. In particular, several novel optimality properties of algorithms in wide use, such as block-matching and three-dimensional (3-D) filtering (BM3D), and methods for their iterative improvement (or the nonexistence thereof) are discussed. A general approach is laid out to enable the performance analysis and subsequent improvement of many existing filtering algorithms. While much of the material discussed is applicable to the wider class of linear degradation models beyond noise (e.g., blur), to keep matters focused, we consider the problem of denoising here.
Digital Object Identifier: 10.1109/MSP.2011.2179329. Date of publication: 5 December 2012.
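The common thread running through the methods this article surveys, a filter built from data-dependent affinity weights, can be sketched for a 1-D signal. The bilateral-style kernel and all parameter values below are our own illustrative choices, not taken from the article:

```python
import numpy as np

def bilateral_filter_matrix(y, sigma_s=2.0, sigma_r=0.5):
    """Row-stochastic filter matrix W = D^{-1} K built from bilateral-style
    affinities: a spatial Gaussian times a photometric Gaussian."""
    x = np.arange(len(y), dtype=float)
    K = (np.exp(-(x[:, None] - x[None, :]) ** 2 / (2 * sigma_s ** 2))
         * np.exp(-(y[:, None] - y[None, :]) ** 2 / (2 * sigma_r ** 2)))
    return K / K.sum(axis=1, keepdims=True)  # normalize each row to sum to 1

rng = np.random.default_rng(0)
clean = np.concatenate([np.zeros(32), np.ones(32)])  # step edge
noisy = clean + 0.1 * rng.standard_normal(64)
W = bilateral_filter_matrix(noisy)
denoised = W @ noisy  # one adaptive filtering pass
```

Because the photometric term suppresses weights across the step, the edge survives while same-side noise is averaged away; most of the filters named in the abstract differ mainly in how the kernel K is built.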
Global image denoising
 IEEE Transactions on Image Processing, 2014
Abstract

Cited by 7 (2 self)
Most existing state-of-the-art image denoising algorithms are based on exploiting similarity between a relatively modest number of patches. These patch-based methods are strictly dependent on patch matching, and their performance is hamstrung by the ability to reliably find sufficiently similar patches. As the number of patches grows, a point of diminishing returns is reached where the performance improvement due to more patches is offset by the lower likelihood of finding sufficiently close matches. The net effect is that while patch-based methods, such as BM3D, are excellent overall, they are ultimately limited in how well they can do on (larger) images with increasing complexity. In this paper, we address these shortcomings by developing a paradigm for truly global filtering, where each pixel is estimated from all pixels in the image. Our objectives in this paper are twofold. First, we give a statistical analysis of our proposed global filter, based on a spectral decomposition of its corresponding operator, and we study the effect of truncating this spectral decomposition. Second, we derive an approximation to the spectral (principal) components using the Nyström extension. Using these, we demonstrate that the global filter can be implemented efficiently by sampling a fairly small percentage of the pixels in the image. Experiments illustrate that our strategy can effectively globalize any existing denoising filter to estimate each pixel using all pixels in the image, hence improving upon the best patch-based methods.
Index Terms — Image denoising, nonlocal filters, Nyström extension, spatial-domain filter, risk estimator.
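The Nyström step the abstract refers to, approximating the leading eigenvectors of the filter from a small sample of pixels, can be sketched on a toy affinity matrix. The spatial-only kernel, sampling pattern, and truncation level here are our illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def nystrom_eigs(K, sample_idx, k):
    """Approximate the top-k eigenvectors/eigenvalues of affinity K using
    only the rows/columns indexed by sample_idx (Nystrom extension)."""
    rest = np.setdiff1d(np.arange(len(K)), sample_idx)
    K_AA = K[np.ix_(sample_idx, sample_idx)]   # sampled-vs-sampled block
    K_BA = K[np.ix_(rest, sample_idx)]         # unsampled-vs-sampled block
    S, U = np.linalg.eigh(K_AA)                # eigh returns ascending order
    S, U = S[::-1][:k], U[:, ::-1][:, :k]      # keep the top-k eigenpairs
    phi = np.empty((len(K), k))
    phi[sample_idx] = U
    phi[rest] = K_BA @ U / S                   # extend to unsampled pixels
    return phi, S

# Toy affinity: spatial Gaussian kernel over a 64-sample 1-D grid
x = np.arange(64, dtype=float)
K = np.exp(-(x[:, None] - x[None, :]) ** 2 / (2 * 8.0 ** 2))
phi, S = nystrom_eigs(K, np.arange(0, 64, 4), k=8)   # sample 25% of pixels
K_hat = phi @ np.diag(S) @ phi.T                     # low-rank reconstruction
```

With the approximate eigenpairs in hand, a truncated global filter can be applied to every pixel without ever forming the full n-by-n affinity matrix.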
How to SAIFly Boost Denoising Performance
 2012
Abstract

Cited by 4 (2 self)
Spatial domain image filters (e.g., the bilateral filter, NLM, LARK) have achieved great success in denoising. However, their overall performance has not generally surpassed the leading transform-domain filters (such as BM3D). One important reason is that spatial domain filters lack an efficient way to adaptively fine-tune their denoising strength, something that is relatively easy to do in transform-domain methods with shrinkage operators. In the pixel domain, the smoothing strength is usually controlled globally by, for example, tuning a regularization parameter. In this paper, we propose SAIF (Spatially Adaptive Iterative Filtering), a new strategy to control the denoising strength locally for any spatial domain method. This approach is capable of filtering local image content iteratively using the given base filter, while the type of iteration and the iteration number are automatically optimized with respect to estimated risk (i.e., mean-squared error). In exploiting the estimated local SNR, we also present a new risk estimator that differs from the often-employed SURE method and exceeds its performance in many cases. Experiments illustrate that our strategy can significantly relax the base algorithm's sensitivity to its tuning (smoothing) parameters, and effectively boost the performance of several existing denoising filters to generate state-of-the-art results under both simulated and practical conditions.
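One half of the idea, choosing the number of diffusion iterations of a base filter by minimizing an estimated risk, can be sketched with the classical SURE formula for linear filters. The paper also optimizes over boosting-type iterations and proposes its own non-SURE estimator; the fixed Gaussian base filter and all settings below are our illustrative stand-ins:

```python
import numpy as np

def risk_optimized_diffusion(y, W, sigma, max_iters=10):
    """Apply F_k = W^k for k = 1..max_iters and return the estimate whose
    SURE risk  ||F y - y||^2/n + 2 sigma^2 tr(F)/n - sigma^2  is lowest."""
    n = len(y)
    F, best_risk, best = np.eye(n), np.inf, y
    for _ in range(max_iters):
        F = W @ F                                  # one more diffusion step
        est = F @ y
        risk = (np.sum((est - y) ** 2) / n
                + 2 * sigma ** 2 * np.trace(F) / n - sigma ** 2)
        if risk < best_risk:
            best_risk, best = risk, est
    return best

# Base filter: a fixed (non-adaptive) Gaussian smoother, row-normalized
x = np.arange(64, dtype=float)
W = np.exp(-(x[:, None] - x[None, :]) ** 2 / 2.0)
W /= W.sum(axis=1, keepdims=True)

rng = np.random.default_rng(1)
clean = np.concatenate([np.zeros(32), np.ones(32)])
noisy = clean + 0.15 * rng.standard_normal(64)
denoised = risk_optimized_diffusion(noisy, W, sigma=0.15)
```

Because the iteration count is picked from data via the risk estimate rather than fixed globally, the smoothing strength adapts to the signal without hand tuning, which is the behavior SAIF exploits locally, patch by patch.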
A two-stage denoising filter: the preprocessed Yaroslavsky filter
 2012
Abstract

Cited by 1 (0 self)
This paper describes a simple image noise removal method which combines a preprocessing step with the Yaroslavsky filter for strong numerical, visual, and theoretical performance on a broad class of images. The framework developed is a two-stage approach. In the first stage the image is filtered with a classical denoising method (e.g., wavelet or curvelet thresholding). In the second stage a modification of the Yaroslavsky filter is applied to the original noisy image, where the weights of the filter are governed by pixel similarities in the denoised image from the first stage. Similar prefiltering ideas have proved effective previously in the literature, and this paper provides theoretical guarantees and important insight into why prefiltering can be effective. Empirically, this simple approach achieves very good performance for cartoon images, and can be computed much more quickly than current patch-based denoising algorithms.
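The two-stage scheme can be sketched on a 1-D signal. Here a simple moving average stands in for the wavelet/curvelet first stage, and the window radius and bandwidth h are our illustrative choices:

```python
import numpy as np

def yaroslavsky_stage2(noisy, pilot, h=0.3, radius=5):
    """Yaroslavsky filter on the noisy signal, with weights computed from
    pixel similarities in the pilot (first-stage denoised) signal."""
    n, out = len(noisy), np.empty(len(noisy))
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        w = np.exp(-(pilot[lo:hi] - pilot[i]) ** 2 / h ** 2)
        out[i] = w @ noisy[lo:hi] / w.sum()   # similarity-weighted average
    return out

rng = np.random.default_rng(2)
clean = np.concatenate([np.zeros(32), np.ones(32)])   # cartoon-like signal
noisy = clean + 0.2 * rng.standard_normal(64)
pilot = np.convolve(noisy, np.ones(5) / 5, mode="same")  # stage-1 stand-in
denoised = yaroslavsky_stage2(noisy, pilot)
```

The key design point the paper analyzes is visible here: the weights come from the pilot, so they are far less corrupted by noise than weights computed from the noisy input itself, yet the averaging still operates on the original noisy samples.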
Global Image Denoising
 2013
Abstract
Most existing state-of-the-art image denoising algorithms are based on exploiting similarity between a relatively modest number of patches. These patch-based methods are strictly dependent on patch matching, and their performance is hamstrung by the ability to reliably find sufficiently similar patches. As the number of patches grows, a point of diminishing returns is reached where the performance improvement due to more patches is offset by the lower likelihood of finding sufficiently close matches. The net effect is that while patch-based methods such as BM3D are excellent overall, they are ultimately limited in how well they can do on (larger) images with increasing complexity. In this work, we address these shortcomings by developing a paradigm for truly global filtering, where each pixel is estimated from all pixels in the image. Our objectives in this paper are twofold. First, we give a statistical analysis of our proposed global filter, based on a spectral decomposition of its corresponding operator, and we study the effect of truncating this spectral decomposition. Second, we derive an approximation to the spectral (principal) components using the Nyström extension. Using these, we demonstrate that the global filter can be implemented efficiently by sampling a fairly small percentage of the pixels in the image. Experiments illustrate that our strategy can effectively globalize any existing denoising filter to estimate each pixel using all pixels in the image, hence improving upon the best patch-based methods.
Fast Bilateral-Space Stereo for Synthetic Defocus
Abstract
Given a stereo pair it is possible to recover a depth map and use that depth to render a synthetically defocused image. Though stereo algorithms are well studied, rarely are those algorithms considered solely in the context of producing these defocused renderings. In this paper we present a technique for efficiently producing disparity maps using a novel optimization framework in which inference is performed in “bilateral-space”. Our approach produces higher-quality “defocus” results than other stereo algorithms while also being 10-100× faster than comparable techniques.
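A toy 1-D smoother conveys the flavor of working in “bilateral-space”: pixels are splatted into a coarse space-by-range grid, processed cheaply there, and sliced back. The paper solves a stereo optimization in this compact space; the box blur and bin sizes below are our own simplifications:

```python
import numpy as np

def bilateral_grid_smooth(y, s_bin=4, r_bin=0.25):
    """Splat y into a (space x range) grid, blur the grid, slice back."""
    n_s, n_r = len(y) // s_bin + 1, int(1.5 / r_bin) + 1
    si = np.arange(len(y)) // s_bin                       # spatial bin index
    ri = np.clip(np.round(y / r_bin).astype(int), 0, n_r - 1)  # range bin
    sums, cnts = np.zeros((n_s, n_r)), np.zeros((n_s, n_r))
    np.add.at(sums, (si, ri), y)      # splat: accumulate values per cell
    np.add.at(cnts, (si, ri), 1.0)
    k = np.array([1.0, 2.0, 1.0]) / 4.0
    for axis in (0, 1):               # operate in the small grid space
        sums = np.apply_along_axis(np.convolve, axis, sums, k, "same")
        cnts = np.apply_along_axis(np.convolve, axis, cnts, k, "same")
    return sums[si, ri] / np.maximum(cnts[si, ri], 1e-12)  # slice back

rng = np.random.default_rng(3)
clean = np.concatenate([np.zeros(32), np.ones(32)])
noisy = clean + 0.1 * rng.standard_normal(64)
smoothed = bilateral_grid_smooth(noisy)
```

Pixels that are close in space but far apart in value land in different grid cells, so smoothing the grid is inherently edge-aware; the speedup comes from the grid having far fewer cells than the image has pixels.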
Nonlocal Image Editing
Abstract
In this paper, we introduce a new image editing tool based on the spectrum of a global filter computed from image affinities. Recently, it has been shown that the global filter derived from a fully connected graph representing the image can be approximated using the Nyström extension. The filter is computed by approximating its leading eigenvectors. These orthonormal eigenfunctions are highly expressive of the coarse and fine details in the underlying image, and each eigenvector can be interpreted as one scale of a data-dependent multiscale image decomposition. In this filtering scheme, each eigenvalue can boost or suppress the corresponding signal component in each scale. Our analysis shows that mapping the eigenvalues by an appropriate polynomial function endows the filter with a number of important capabilities, such as edge-aware sharpening, denoising, tone manipulation, and abstraction, to name a few. Furthermore, the edits can be easily propagated across the image.
Index Terms — Image editing, nonlocal filters, Nyström extension.
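The core mechanism, remapping the eigenvalues of a global filter by a polynomial so that each scale is boosted or suppressed, can be sketched with a direct eigendecomposition on a small 1-D signal. The paper instead approximates the spectrum with the Nyström extension; the affinity kernel and parameters below are our illustrative choices:

```python
import numpy as np

def spectral_edit(y, K, remap):
    """Apply V remap(Lambda) V^T y, where V, Lambda come from the
    symmetrically normalized filter W = D^{-1/2} K D^{-1/2}."""
    d = K.sum(axis=1)
    W = K / np.sqrt(np.outer(d, d))       # symmetric normalization
    lam, V = np.linalg.eigh(W)
    return V @ (remap(lam) * (V.T @ y))   # remap each scale's gain

# Bilateral-type affinity on a noisy step signal
rng = np.random.default_rng(4)
clean = np.concatenate([np.zeros(32), np.ones(32)])
y = clean + 0.1 * rng.standard_normal(64)
x = np.arange(64, dtype=float)
K = (np.exp(-(x[:, None] - x[None, :]) ** 2 / (2 * 3.0 ** 2))
     * np.exp(-(y[:, None] - y[None, :]) ** 2 / (2 * 0.5 ** 2)))

identity = spectral_edit(y, K, lambda s: np.ones_like(s))  # all gains = 1
smoothed = spectral_edit(y, K, lambda s: s ** 2)           # damp fine scales
```

A remap of all-ones reproduces the input exactly, while polynomials that shrink small eigenvalues denoise and polynomials that amplify them sharpen, which is the editing knob the abstract describes.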
Motion Deblurring With Graph Laplacian Regularization
Abstract
In this paper, we develop a regularization framework for image deblurring based on a new definition of the normalized graph Laplacian. We apply a fast scaling algorithm to the kernel similarity matrix to derive the symmetric, doubly stochastic filtering matrix from which the normalized Laplacian matrix is built. We use this new definition of the Laplacian to construct a cost function consisting of data fidelity and regularization terms to solve the ill-posed motion deblurring problem. The final estimate is obtained by minimizing the resulting cost function in an iterative manner. Furthermore, the spectral properties of the Laplacian matrix equip us with the required tools for spectral analysis of the proposed method. We verify the effectiveness of our iterative algorithm via synthetic and real examples.
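A miniature version of the pipeline can be sketched in 1-D: balance the kernel similarity matrix toward doubly stochastic, form L = I - W, and minimize the fidelity-plus-regularization cost. We use plain Sinkhorn-style balancing and a direct linear solve as stand-ins for the paper's fast scaling algorithm and iterative minimization; all parameters are illustrative:

```python
import numpy as np

def balance(K, iters=200):
    """Alternately normalize rows/columns toward a doubly stochastic
    matrix, then symmetrize (a stand-in for the fast scaling step)."""
    for _ in range(iters):
        K = K / K.sum(axis=1, keepdims=True)
        K = K / K.sum(axis=0, keepdims=True)
    return 0.5 * (K + K.T)

def deblur(y, A, K, eta=0.05):
    """Minimize ||A z - y||^2 + eta * z^T L z, L = I - W, in closed form."""
    W = balance(K)
    L = np.eye(len(y)) - W                 # normalized graph Laplacian
    return np.linalg.solve(A.T @ A + eta * L, A.T @ y)

# 1-D test case: 5-tap box blur of a step signal, plus mild noise
n = 64
A = sum(np.eye(n, k=o) for o in range(-2, 3)) / 5.0
clean = np.concatenate([np.zeros(32), np.ones(32)])
rng = np.random.default_rng(5)
y = A @ clean + 0.02 * rng.standard_normal(n)

# Kernel similarity built from the (blurred, noisy) observation
x = np.arange(n, dtype=float)
K = (np.exp(-(x[:, None] - x[None, :]) ** 2 / (2 * 3.0 ** 2))
     * np.exp(-(y[:, None] - y[None, :]) ** 2 / (2 * 0.4 ** 2)))
restored = deblur(y, A, K)
```

Because W is symmetric and doubly stochastic, L is positive semidefinite with the constant signal in its null space, so the regularizer penalizes disagreement between similar pixels without biasing the overall brightness.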