Results 1–10 of 33
Image denoising: Can plain neural networks compete with BM3D? (2012)
"... Image denoising can be described as the problem of mapping from a noisy image to a noisefree image. The best currently available denoising methods approximate this mapping with cleverly engineered algorithms. In this work we attempt to learn this mapping directly with a plain multi layer perceptron ..."
Abstract

Cited by 29 (4 self)
Image denoising can be described as the problem of mapping from a noisy image to a noise-free image. The best currently available denoising methods approximate this mapping with cleverly engineered algorithms. In this work we attempt to learn this mapping directly with a plain multi-layer perceptron (MLP) applied to image patches. While this has been done before, we will show that by training on large image databases we are able to compete with the current state-of-the-art image denoising methods. Furthermore, our approach is easily adapted to less extensively studied types of noise (by merely exchanging the training data), for which we achieve excellent results as well.
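The approach the abstract describes, a plain MLP regressing noisy patches onto clean ones, can be sketched in a few lines of NumPy. This is our own toy illustration (1-D sinusoidal "patches" and a single hidden layer, not the paper's large natural-image setup or architecture):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: "clean patches" are sampled sinusoids; the paper instead
# trains on millions of natural-image patches.
P, D, H = 2000, 64, 128                 # num patches, patch dim, hidden units
t = np.linspace(0, 1, D)
clean = np.sin(2 * np.pi * np.outer(rng.uniform(1, 5, P), t))
noisy = clean + rng.normal(0, 0.5, clean.shape)   # AWGN, sigma = 0.5

# Plain MLP (noisy patch -> clean patch), trained by SGD on squared error.
W1 = rng.normal(0, 0.1, (D, H)); b1 = np.zeros(H)
W2 = rng.normal(0, 0.1, (H, D)); b2 = np.zeros(D)
lr = 0.01
for epoch in range(50):
    for i in range(0, P, 50):                     # mini-batches of 50
        x, y = noisy[i:i+50], clean[i:i+50]
        h = np.tanh(x @ W1 + b1)                  # hidden activations
        out = h @ W2 + b2                         # denoised estimate
        g_out = 2 * (out - y) / x.shape[0]        # gradient of batch MSE
        g_h = (g_out @ W2.T) * (1 - h ** 2)       # backprop through tanh
        W2 -= lr * h.T @ g_out; b2 -= lr * g_out.sum(0)
        W1 -= lr * x.T @ g_h;   b1 -= lr * g_h.sum(0)

mse_noisy = np.mean((noisy - clean) ** 2)
mse_mlp = np.mean((np.tanh(noisy @ W1 + b1) @ W2 + b2 - clean) ** 2)
print(f"MSE before: {mse_noisy:.3f}, after MLP: {mse_mlp:.3f}")
```

Swapping `clean`/`noisy` for pairs generated under a different noise model is all that retargeting the network requires, which is the adaptability the abstract points out.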
Patch Complexity, Finite Pixel Correlations and Optimal Denoising
"... Abstract. Image restoration tasks are illposed problems, typically solved with priors. Since the optimal prior is the exact unknown density of natural images, actual priors are only approximate and typically restricted to small patches. This raises several questions: How much may we hope to improve ..."
Abstract

Cited by 24 (0 self)
Image restoration tasks are ill-posed problems, typically solved with priors. Since the optimal prior is the exact unknown density of natural images, actual priors are only approximate and typically restricted to small patches. This raises several questions: How much may we hope to improve current restoration results with future sophisticated algorithms? And more fundamentally, even with perfect knowledge of natural image statistics, what is the inherent ambiguity of the problem? In addition, since most current methods are limited to finite-support patches or kernels, what is the relation between the patch complexity of natural images, patch size, and restoration errors? Focusing on image denoising, we make several contributions. First, in light of computational constraints, we study the relation between denoising gain and sample-size requirements in a non-parametric approach. We present a law of diminishing returns, namely that with increasing patch size, rare patches not only require a much larger dataset, but also gain little from it. This result suggests novel adaptive variable-sized patch schemes for denoising. Second, we study absolute denoising limits, regardless of the algorithm used, and the convergence rate to them as a function of patch size. Scale invariance of natural images plays a key role here and implies both a strictly positive lower bound on denoising and a power-law convergence. Extrapolating this parametric law gives a ballpark estimate of the best achievable denoising, suggesting that some improvement, although modest, is still possible.
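A minimal sketch of the non-parametric view taken above: with a database of clean patches and Gaussian noise, the MMSE denoiser is a likelihood-weighted average of database patches. The toy one-parameter patch family below is our own construction; the paper works with natural-image patches and studies how the required database size grows with patch size.

```python
import numpy as np

rng = np.random.default_rng(1)
D, N, sigma = 16, 5000, 0.3             # patch dim, database size, noise std
t = np.linspace(0, 1, D)

# Hypothetical "clean patch" database: sinusoids with random phases.
phases = rng.uniform(0, 2 * np.pi, N)
db = np.sin(2 * np.pi * t[None, :] + phases[:, None])

clean = np.sin(2 * np.pi * t + 1.234)   # test patch from the same family
noisy = clean + rng.normal(0, sigma, D)

# Sample-based MMSE estimate: average database patches weighted by the
# Gaussian likelihood of the noisy observation given each clean candidate.
logw = -np.sum((noisy[None, :] - db) ** 2, axis=1) / (2 * sigma ** 2)
w = np.exp(logw - logw.max())           # subtract max for numerical stability
est = (w[:, None] * db).sum(axis=0) / w.sum()

print(f"noisy MSE {np.mean((noisy - clean) ** 2):.4f} -> "
      f"MMSE estimate {np.mean((est - clean) ** 2):.4f}")
```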
A Nonlocal Transform-Domain Filter for Volumetric Data Denoising
"... Abstract—We present an extension of the BM3D filter to volumetric data. The proposed algorithm, denominated BM4D, implements the grouping and collaborative filtering paradigm, where mutually similar ddimensional patches are stacked together in a (d + 1)dimensional array and jointly filtered in tra ..."
Abstract

Cited by 12 (3 self)
We present an extension of the BM3D filter to volumetric data. The proposed algorithm, denominated BM4D, implements the grouping and collaborative filtering paradigm, where mutually similar d-dimensional patches are stacked together in a (d+1)-dimensional array and jointly filtered in transform domain. While in BM3D the basic data patches are blocks of pixels, in BM4D we utilize cubes of voxels, which are stacked into a four-dimensional “group”. The four-dimensional transform applied to the group simultaneously exploits the local correlation present among voxels in each cube and the nonlocal correlation between the corresponding voxels of different cubes. Thus, the spectrum of the group is highly sparse, leading to very effective separation of signal and noise through coefficient shrinkage. After inverse transformation, we obtain estimates of each grouped cube, which are then adaptively aggregated at their original locations. We evaluate the algorithm on denoising of volumetric data corrupted by Gaussian and Rician noise, as well as on reconstruction of phantom data from sparse Fourier measurements. Experimental results demonstrate the state-of-the-art denoising performance of BM4D, and its effectiveness when exploited as a regularizer in volumetric data reconstruction. Index Terms: volumetric data denoising, volumetric data reconstruction, compressed sensing, magnetic resonance imaging, computed tomography, nonlocal methods, adaptive transforms.
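The grouping and collaborative filtering paradigm is easiest to see in a stripped-down 2-D analogue. The sketch below (our own toy code, not the BM3D/BM4D implementation) matches similar blocks, stacks them into a 3-D group, applies a joint DCT, hard-thresholds the spectrum, and aggregates the filtered blocks back:

```python
import numpy as np
from scipy.fft import dctn, idctn

rng = np.random.default_rng(2)
# Toy 32x32 image with repeated structure, so block matching finds
# mutually similar blocks.
img = np.tile(np.outer(np.hanning(8), np.hanning(8)), (4, 4))
noisy = img + rng.normal(0, 0.1, img.shape)
B, K, sigma = 8, 8, 0.1                  # block size, group size, noise std

coords = [(i, j) for i in range(0, 25, 4) for j in range(0, 25, 4)]
blocks = np.array([noisy[i:i+B, j:j+B] for i, j in coords])
est = np.zeros_like(noisy); wgt = np.zeros_like(noisy)

for r in range(len(coords)):
    # Grouping: the K blocks most similar to the reference block r.
    d = ((blocks - blocks[r]) ** 2).sum(axis=(1, 2))
    idx = np.argsort(d)[:K]
    group = blocks[idx]                  # (K, B, B) stack
    # Collaborative filtering: joint 3-D DCT + hard thresholding.
    spec = dctn(group, norm='ortho')
    spec[np.abs(spec) < 3 * sigma] = 0
    filt = idctn(spec, norm='ortho')
    for g, b in enumerate(idx):          # aggregate estimates in place
        i, j = coords[b]
        est[i:i+B, j:j+B] += filt[g]; wgt[i:i+B, j:j+B] += 1

est /= wgt
print(f"noisy MSE {np.mean((noisy - img) ** 2):.4f} -> "
      f"filtered MSE {np.mean((est - img) ** 2):.4f}")
```

BM4D replaces the 2-D blocks with 3-D cubes of voxels and the 3-D group transform with a four-dimensional one; the grouping/threshold/aggregate structure is the same.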
Symmetrizing Smoothing Filters (2013)
"... We study a general class of nonlinear and shiftvarying smoothing filters that operate based on averaging. This important class of filters includes many wellknown examples such as the bilateral filter, nonlocal means, general adaptive moving average filters, and more. (Many linear filters such as ..."
Abstract

Cited by 9 (6 self)
We study a general class of nonlinear and shift-varying smoothing filters that operate based on averaging. This important class of filters includes many well-known examples such as the bilateral filter, non-local means, general adaptive moving average filters, and more. (Many linear filters such as linear minimum mean-squared error smoothing filters, Savitzky–Golay filters, smoothing splines, and wavelet smoothers can be considered special cases.) They are frequently used in both signal and image processing as they are elegant, computationally simple, and high performing. The operators that implement such filters, however, are not symmetric in general. The main contribution of this paper is to provide a provably stable method for symmetrizing the smoothing operators. Specifically, we propose a novel approximation of smoothing operators by symmetric doubly stochastic matrices and show that this approximation is stable and accurate, even more so in higher dimensions. We demonstrate that there are several important advantages to this symmetrization, particularly in image processing/filtering applications such as denoising. In particular, (1) doubly stochastic filters generally lead to improved performance over the baseline smoothing procedure; (2) when the filters are applied iteratively, the symmetric ones can be guaranteed to lead to stable algorithms; and (3) symmetric smoothers allow an orthonormal eigendecomposition, which enables us to peer into the complex behavior of such nonlinear and shift-varying filters in a locally adapted basis using principal components. Finally, a doubly stochastic filter has a simple and intuitive interpretation: it implies the very natural property that every pixel in the given input image has the same sum total contribution to the output image.
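One standard route to a symmetric doubly stochastic approximation is Sinkhorn balancing of the symmetric, non-negative affinity kernel. The sketch below is our own illustration of that idea on a 1-D bilateral-style smoother (the bandwidths and iteration count are our choices), not the paper's exact algorithm or its stability analysis:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50
x = np.cumsum(rng.normal(0, 0.3, n))     # toy 1-D signal (random walk)
pos = np.arange(n)

# Bilateral-style affinity kernel: symmetric and non-negative, combining
# spatial proximity and signal similarity.
K = np.exp(-(pos[:, None] - pos[None, :]) ** 2 / 200.0
           - (x[:, None] - x[None, :]) ** 2 / 4.0)

# The usual smoother normalizes rows only: row-stochastic, not symmetric.
W_row = K / K.sum(axis=1, keepdims=True)

# Sinkhorn balancing: alternately normalize rows and columns.  For a
# symmetric positive kernel this converges to a doubly stochastic matrix.
W = K.copy()
for _ in range(500):
    W /= W.sum(axis=1, keepdims=True)
    W /= W.sum(axis=0, keepdims=True)
W = 0.5 * (W + W.T)                      # remove tiny residual asymmetry

y = W @ x                                # symmetrized smoothing of the signal
print("symmetric:", np.allclose(W, W.T),
      " row sums ~1:", np.allclose(W.sum(axis=1), 1, atol=1e-2))
```

Because `W` is symmetric, it admits the orthonormal eigendecomposition mentioned in the abstract, and repeated application `W @ W @ ... @ x` stays stable since its eigenvalues lie in [-1, 1].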
Combining the Power of Internal and External Denoising
"... Image denoising methods can broadly be classified into two types: “Internal Denoising ” (denoising an image patch using other noisy patches within the noisy image), and “External Denoising ” (denoising a patch using external clean natural image patches). Any such method, whether Internal or External ..."
Abstract

Cited by 9 (1 self)
Image denoising methods can broadly be classified into two types: “Internal Denoising” (denoising an image patch using other noisy patches within the noisy image) and “External Denoising” (denoising a patch using external clean natural image patches). Any such method, whether Internal or External, is typically applied to all image patches. In this paper we show that different image patches inherently have different preferences for Internal or External denoising. Moreover, and surprisingly, the higher the noise in the image, the stronger the preference for Internal Denoising. We identify and explain the source of this behavior, and show that the Internal/External preference of a patch is directly related to its individual Signal-to-Noise Ratio (“PatchSNR”). Patches with high PatchSNR (e.g., patches on strong edges) benefit greatly from External Denoising, whereas patches with low PatchSNR (e.g., patches in noisy uniform regions) benefit much more from Internal Denoising. Combining the power of Internal and External denoising selectively for each patch, based on its estimated PatchSNR, leads to improved denoising performance.
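A minimal sketch of the PatchSNR idea: estimate the per-patch signal variance by subtracting the (known) noise variance, and route the patch accordingly. The toy image, the estimator details, and the routing threshold below are our own illustrative choices, not the paper's exact procedure:

```python
import numpy as np

rng = np.random.default_rng(4)
sigma = 0.2
# Toy image: a flat region (left) next to a strong vertical edge (right).
img = np.zeros((16, 32)); img[:, 16:] = 1.0
noisy = img + rng.normal(0, sigma, img.shape)

def patch_snr(patch, sigma):
    # Estimated signal std: noisy patch variance minus the noise variance.
    sig_var = max(np.var(patch) - sigma ** 2, 0.0)
    return np.sqrt(sig_var) / sigma

# Hypothetical routing rule (the threshold of 1.0 is our assumption):
# high PatchSNR -> external denoising, low PatchSNR -> internal denoising.
for name, patch in [("flat", noisy[4:12, 2:10]), ("edge", noisy[4:12, 12:20])]:
    snr = patch_snr(patch, sigma)
    route = "external" if snr > 1.0 else "internal"
    print(f"{name} patch: PatchSNR = {snr:.2f} -> {route}")
```

The flat patch's variance is almost entirely noise, so its PatchSNR is near zero and it is routed to internal denoising; the edge patch carries strong signal variance on top of the noise and is routed externally.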
How well do filter-based MRFs model natural images? (2012)
In: DAGM/OAGM Symposium
"... Abstract. Markov random fields (MRFs) have found widespread use as models of natural image and scene statistics. Despite progress in modeling image properties beyond gradient statistics with highorder cliques, and learning image models from example data, existing MRFs only exhibit a limited ability ..."
Abstract

Cited by 8 (4 self)
Markov random fields (MRFs) have found widespread use as models of natural image and scene statistics. Despite progress in modeling image properties beyond gradient statistics with high-order cliques, and in learning image models from example data, existing MRFs exhibit only a limited ability to actually capture natural image statistics. In this paper we investigate this limitation of previous filter-based MRF models, which appears to contradict their maximum-entropy interpretation. We argue that it is due to inadequacies in the learning procedure and suggest various modifications to address them. We demonstrate that the proposed learning scheme allows training more suitable potential functions, whose shape approaches that of a Dirac delta function, as well as models with larger and more filters. Our experiments not only indicate a substantial improvement of the models’ ability to capture relevant statistical properties of natural images, but also demonstrate a significant performance increase in a denoising application, to levels previously unattained by generative approaches.
PatchMatch: A Fast Randomized Matching Algorithm with Application to Image and Video (2011)
"... This thesis presents a novel fast randomized matching algorithm for finding correspondences between small local regions of images. We also explore a wide variety of applications of this new fast randomized matching technique. The core matching algorithm, which we call PatchMatch, can find similar re ..."
Abstract

Cited by 7 (1 self)
This thesis presents a novel fast randomized matching algorithm for finding correspondences between small local regions of images. We also explore a wide variety of applications of this new fast randomized matching technique. The core matching algorithm, which we call PatchMatch, can find similar regions or “patches” of an image one to two orders of magnitude faster than previous techniques. The algorithm is motivated by statistical properties of nearest neighbors in natural images. We observe that neighboring correspondences tend to be similar or “coherent” and use this observation in our algorithm in order to quickly converge to an approximate solution. Our algorithm in its most general form can find k-nearest-neighbor matchings, using patches that translate, rotate, or scale, using arbitrary descriptors, and between two or more images. Speedups are obtained over alternative techniques in a number of these areas. We analyze convergence both empirically and theoretically for many of these image matching algorithms. We have explored many applications of this matching algorithm. In computer graphics, we have ...
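The core loop (random initialization, propagation of neighboring offsets, and an exponentially shrinking random search) can be sketched for the simplest translation-only, single-nearest-neighbor case. This toy NumPy version is our own and omits the generalizations and optimizations the thesis describes:

```python
import numpy as np

rng = np.random.default_rng(5)
P = 4                                    # patch size
full = rng.random((24, 24))
A = full[:20, :20]                       # image A
B = full[2:22, 2:22]                     # image B: A shifted by (-2, -2)
n = 20 - P + 1                           # valid patch positions per axis

def dist(ai, aj, bi, bj):
    return np.sum((A[ai:ai+P, aj:aj+P] - B[bi:bi+P, bj:bj+P]) ** 2)

# Random initialization of the nearest-neighbor field (NNF).
nnf = rng.integers(0, n, size=(n, n, 2))
cost = np.array([[dist(i, j, *nnf[i, j]) for j in range(n)] for i in range(n)])

for it in range(4):                      # alternate scan direction each pass
    step = 1 if it % 2 == 0 else -1
    order = range(n) if step == 1 else range(n - 1, -1, -1)
    for i in order:
        for j in order:
            # Propagation: try a just-visited neighbor's offset.
            for di, dj in [(-step, 0), (0, -step)]:
                ni, nj = i + di, j + dj
                if 0 <= ni < n and 0 <= nj < n:
                    bi = int(np.clip(nnf[ni, nj, 0] - di, 0, n - 1))
                    bj = int(np.clip(nnf[ni, nj, 1] - dj, 0, n - 1))
                    c = dist(i, j, bi, bj)
                    if c < cost[i, j]:
                        nnf[i, j] = (bi, bj); cost[i, j] = c
            # Random search in windows of exponentially shrinking radius.
            r = n
            while r >= 1:
                bi = int(np.clip(nnf[i, j, 0] + rng.integers(-r, r + 1), 0, n - 1))
                bj = int(np.clip(nnf[i, j, 1] + rng.integers(-r, r + 1), 0, n - 1))
                c = dist(i, j, bi, bj)
                if c < cost[i, j]:
                    nnf[i, j] = (bi, bj); cost[i, j] = c
                r //= 2

print(f"mean patch distance after PatchMatch: {cost.mean():.3f}")
```

Because `B` is a shifted copy of `A`, exact matches exist for most positions; once the random search finds one, the coherence-exploiting propagation step spreads it across the field within a couple of alternating-direction passes.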
An analysis and implementation of the BM3D image denoising method (2012)
Image Processing On Line
"... ISSN 2105–1232 c © 2012 IPOL & the authors CC–BY–NC–SA This article is available online with supplementary materials, software, datasets and online demo at ..."
Abstract

Cited by 7 (1 self)
ISSN 2105–1232 © 2012 IPOL & the authors, CC-BY-NC-SA. This article is available online with supplementary materials, software, datasets, and an online demo.
Image denoising with multi-layer perceptrons, part 2: training trade-offs and analysis of their mechanisms (2012)
Journal of Machine Learning Research (JMLR)
"... Image denoising can be described as the problem of mapping from a noisy image to a noisefree image. In Burger et al. (2012), we show that multilayer perceptrons can achieve outstanding image denoising performance for various types of noise (additive white Gaussian noise, mixed PoissonGaussian noi ..."
Abstract

Cited by 6 (3 self)
Image denoising can be described as the problem of mapping from a noisy image to a noise-free image. In Burger et al. (2012), we show that multi-layer perceptrons can achieve outstanding image denoising performance for various types of noise (additive white Gaussian noise, mixed Poisson-Gaussian noise, JPEG artifacts, salt-and-pepper noise, and noise resembling stripes). In this work we discuss in detail which trade-offs have to be considered during the training procedure. We will show how to achieve good results and which pitfalls to avoid. By analysing the activation patterns of the hidden units, we are able to make observations regarding the functioning principle of multi-layer perceptrons trained for image denoising.
Separating Signal from Noise using Patch Recurrence Across Scales
"... Recurrence of small clean image patches across different scales of a natural image has been successfully used for solving illposed problems in clean images (e.g., superresolution from a single image). In this paper we show how this multiscale property can be extended to solve illposed problems un ..."
Abstract

Cited by 4 (0 self)
Recurrence of small clean image patches across different scales of a natural image has been successfully used for solving ill-posed problems in clean images (e.g., super-resolution from a single image). In this paper we show how this multi-scale property can be extended to solve ill-posed problems under noisy conditions, such as image denoising. While clean patches are obscured by severe noise in the original scale of a noisy image, noise levels drop dramatically at coarser image scales. This allows the unknown hidden clean patches to “naturally emerge” in some coarser scale of the noisy image. We further show that patch recurrence across scales is strengthened when using directional pyramids (which blur and subsample only in one direction). Our statistical experiments show that for almost any noisy image patch (more than 99%), there exists a “good” clean version of itself at the same relative image coordinates in some coarser scale of the image. This is a strong phenomenon of noise-contaminated natural images, which can serve as a strong prior for separating the signal from the noise. Finally, incorporating this multi-scale prior into a simple denoising algorithm yields state-of-the-art denoising results.
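The claim that noise levels drop dramatically at coarser scales is easy to check numerically. A toy sketch (our own, using plain box-filter blur-and-subsample rather than the directional pyramids used in the paper):

```python
import numpy as np

rng = np.random.default_rng(6)
x = np.linspace(0, 4 * np.pi, 256)
clean = np.outer(np.sin(x), np.cos(x))           # smooth toy "clean" image
noisy = clean + rng.normal(0, 1.0, clean.shape)  # severe AWGN, sigma = 1

def downscale(im):
    # Blur and subsample by 2 using 2x2 box averaging; averaging 4 i.i.d.
    # noise samples halves the noise std at every level.
    return 0.25 * (im[::2, ::2] + im[1::2, ::2] + im[::2, 1::2] + im[1::2, 1::2])

level, n, c = 0, noisy, clean
while n.shape[0] >= 32:
    print(f"scale {level}: residual noise std ~ {np.std(n - c):.2f}")
    n, c, level = downscale(n), downscale(c), level + 1
```

Each level roughly halves the residual noise std, which is why clean patches can "naturally emerge" at coarser scales of the noisy image.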