Results 1–10 of 368
Nonlinear total variation based noise removal algorithms
, 1992
"... A constrained optimization type of numerical algorithm for removing noise from images is presented. The total variation of the image is minimized subject to constraints involving the statistics of the noise. The constraints are imposed using Lagrange multipliers. The solution is obtained using the g ..."
Abstract

Cited by 2271 (51 self)
 Add to MetaCart
A constrained optimization type of numerical algorithm for removing noise from images is presented. The total variation of the image is minimized subject to constraints involving the statistics of the noise. The constraints are imposed using Lagrange multipliers. The solution is obtained using the gradient-projection method. This amounts to solving a time-dependent partial differential equation on a manifold determined by the constraints. As t → ∞ the solution converges to a steady state which is the denoised image. The numerical algorithm is simple and relatively fast. The results appear to be state-of-the-art for very noisy images. The method is noninvasive, yielding sharp edges in the image. The technique could be interpreted as a first step of moving each level set of the image normal to itself with velocity equal to the curvature of the level set divided by the magnitude of the gradient of the image, and a second step which projects the image back onto the constraint set.
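The evolution described in this abstract can be sketched numerically. Below is a minimal gradient-descent implementation of a smoothed ROF-type energy; it is only a sketch under simplifying assumptions: a fixed fidelity weight `lam` replaces the paper's Lagrange-multiplier update, and a small `eps` smooths the total variation term. Parameter names and boundary handling are illustrative, not the authors'.

```python
import numpy as np

def tv_denoise(f, lam=1.0, step=0.1, iters=200, eps=1e-2):
    """Gradient descent on the smoothed ROF-type energy
       E(u) = sum sqrt(|grad u|^2 + eps) + (lam/2) * sum (u - f)^2."""
    u = f.copy()
    for _ in range(iters):
        # forward differences (replicated at the boundary)
        ux = np.diff(u, axis=1, append=u[:, -1:])
        uy = np.diff(u, axis=0, append=u[-1:, :])
        mag = np.sqrt(ux ** 2 + uy ** 2 + eps)
        px, py = ux / mag, uy / mag            # normalized gradient field
        # approximate divergence via backward differences
        div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))
        u = u + step * (div - lam * (u - f))   # curvature flow + fidelity pull
    return u
```

The `div` term moves each level set by its curvature scaled by the inverse gradient magnitude, exactly the interpretation the abstract gives; the `lam * (u - f)` term plays the role of the projection back toward the constraint set.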
Robust Anisotropic Diffusion
, 1998
"... Relations between anisotropic diffusion and robust statistics are described in this paper. Specifically, we show that anisotropic diffusion can be seen as a robust estimation procedure that estimates a piecewise smooth image from a noisy input image. The "edgestopping" function in the ani ..."
Abstract

Cited by 361 (17 self)
 Add to MetaCart
(Show Context)
Relations between anisotropic diffusion and robust statistics are described in this paper. Specifically, we show that anisotropic diffusion can be seen as a robust estimation procedure that estimates a piecewise smooth image from a noisy input image. The "edge-stopping" function in the anisotropic diffusion equation is closely related to the error norm and influence function in the robust estimation framework. This connection leads to a new "edge-stopping" function based on Tukey's biweight robust estimator, which preserves sharper boundaries than previous formulations and improves the automatic stopping of the diffusion. The robust statistical interpretation also provides a means for detecting the boundaries (edges) between the piecewise smooth regions in an image that has been smoothed with anisotropic diffusion. Additionally, we derive a relationship between anisotropic diffusion and regularization with line processes. Adding constraints on the spatial organization of the ...
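A minimal sketch of this idea, assuming a 4-neighbour discretization and a Tukey biweight edge-stopping function; the scale `sigma` and step size `lam` here are illustrative choices, not the paper's tuning.

```python
import numpy as np

def tukey_g(x, sigma):
    # Tukey biweight edge-stopping function: exactly zero for |x| > sigma,
    # so diffusion across strong edges stops completely (unlike Perona-Malik).
    return np.where(np.abs(x) <= sigma, (1.0 - (x / sigma) ** 2) ** 2, 0.0)

def robust_diffusion(img, sigma=0.5, lam=0.2, iters=50):
    """Discrete anisotropic diffusion: u += lam * sum_d g(du) * du over the
    four neighbour differences du (replicated boundary)."""
    u = img.astype(float).copy()
    for _ in range(iters):
        p = np.pad(u, 1, mode='edge')
        diffs = (p[:-2, 1:-1] - u,   # north neighbour difference
                 p[2:, 1:-1] - u,    # south
                 p[1:-1, :-2] - u,   # west
                 p[1:-1, 2:] - u)    # east
        u = u + lam * sum(tukey_g(d, sigma) * d for d in diffs)
    return u
```

Because `tukey_g` vanishes outside `[-sigma, sigma]`, pixels whose neighbour differences exceed `sigma` exchange no flux at all, which is the "sharper boundaries and automatic stopping" behaviour the abstract describes.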
Removing camera shake from a single photograph
 ACM Trans. Graph
, 2006
"... Camera shake during exposure leads to objectionable image blur and ruins many photographs. Conventional blind deconvolution methods typically assume frequencydomain constraints on images, or overly simplified parametric forms for the motion path during camera shake. Real camera motions can follow c ..."
Abstract

Cited by 325 (16 self)
 Add to MetaCart
Camera shake during exposure leads to objectionable image blur and ruins many photographs. Conventional blind deconvolution methods typically assume frequency-domain constraints on images, or overly simplified parametric forms for the motion path during camera shake. Real camera motions can follow convoluted paths, and a spatial-domain prior can better maintain visually salient image characteristics. We introduce a method to remove the effects of camera shake from seriously blurred images. The method assumes a uniform camera blur over the image and negligible in-plane camera rotation. In order to estimate the blur from the camera shake, the user must specify an image region without saturation effects. We show results for a variety of digital photographs taken from personal photo collections.
Deterministic edge-preserving regularization in computed imaging
 IEEE Trans. Image Processing
, 1997
"... Abstract—Many image processing problems are ill posed and must be regularized. Usually, a roughness penalty is imposed on the solution. The difficulty is to avoid the smoothing of edges, which are very important attributes of the image. In this paper, we first give conditions for the design of such ..."
Abstract

Cited by 311 (27 self)
 Add to MetaCart
(Show Context)
Many image processing problems are ill posed and must be regularized. Usually, a roughness penalty is imposed on the solution. The difficulty is to avoid the smoothing of edges, which are very important attributes of the image. In this paper, we first give conditions for the design of such an edge-preserving regularization. Under these conditions, we show that it is possible to introduce an auxiliary variable whose role is twofold. First, it marks the discontinuities and ensures their preservation from smoothing. Second, it makes the criterion half-quadratic. The optimization is then easier. We propose a deterministic strategy, based on alternate minimizations on the image and the auxiliary variable. This leads to the definition of an original reconstruction algorithm, called ARTUR. Some theoretical properties of ARTUR are discussed. Experimental results illustrate the behavior of the algorithm. These results are shown in the field of tomography, but the method can be applied in a large number of applications in image processing.
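In symbols, the construction described above can be sketched as follows (generic notation, not necessarily the paper's: y is the data, H the observation operator, D a difference operator, b the auxiliary variable):

```latex
J(f) \;=\; \|y - Hf\|^2 \;+\; \lambda^2 \sum_i \varphi\big((Df)_i\big),
\qquad
\varphi(t) \;=\; \inf_{b>0}\,\big(b\,t^2 + \psi(b)\big).
```

For fixed $b$ the augmented criterion $J^*(f,b) = \|y - Hf\|^2 + \lambda^2 \sum_i \big(b_i (Df)_i^2 + \psi(b_i)\big)$ is quadratic in $f$, which is what makes alternate minimization over $f$ and $b$ tractable; the $b_i$ driven toward zero mark the discontinuities and shut off smoothing across them.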
A generalized Gaussian image model for edge-preserving MAP estimation
 IEEE Trans. on Image Processing
, 1993
"... Absfrucf We present a Markov random field model which allows realistic edge modeling while providing stable maximum a posteriori MAP solutions. The proposed model, which we refer to as a generalized Gaussian Markov random field (GGMRF), is named for its similarity to the generalized Gaussian distri ..."
Abstract

Cited by 301 (37 self)
 Add to MetaCart
(Show Context)
We present a Markov random field model which allows realistic edge modeling while providing stable maximum a posteriori (MAP) solutions. The proposed model, which we refer to as a generalized Gaussian Markov random field (GGMRF), is named for its similarity to the generalized Gaussian distribution used in robust detection and estimation. The model satisfies several desirable analytical and computational properties for MAP estimation, including continuous dependence of the estimate on the data, invariance of the character of solutions to scaling of data, and a solution which lies at the unique global minimum of the a posteriori log-likelihood function. The GGMRF is demonstrated to be useful for image reconstruction in low-dosage transmission tomography.
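A toy 1-D version of MAP estimation under such a prior can be sketched as follows, assuming a Gaussian likelihood, nearest-neighbour cliques, and plain gradient descent (for 1 < p ≤ 2 the potential is differentiable); the parameter values are illustrative, not the paper's.

```python
import numpy as np

def ggmrf_map_1d(y, p=1.5, lam=0.5, step=0.05, iters=300):
    """Gradient descent on the negative log-posterior
       0.5 * ||x - y||^2 + lam * sum_i |x[i+1] - x[i]|^p,   1 < p <= 2.
    For p near 1 the prior behaves like total variation and preserves edges;
    p = 2 reduces to an ordinary Gaussian MRF."""
    x = y.astype(float).copy()
    for _ in range(iters):
        d = np.diff(x)
        g = p * np.sign(d) * np.abs(d) ** (p - 1)   # derivative of |t|^p
        prior = np.zeros_like(x)
        prior[:-1] -= g          # each clique |x[i+1]-x[i]|^p touches two sites
        prior[1:] += g
        x = x - step * ((x - y) + lam * prior)
    return x
```

The exponent `p` interpolates continuously between Laplacian-like (edge-preserving) and Gaussian (smooth) behaviour, which is the "realistic edge modeling with stable solutions" trade-off the abstract highlights.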
Fields of experts: A framework for learning image priors
 In CVPR
, 2005
"... We develop a framework for learning generic, expressive image priors that capture the statistics of natural scenes and can be used for a variety of machine vision tasks. The approach extends traditional Markov Random Field (MRF) models by learning potential functions over extended pixel neighborhood ..."
Abstract

Cited by 292 (4 self)
 Add to MetaCart
(Show Context)
We develop a framework for learning generic, expressive image priors that capture the statistics of natural scenes and can be used for a variety of machine vision tasks. The approach extends traditional Markov Random Field (MRF) models by learning potential functions over extended pixel neighborhoods. Field potentials are modeled using a Products-of-Experts framework that exploits nonlinear functions of many linear filter responses. In contrast to previous MRF approaches, all parameters, including the linear filters themselves, are learned from training data. We demonstrate the capabilities of this Field of Experts model with two example applications, image denoising and image inpainting, which are implemented using a simple, approximate inference scheme. While the model is trained on a generic image database and is not tuned toward a specific application, we obtain results that compete with and even outperform specialized techniques.
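The model summarized above is commonly written in the following form (a sketch in generic notation; the Student-t expert shape follows the usual Fields-of-Experts presentation and is an assumption here, not a detail stated in this listing):

```latex
p(\mathbf{x}) \;\propto\; \prod_{c} \prod_{k=1}^{K}
  \phi_k\!\big( J_k^{\top} \mathbf{x}_{c} \big),
\qquad
\phi_k(y) \;=\; \Big( 1 + \tfrac{1}{2} y^2 \Big)^{-\alpha_k},
```

where the $\mathbf{x}_c$ are overlapping image patches (the extended cliques), the $J_k$ are the learned linear filters, and the $\alpha_k$ are learned expert parameters — every symbol on the right-hand side is fit to training data, which is the paper's departure from hand-designed MRF potentials.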
On the Unification of Line Processes, Outlier Rejection, and Robust Statistics with Applications in Early Vision
, 1996
"... The modeling of spatial discontinuities for problems such as surface recovery, segmentation, image reconstruction, and optical flow has been intensely studied in computer vision. While "lineprocess" models of discontinuities have received a great deal of attention, there has been recent ..."
Abstract

Cited by 271 (8 self)
 Add to MetaCart
The modeling of spatial discontinuities for problems such as surface recovery, segmentation, image reconstruction, and optical flow has been intensely studied in computer vision. While "line-process" models of discontinuities have received a great deal of attention, there has been recent interest in the use of robust statistical techniques to account for discontinuities. This paper unifies the two approaches. To achieve this we generalize the notion of a "line process" to that of an analog "outlier process" and show how a problem formulated in terms of outlier processes can be viewed in terms of robust statistics. We also characterize a class of robust statistical problems for which an equivalent outlier-process formulation exists and give a straightforward method for converting a robust estimation problem into an outlier-process formulation. We show how prior assumptions about the spatial structure of outliers can be expressed as constraints on the recovered analog outlier processes and how traditional continuation methods can be extended to the explicit outlier-process formulation. These results indicate that the outlier-process approach provides a general framework which subsumes the traditional line-process approaches as well as a wide class of robust estimation problems. Examples in surface reconstruction, image segmentation, and optical flow are presented to illustrate the use of outlier processes and to show how the relationship between outlier processes and robust statistics can be exploited. An appendix provides a catalog of common robust error norms and their equivalent outlier-process formulations.
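The norm-to-outlier-process conversion can be checked numerically. The sketch below uses the Lorentzian norm ρ(x) = log(1 + x²), whose equivalent outlier-process formulation uses the penalty ψ(z) = z − 1 − log z with optimal analog outlier process z* = 1/(1 + x²) (values near 0 flag outliers). This particular pairing is a standard worked example of the technique, chosen here for illustration rather than taken verbatim from the listing.

```python
import numpy as np

def rho(x):
    # Lorentzian robust error norm
    return np.log1p(x ** 2)

def psi(z):
    # penalty on the analog outlier process z in (0, 1]
    return z - 1.0 - np.log(z)

def outlier_form(x, z):
    # equivalent outlier-process objective: quadratic in x once z is fixed
    return z * x ** 2 + psi(z)

x = np.linspace(-3.0, 3.0, 121)
z_star = 1.0 / (1.0 + x ** 2)      # minimizer of outlier_form over z
assert np.allclose(outlier_form(x, z_star), rho(x))
```

Setting the z-derivative of `outlier_form` to zero gives x² + 1 − 1/z = 0, i.e. z* = 1/(1 + x²); substituting back recovers ρ(x) exactly, which is the equivalence the paper generalizes.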
A new alternating minimization algorithm for total variation image reconstruction
 SIAM J. Imaging Sci.
, 2008
"... We propose, analyze and test an alternating minimization algorithm for recovering images from blurry and noisy observations with total variation (TV) regularization. This algorithm arises from a new halfquadratic model applicable to not only the anisotropic but also isotropic forms of total variati ..."
Abstract

Cited by 224 (26 self)
 Add to MetaCart
(Show Context)
We propose, analyze, and test an alternating minimization algorithm for recovering images from blurry and noisy observations with total variation (TV) regularization. This algorithm arises from a new half-quadratic model applicable to not only the anisotropic but also the isotropic forms of total variation discretizations. The per-iteration computational complexity of the algorithm is three fast Fourier transforms (FFTs). We establish strong convergence properties for the algorithm, including finite convergence for some variables and relatively fast exponential (or q-linear, in optimization terminology) convergence for the others. Furthermore, we propose a continuation scheme to accelerate the practical convergence of the algorithm. Extensive numerical results show that our algorithm performs favorably in comparison to several state-of-the-art algorithms. In particular, it runs orders of magnitude faster than the Lagged Diffusivity algorithm for total-variation-based deblurring. Some extensions of our algorithm are also discussed.
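The two alternating steps can be sketched for the simplest case: anisotropic TV denoising with a periodic boundary, where the w-step is a closed-form shrinkage and the u-step is an FFT-based linear solve. Parameter values (`mu`, `beta`) and the fixed-`beta` simplification (the paper uses a continuation scheme on `beta`) are illustrative assumptions.

```python
import numpy as np

def ftvd_denoise(f, mu=8.0, beta=10.0, iters=30):
    """Alternating minimization for the anisotropic-TV denoising model
       min_{u,w} sum_i |w_i| + (beta/2)||w - D u||^2 + (mu/2)||u - f||^2
    with periodic boundary, so each u-step costs only a few FFTs."""
    h, w = f.shape
    dx = np.zeros((h, w)); dx[0, 0] = -1.0; dx[0, 1] = 1.0
    dy = np.zeros((h, w)); dy[0, 0] = -1.0; dy[1, 0] = 1.0
    Kx = np.conj(np.fft.fft2(dx))   # Fourier multiplier of forward x-difference
    Ky = np.conj(np.fft.fft2(dy))
    denom = mu + beta * (np.abs(Kx) ** 2 + np.abs(Ky) ** 2)
    Ff = np.fft.fft2(f)
    u = f.copy()
    for _ in range(iters):
        # w-step: componentwise soft shrinkage with threshold 1/beta
        ux = np.roll(u, -1, axis=1) - u
        uy = np.roll(u, -1, axis=0) - u
        wx = np.sign(ux) * np.maximum(np.abs(ux) - 1.0 / beta, 0.0)
        wy = np.sign(uy) * np.maximum(np.abs(uy) - 1.0 / beta, 0.0)
        # u-step: exact solve of (mu I + beta D^T D) u = mu f + beta D^T w
        num = mu * Ff + beta * (np.conj(Kx) * np.fft.fft2(wx)
                                + np.conj(Ky) * np.fft.fft2(wy))
        u = np.real(np.fft.ifft2(num / denom))
    return u
```

Because the difference operators are circular convolutions, `D^T D` is diagonalized by the 2-D FFT, which is how the per-iteration cost comes down to a handful of FFTs.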
Nonlinear Image Recovery with Half-Quadratic Regularization
, 1993
"... One popular method for the recovery of an ideal intensity image from corrupted or indirect measurements is regularization: minimize an objective function which enforces a roughness penalty in addition to coherence with the data. Linear estimates are relatively easy to compute but generally introduce ..."
Abstract

Cited by 209 (0 self)
 Add to MetaCart
One popular method for the recovery of an ideal intensity image from corrupted or indirect measurements is regularization: minimize an objective function which enforces a roughness penalty in addition to coherence with the data. Linear estimates are relatively easy to compute but generally introduce systematic errors; for example, they are incapable of recovering discontinuities and other important image attributes. In contrast, nonlinear estimates are more accurate, but often far less accessible. This is particularly true when the objective function is nonconvex and the distribution of each data component depends on many image components through a linear operator with broad support. Our approach is based on an auxiliary array and an extended objective function in which the original variables appear quadratically and the auxiliary variables are decoupled. Minimizing over the auxiliary array alone yields the original function, so the original image estimate can be obtained by joint min...
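The auxiliary-array idea can be illustrated in one dimension with the Lorentzian potential ρ(t) = log(1 + t²), for which minimizing the extended objective over the auxiliary weight has the closed form b = 1/(1 + t²). This is a generic half-quadratic sketch under those assumptions, not the paper's exact construction.

```python
import numpy as np

def hq_denoise_1d(f, lam=1.0, iters=20):
    """Joint minimization of the extended objective
       E(u, b) = sum_i [ b_i (u[i+1]-u[i])^2 + psi(b_i) ] + (lam/2)||u - f||^2
    with psi(b) = b - 1 - log(b): minimizing over b alone recovers
    sum_i log(1 + (u[i+1]-u[i])^2), while for fixed b the problem is
    quadratic in u (a small tridiagonal solve)."""
    n = len(f)
    u = f.astype(float).copy()
    for _ in range(iters):
        d = np.diff(u)
        b = 1.0 / (1.0 + d ** 2)      # closed-form b-step; small b marks an edge
        A = lam * np.eye(n)           # normal equations of the quadratic u-step
        for i in range(n - 1):
            A[i, i] += 2 * b[i]; A[i + 1, i + 1] += 2 * b[i]
            A[i, i + 1] -= 2 * b[i]; A[i + 1, i] -= 2 * b[i]
        u = np.linalg.solve(A, lam * f)
    return u
```

Each sweep decreases the original nonconvex objective monotonically, since both partial minimizations are exact — which is the appeal of the decoupled auxiliary array the abstract describes.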
A framework for the robust estimation of optical flow.
 In Proc. ICCV
, 1993
"... ..."
(Show Context)