Results 1–8 of 8
Iterative kernel principal component analysis for image modeling
IEEE Transactions on Pattern Analysis and Machine Intelligence, 2005
Cited by 57 (3 self)
Abstract. In recent years, Kernel Principal Component Analysis (KPCA) has been suggested for various image processing tasks requiring an image model, such as denoising or compression. The original form of KPCA, however, can only be applied to strongly restricted image classes due to the limited number of training examples that can be processed. We therefore propose a new iterative method for performing KPCA, the Kernel Hebbian Algorithm, which iteratively estimates the kernel principal components with only linear-order memory complexity. In our experiments, we compute models for complex image classes such as faces and natural images, which require a large number of training examples. The resulting image models are tested in single-frame super-resolution and denoising applications. The KPCA model is not specifically tailored to these tasks; in fact, the same model can be used in super-resolution with variable input resolution, or in denoising with unknown noise characteristics. In spite of this, both super-resolution and denoising performance are comparable to existing methods.
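The Kernel Hebbian Algorithm named in this abstract can be sketched as a kernelized Oja/GHA update on expansion coefficients. The code below is our own illustration based on the abstract, not the paper's implementation: it precomputes the full kernel matrix for brevity (the paper's point is that a streaming variant keeps memory linear in the number of training examples), and all function and variable names are ours.

```python
import numpy as np

def kha(X, r, kernel, lr=0.05, epochs=100, seed=0):
    # Estimate r kernel principal components as expansion coefficients
    # A (r x n): the i-th component is w_i = sum_j A[i, j] * phi(x_j).
    rng = np.random.default_rng(seed)
    n = len(X)
    # Full kernel matrix, precomputed here only for brevity; a streaming
    # KHA would evaluate kernel rows on the fly to keep memory linear.
    K = np.array([[kernel(a, b) for b in X] for a in X])
    A = rng.normal(scale=1e-2, size=(r, n))
    for _ in range(epochs):
        for t in rng.permutation(n):
            y = A @ K[:, t]                        # projections of phi(x_t)
            A -= lr * np.tril(np.outer(y, y)) @ A  # Oja/GHA-style decay term
            A[:, t] += lr * y                      # Hebbian growth term
    return A
```

With a linear kernel this reduces to ordinary Oja/GHA learning, so the leading learned component should align with the first PCA direction of the data.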
Discretization-invariant Bayesian inversion and Besov space priors, 901
Cited by 9 (1 self)
Abstract. The Bayesian solution of an inverse problem for the indirect measurement M = AU + E is considered, where U is a function on a domain of R^d. Here A is a smoothing linear operator and E is Gaussian white noise. The data is a realization m_k of the random variable M_k = P_k AU + P_k E, where P_k is a linear, finite-dimensional operator related to the measurement device. To allow computerized inversion, the unknown is discretized as U_n = T_n U, where T_n is a finite-dimensional projection, leading to the computational measurement model M_kn = P_k A U_n + P_k E. Bayes' formula then gives the posterior distribution π_kn(u_n | m_kn) ∝ Π_n(u_n) exp(−(1/2) ‖m_kn − P_k A u_n‖²₂) in R^d, and the mean u_kn := ∫ u_n π_kn(u_n | m_kn) du_n is considered as the reconstruction of U. We discuss a systematic way of choosing the prior distributions Π_n for all n ≥ n₀ > 0 by obtaining them as projections of a distribution in an infinite-dimensional limit case. Such a choice of prior distributions is discretization-invariant in the sense that Π_n represents the same a priori information for all n and that the mean u_kn converges to a limit estimate as k, n → ∞. Gaussian smoothness priors and wavelet-based Besov space priors are shown to be discretization-invariant. In particular, Bayesian inversion in dimension two with a B^1_{11} prior is related to penalizing the ℓ¹ norm of the wavelet coefficients of U.
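For the special case of a Gaussian smoothness prior, the posterior described in this abstract is itself Gaussian, so the posterior mean has a closed form. The snippet below is an illustrative sketch in our own notation (A, L, and sigma standing in for the discretized forward operator, the prior's smoothness operator, and the noise level), not code from the paper; the Besov/ℓ¹ case has no such closed form and requires iterative optimization.

```python
import numpy as np

def posterior_mean(A, m, L, sigma):
    # Model: m = A u + e with e ~ N(0, sigma^2 I) and Gaussian prior
    # proportional to exp(-0.5 * ||L u||^2).  The posterior is Gaussian,
    # and its mean solves the linear system
    #   (A^T A / sigma^2 + L^T L) u = A^T m / sigma^2.
    lhs = A.T @ A / sigma ** 2 + L.T @ L
    rhs = A.T @ m / sigma ** 2
    return np.linalg.solve(lhs, rhs)
```

As a sanity check, when A is the identity and the noise level is small, the posterior mean should essentially reproduce the data.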
Supervised Classification for Textured Images, 2002
Cited by 3 (1 self)
In this report, we present a supervised classification model based on a variational approach. The model is specifically devoted to textured images. We want to obtain an optimal partition of an image composed of textures separated by regular interfaces. To reach this goal, we represent the regions defined by the classes, as well as their interfaces, by level set functions. We define a functional on these level sets whose minimizers define an optimal partition. In particular, this functional contains a data term specific to textures. We use a wavelet packet transform to analyze the textures, which are characterized by their energy distribution in each subband of the decomposition. The partial differential equations (PDEs) related to the minimization of the functional are embedded in a dynamical scheme. Given an initial interface set (zero level set), the different terms of the PDEs govern the motion of the interfaces such that, at convergence, we obtain an optimal partition as defined above. Each interface is guided by external forces (regularity of the interface) and internal ones (data term and partition constraints). We have conducted several experiments on both synthetic and real images.
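As a toy illustration of the subband-energy texture features mentioned in this abstract, the sketch below computes a one-level 2-D Haar decomposition and the energy of each of the four subbands. The paper uses a full wavelet packet transform; this simplified stand-in, with names of our own choosing, only shows the idea.

```python
import numpy as np

def haar_subband_energies(img):
    # One-level 2-D Haar transform: low/high-pass the rows, then the
    # columns, giving the LL, LH, HL, HH subbands; return the energy
    # (sum of squared coefficients) of each subband.
    a, b = img[0::2, :], img[1::2, :]
    lo, hi = (a + b) / np.sqrt(2), (a - b) / np.sqrt(2)
    def cols(x):
        c, d = x[:, 0::2], x[:, 1::2]
        return (c + d) / np.sqrt(2), (c - d) / np.sqrt(2)
    LL, LH = cols(lo)
    HL, HH = cols(hi)
    return {name: float(np.sum(s ** 2))
            for name, s in [("LL", LL), ("LH", LH), ("HL", HL), ("HH", HH)]}
```

Because the transform is orthonormal, the subband energies sum to the total energy of the image (Parseval), and a constant image puts all of its energy in the LL band.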
On the Equivalence between Set-Theoretic and Maxent MAP Estimation, 2002
In this paper, we establish an equivalence between two conceptually different methods of signal estimation under modeling uncertainty, viz. set-theoretic estimation and maximum-entropy (maxent) MAP estimation. The first method assumes constraints on the signal to be estimated; the second assumes constraints on a probability distribution for the signal. We provide broad conditions under which the two aforementioned estimation paradigms produce the same signal estimate. We also show how the maxent formalism can be used to provide solutions to three important problems: how to select the sizes of constraint sets in set-theoretic estimation (the analysis highlights the role of shrinkage); how to choose the values of parameters in regularized restoration when using multiple regularization functionals; and how to trade off model complexity and goodness of fit in a model selection problem.
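Set-theoretic estimation, as discussed in this abstract, seeks a signal consistent with all constraint sets; a classic computational tool for this is the method of projections onto convex sets (POCS). The following generic sketch is our own illustration of that tool, not code from the paper.

```python
import numpy as np

def pocs(x0, projections, iters=200):
    # Cyclically apply each convex-set projection; when the intersection
    # of the sets is nonempty, the iterates converge to a point in it.
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        for proj in projections:
            x = proj(x)
    return x

# Example constraint sets: the nonnegative orthant and the hyperplane
# sum(x) = 1 (both convex, with easy closed-form projections).
proj_nonneg = lambda x: np.maximum(x, 0.0)
proj_plane = lambda x: x + (1.0 - x.sum()) / x.size
```

Starting from an infeasible point, the iterates settle into the intersection of the two sets.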
Received (Day Month Year)
Image restoration problems can naturally be cast as constrained convex programming problems in which the constraints arise from a priori information and the observation of signals physically related to the image to be recovered. In this paper, the focus is placed on the construction of constraints based on wavelet representations. Using a mix of statistical and convex-analytical tools, we propose a general framework to construct wavelet-based constraints. The resulting optimization problem is then solved with a block-iterative parallel algorithm which offers great flexibility in terms of implementation. Numerical results illustrate an application of the proposed framework.
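As a much-simplified illustration of the parallel, block-iterative flavor mentioned in this abstract, the sketch below averages the results of independent constraint projections at each step (a Cimmino-type scheme). It is our own toy example, not the paper's algorithm, and all names are ours.

```python
import numpy as np

def parallel_step(x, projections, weights):
    # One Cimmino-type step: apply every projection to the current point
    # independently (in parallel, in principle) and take the weighted
    # average of the results.
    return sum(w * proj(x) for w, proj in zip(weights, projections))

def parallel_projections(x0, projections, weights, iters=500):
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = parallel_step(x, projections, weights)
    return x
```

When the constraint sets have a nonempty intersection, the averaged iterates converge to a feasible point, just as sequential POCS does, but each step's projections can be computed concurrently.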
Jean-François Aujol
Wavelet-based level set evolution for classification of textured images