Results 1-10 of 89
A double Metropolis-Hastings sampler for spatial models with intractable normalizing constants
 Journal of Statistical Computation and Simulation
Abstract
Cited by 20 (3 self)
The problem of simulating from distributions with intractable normalizing constants has received much attention in the recent literature. In this paper, we propose an asymptotic algorithm, the so-called double Metropolis-Hastings (MH) sampler, for tackling this problem. Unlike other auxiliary variable algorithms, the double MH sampler removes the need for exact sampling, the auxiliary variables being generated using MH kernels, and it can thus be applied to a wide range of problems for which exact sampling is not available. For problems where exact sampling is available, it typically produces results as accurate as those of the exchange algorithm, but using much less CPU time. The new method is illustrated by various spatial models.
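The double MH idea can be sketched on a toy model. The following is a minimal illustration, not the paper's implementation: the data are a hypothetical one-dimensional chain of ±1 spins with unnormalized likelihood g(x|θ), the auxiliary variable y is produced by a short inner MH run started at the observed data, and a flat prior with a symmetric proposal is assumed, so the intractable normalizing constants cancel in the acceptance ratio.

```python
import numpy as np

rng = np.random.default_rng(0)

def g(x, theta):
    """Unnormalized likelihood of a toy 1-D chain of +/-1 spins (coupling theta)."""
    return np.exp(theta * np.sum(x[:-1] * x[1:]))

def inner_mh(x0, theta, n_steps=50):
    """Approximate draw from f(.|theta) via single-site MH flips; replaces exact sampling."""
    x = x0.copy()
    for _ in range(n_steps):
        i = rng.integers(len(x))
        y = x.copy()
        y[i] = -y[i]
        if rng.random() < min(1.0, g(y, theta) / g(x, theta)):
            x = y
    return x

def double_mh(x_obs, theta0=0.0, n_iter=2000, step=0.3):
    """Double MH sampler for theta (flat prior, symmetric random-walk proposal)."""
    theta, chain = theta0, []
    for _ in range(n_iter):
        theta_p = theta + step * rng.normal()
        y = inner_mh(x_obs, theta_p)  # auxiliary data generated at the proposed theta
        # the intractable normalizing constants Z(theta) and Z(theta_p) cancel here
        r = (g(x_obs, theta_p) * g(y, theta)) / (g(x_obs, theta) * g(y, theta_p))
        if rng.random() < min(1.0, r):
            theta = theta_p
        chain.append(theta)
    return np.array(chain)

x_obs = np.array([1, 1, 1, -1, -1, 1, 1, 1])  # hypothetical observed spins
theta_chain = double_mh(x_obs)
```

With exact sampling in place of `inner_mh`, this reduces to the exchange algorithm; the inner chain length trades accuracy for CPU time.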
Bayesian Parameter Estimation for Latent Markov Random Fields and Social Networks
 Journal of Computational and Graphical Statistics
, 2012
Bayesian inference in hidden Markov random fields for binary data defined on large lattices
, 2005
Abstract
Cited by 18 (4 self)
The aim of this paper is to introduce approximate methods to compute the likelihood for large lattices based on exact likelihood calculations for smaller lattices. We introduce approximate likelihood methods by relaxing some of the dependencies in the latent model, and also by approximating the likelihood by a partially ordered Markov model defined on a collection of sublattices. Results are presented based on simulated data, as well as inference for the temporal-spatial structure of the interaction between up- and down-regulated states within the mitochondrial chromosome of the Plasmodium falciparum organism.
Bayesian inference for exponential random graph models
 Social Networks
, 2011
Abstract
Cited by 18 (4 self)
Exponential random graph models are extremely difficult models to handle from a statistical viewpoint, since their normalising constant, which depends on the model parameters, is available only in very trivial cases. We show how inference can be carried out in a Bayesian framework using an MCMC algorithm that circumvents the need to calculate the normalising constants. We use a population MCMC approach, which accelerates convergence and improves mixing of the Markov chain. This approach improves performance with respect to the Monte Carlo maximum likelihood method of Geyer and Thompson (1992).
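The population MCMC idea can be illustrated in isolation (a sketch, not the authors' algorithm): run parallel chains at different temperatures on a toy bimodal target and propose swaps between adjacent chains, so that the hot chains, which cross between modes easily, feed those states to the cold chain and improve its mixing.

```python
import numpy as np

rng = np.random.default_rng(2)

def log_target(x):
    """Toy bimodal target (log of an unnormalized density): mixture of N(-3,1) and N(3,1)."""
    return np.logaddexp(-0.5 * (x + 3.0) ** 2, -0.5 * (x - 3.0) ** 2)

def population_mcmc(n_iter=5000, temps=(1.0, 0.5, 0.2), step=1.0):
    """Parallel chains at decreasing temperatures; swap moves carry modes to the cold chain."""
    xs = np.zeros(len(temps))
    cold = []
    for _ in range(n_iter):
        # local random-walk MH update in each chain, tempered by its temperature
        for k, t in enumerate(temps):
            prop = xs[k] + step * rng.normal()
            if np.log(rng.random()) < t * (log_target(prop) - log_target(xs[k])):
                xs[k] = prop
        # propose exchanging the states of a random pair of adjacent chains
        k = rng.integers(len(temps) - 1)
        log_accept = (temps[k] - temps[k + 1]) * (log_target(xs[k + 1]) - log_target(xs[k]))
        if np.log(rng.random()) < log_accept:
            xs[k], xs[k + 1] = xs[k + 1], xs[k]
        cold.append(xs[0])
    return np.array(cold)

cold_chain = population_mcmc()
```

A single random-walk chain at temperature 1 would rarely cross the valley between the two modes; the tempered population visits both.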
Variational Bayes for estimating the parameters of a hidden Potts model
 Stat. Comput
, 2009
Abstract
Cited by 18 (0 self)
Hidden Markov random field models provide an appealing representation of images and other spatial problems. The drawback is that inference is not straightforward for these models as the normalisation constant for the likelihood is generally intractable except for very small observation sets. Variational methods are an emerging tool for Bayesian inference and they have already been successfully applied in other contexts. Focusing on the particular case of a hidden Potts model with Gaussian noise, we show how variational Bayesian methods can be applied to hidden Markov random field inference. To tackle the obstacle of the intractable normalising constant for the likelihood, we explore alternative estimation approaches for incorporation into the variational Bayes algorithm. We consider a pseudolikelihood approach as well as the more recent reduced dependence approximation of the normalisation constant. To illustrate the effectiveness of these approaches we present empirical results from the analysis of simulated datasets. We also analyse a real dataset and compare results with those of previous analyses as well as those obtained from the recently developed auxiliary variable MCMC method and the recursive MCMC method. Our results show that the variational Bayesian analyses can be carried out much faster than the MCMC analyses and produce good estimates of model parameters. We also found that the reduced dependence approximation of the normalisation constant outperformed the pseudolikelihood approximation in our analysis of real and synthetic datasets.
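The pseudolikelihood approximation mentioned above replaces the intractable likelihood with a product of full conditionals, which need no normalising constant. A minimal sketch on a hypothetical one-dimensional Ising chain (not the hidden Potts model of the paper), with the coupling estimated by maximising the log pseudolikelihood over a grid:

```python
import numpy as np

def neighbor_sums(x):
    """Sum of the left and right neighbours of each site on a 1-D chain (free boundaries)."""
    s = np.zeros(len(x))
    s[1:] += x[:-1]
    s[:-1] += x[1:]
    return s

def log_pseudolikelihood(beta, x):
    """Sum of log full conditionals log p(x_i | rest); no normalising constant required."""
    s = neighbor_sums(x)
    # p(x_i | rest) = exp(beta * x_i * s_i) / (2 * cosh(beta * s_i)) for +/-1 spins
    return np.sum(beta * x * s - np.log(2.0 * np.cosh(beta * s)))

def mple(x, betas=np.linspace(-2.0, 2.0, 401)):
    """Maximum pseudolikelihood estimate of the coupling, by grid search."""
    values = [log_pseudolikelihood(b, x) for b in betas]
    return betas[int(np.argmax(values))]

x = np.array([1, 1, 1, -1, 1, 1, -1, -1, 1, 1])  # hypothetical spin data
beta_hat = mple(x)
```

Each factor conditions a single site on its neighbours, which is why the global normalising constant never appears.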
The Gaussian Process Density Sampler
Abstract
Cited by 15 (4 self)
We present the Gaussian Process Density Sampler (GPDS), an exchangeable generative model for use in nonparametric Bayesian density estimation. Samples drawn from the GPDS are consistent with exact, independent samples from a fixed density function that is a transformation of a function drawn from a Gaussian process prior. Our formulation allows us to infer an unknown density from data using Markov chain Monte Carlo, which gives samples from the posterior distribution over density functions and from the predictive distribution on data space. We can also infer the hyperparameters of the Gaussian process. We compare this density modeling technique to several existing techniques on a toy problem and a skull-reconstruction task.
A Bayesian reassessment of nearestneighbor classification
 Journal of the American Statistical Association
, 2009
Cited by 15 (2 self)
Bayesian random fields: The Bethe-Laplace approximation
 In ICML
, 2006
Abstract
Cited by 13 (5 self)
While learning the maximum likelihood values of the parameters of an undirected graphical model is hard, modelling the posterior distribution over the parameters given data is harder. Yet undirected models are ubiquitous in computer vision and text modelling (e.g. conditional random fields). Whereas Bayesian approaches for directed models have been very successful, a proper Bayesian treatment of undirected models is still in its infancy. We propose a new method for approximating the posterior of the parameters given data, based on the Laplace approximation. This approximation requires the computation of the covariance matrix over features, which we compute using the linear response approximation, based in turn on loopy belief propagation. We develop the theory for conditional and “unconditional” random fields, with or without hidden variables. In the conditional setting we introduce a new variant of bagging suitable for structured domains. Here we run the loopy max-product algorithm on a “supergraph” composed of graphs for individual models sampled from the posterior and connected by constraints. Experiments on real-world data validate the proposed methods.
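The Laplace approximation at the heart of this approach fits a Gaussian at the posterior mode using the local curvature. A minimal one-parameter sketch (a hypothetical Bernoulli-logistic model with a standard normal prior, curvature by finite differences; none of the Bethe or linear-response machinery of the paper):

```python
import numpy as np

def log_post(theta, data):
    """Hypothetical log posterior: Bernoulli likelihood with a logistic link, N(0,1) prior."""
    p = 1.0 / (1.0 + np.exp(-theta))
    return np.sum(data * np.log(p) + (1 - data) * np.log(1 - p)) - 0.5 * theta ** 2

def laplace_approx(data, grid=np.linspace(-5.0, 5.0, 2001)):
    """Gaussian approximation at the mode; curvature by a central finite difference."""
    values = np.array([log_post(t, data) for t in grid])
    i = int(np.argmax(values))
    h = grid[1] - grid[0]
    curvature = (values[i + 1] - 2.0 * values[i] + values[i - 1]) / h ** 2
    return grid[i], np.sqrt(-1.0 / curvature)  # mean and std. dev. of the Gaussian

data = np.array([1, 1, 0, 1, 1, 1, 0, 1])  # hypothetical binary observations
mean, sd = laplace_approx(data)
```

In the multivariate setting of the paper the scalar curvature becomes the Hessian over features, which is where the covariance matrix computed by linear response enters.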
Continuous contour Monte Carlo for marginal density estimation with an application to a spatial statistical model
 Journal of Computational and Graphical Statistics
, 2007
Abstract
Cited by 12 (6 self)
The problem of marginal density estimation for a multivariate density function f(x) can generally be stated as a problem of density function estimation for a random vector λ(x) of dimension lower than that of x. In this article, we propose a technique, the so-called continuous Contour Monte Carlo (CCMC) algorithm, for solving this problem. CCMC can be viewed as a continuous version of the contour Monte Carlo (CMC) algorithm recently proposed in the literature. CCMC abandons the use of sample space partitioning and incorporates the techniques of kernel density estimation into its simulations. CCMC is more general than other marginal density estimation algorithms. First, it works for any density function, even those with a rugged or unbalanced energy landscape. Second, it works for any transformation λ(x), regardless of the availability of an analytical form of the inverse transformation. In this article, CCMC is applied to estimate the unknown normalizing constant function for a spatial autologistic model, and the estimate is then used in a Bayesian analysis of the spatial autologistic model in place of the true normalizing constant function. Numerical results on the U.S. cancer mortality data indicate that the Bayesian method can produce much more accurate estimates of the parameters of the spatial autologistic model than the MPLE and MCMLE methods.
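The kernel density estimation component can be illustrated in isolation (this shows only the kernel-smoothing idea, not the full CCMC algorithm): given draws from f(x), smooth the transformed values λ(x) with a Gaussian kernel to estimate the marginal density, here checked against a case where the marginal is known in closed form.

```python
import numpy as np

rng = np.random.default_rng(1)

def lam(x):
    """Lower-dimensional summary lambda(x): the radius of a 2-D point."""
    return np.hypot(x[:, 0], x[:, 1])

def kde(values, grid, bandwidth=0.2):
    """Gaussian kernel density estimate of the marginal density of lambda(x)."""
    z = (grid[:, None] - values[None, :]) / bandwidth
    return np.exp(-0.5 * z ** 2).mean(axis=1) / (bandwidth * np.sqrt(2.0 * np.pi))

# draws from f(x) = standard bivariate normal; lambda(x) then has a Rayleigh(1) marginal
x = rng.normal(size=(20000, 2))
grid = np.linspace(0.0, 4.0, 81)
estimate = kde(lam(x), grid)
truth = grid * np.exp(-grid ** 2 / 2.0)  # Rayleigh(1) density, for comparison
```

No analytical inverse of λ is used anywhere, which is the property the abstract highlights.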