Results 1–10 of 14,561
On Adaptive Metropolis-Hastings Methods
"... This paper presents a method for adaptation in Metropolis-Hastings algorithms. A product of a proposal density and K copies of the target density is used to define a joint density which is sampled by a Gibbs sampler including a Metropolis step. This provides a framework for adaptation since the curr ..."
Cited by 2 (0 self)
A METROPOLIS-HASTINGS METHOD FOR LINEAR INVERSE PROBLEMS WITH POISSON LIKELIHOOD AND GAUSSIAN PRIOR
"... Poisson noise models arise in a wide range of linear inverse problems in imaging. In the Bayesian setting, the Poisson likelihood function together with a Gaussian prior yields a posterior density function that is not of a well known form and is thus difficult to sample from, especially for large-scale problems. In this work, we present a method for computing samples from posterior density functions with Poisson likelihood and Gaussian prior, using a Gaussian approximation of the posterior as an independence proposal within a Metropolis-Hastings method. To define our priors, we use Gaussian ..."
Data Assimilation in Fluid Mechanics
"... x(t), y(t), z(t): ẋ = σ(y − x), ẏ = rx − y − xz, ż = xy − bz. Find u = v(0) given noisy observations y_j = v(t_j) + η_j, j = 1, …, J. ..."
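The snippet above describes a data assimilation setup on the Lorenz system: recover the initial condition u = v(0) from noisy observations of the trajectory. A minimal sketch of the forward model, assuming the classic parameter values σ = 10, r = 28, b = 8/3 and an illustrative observation interval and noise level (none of which are stated in the snippet):

```python
import numpy as np

def lorenz_rhs(v, sigma=10.0, r=28.0, b=8.0 / 3.0):
    # Right-hand side of the Lorenz system quoted in the abstract:
    # x' = sigma*(y - x), y' = r*x - y - x*z, z' = x*y - b*z
    x, y, z = v
    return np.array([sigma * (y - x), r * x - y - x * z, x * y - b * z])

def integrate(v0, dt, n_steps):
    # Fourth-order Runge-Kutta integration of the trajectory v(t).
    v = np.array(v0, dtype=float)
    traj = [v.copy()]
    for _ in range(n_steps):
        k1 = lorenz_rhs(v)
        k2 = lorenz_rhs(v + 0.5 * dt * k1)
        k3 = lorenz_rhs(v + 0.5 * dt * k2)
        k4 = lorenz_rhs(v + dt * k3)
        v = v + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
        traj.append(v.copy())
    return np.array(traj)

rng = np.random.default_rng(0)
traj = integrate([1.0, 1.0, 1.0], dt=0.01, n_steps=500)
# Noisy observations y_j = v(t_j) + eta_j, taken at every 50th step.
obs = traj[::50] + rng.normal(scale=0.5, size=traj[::50].shape)
```

The inverse problem would then infer v(0) from `obs`, e.g. by the variational or Metropolis-Hastings methods the talk's section headings list.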
A METROPOLIS-HASTINGS METHOD FOR LINEAR INVERSE PROBLEMS WITH POISSON LIKELIHOOD AND GAUSSIAN PRIOR
International Journal for Uncertainty Quantification, 1(1):xxx–xxx, 2015
"... Poisson noise models arise in a wide range of linear inverse problems in imaging. In the Bayesian setting, the Poisson likelihood function together with a Gaussian prior yields a posterior density function that is not of a well known form and is thus difficult to sample from, especially for large-scale problems. In this work, we present a method for computing samples from posterior density functions with Poisson likelihood and Gaussian prior, using a Gaussian approximation of the posterior as an independence proposal within a Metropolis-Hastings framework. To define our priors, we use Gaussian ..."
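The independence-proposal construction described in the two listings above can be sketched generically. The target below is an illustrative one-dimensional density, not the paper's Poisson–Gaussian posterior; the fixed Gaussian proposal stands in for the paper's Gaussian approximation of the posterior:

```python
import numpy as np

def independence_mh(log_target, prop_mean, prop_std, n_samples, rng):
    # Independence Metropolis-Hastings: proposals are drawn from a fixed
    # Gaussian q, ignoring the current state, and accepted with probability
    # min(1, [pi(x') q(x)] / [pi(x) q(x')]).
    def log_q(x):
        return -0.5 * ((x - prop_mean) / prop_std) ** 2

    x = prop_mean
    samples = []
    for _ in range(n_samples):
        x_new = rng.normal(prop_mean, prop_std)
        log_alpha = (log_target(x_new) + log_q(x)) - (log_target(x) + log_q(x_new))
        if np.log(rng.uniform()) < log_alpha:
            x = x_new
        samples.append(x)
    return np.array(samples)

# Illustrative target: a standard normal, with a deliberately wider proposal
# so the proposal's tails dominate the target's.
rng = np.random.default_rng(1)
samples = independence_mh(lambda x: -0.5 * x**2, 0.0, 2.0, 20000, rng)
```

The sampler works well exactly when the proposal is a good global approximation of the target, which is the rationale for the Gaussian posterior approximation in the paper.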
Marginal Likelihood From the Metropolis-Hastings Output
Journal of the American Statistical Association, 2001
"... This article provides a framework for estimating the marginal likelihood for the purpose of Bayesian model comparisons. The approach extends and completes the method presented in Chib (1995) by overcoming the problems associated with the presence of intractable full conditional densities. The proposed method is developed in the context of MCMC chains produced by the Metropolis–Hastings algorithm, whose building blocks are used both for sampling and marginal likelihood estimation, thus economizing on prerun tuning effort and programming. Experiments involving the logit model for binary data ..."
Cited by 209 (16 self)
Kernel Adaptive Metropolis-Hastings
"... A Kernel Adaptive Metropolis-Hastings algorithm is introduced, for the purpose of sampling from a target distribution with strongly nonlinear support. The algorithm embeds the trajectory of the Markov chain into a reproducing kernel Hilbert space (RKHS), such that the feature space covariance o ..."
Cited by 2 (2 self)
Metropolis-Hastings sampling of paths
, 2011
"... We consider the previously unsolved problem of sampling cycle-free paths according to a given distribution from a general network. The problem is difficult because of the combinatorial number of alternatives, which prohibits a complete enumeration of all paths and hence also makes it impossible to compute the normalizing constant of the sampling distribution. The problem is important because the ability to sample from a known distribution introduces mathematical rigor into many applications, ranging from route guidance to the estimation of choice models with sampling of alternatives. ..."
Cited by 4 (2 self)
Optimal Scaling for Various Metropolis-Hastings Algorithms
, 2001
"... We review and extend results related to optimal scaling of Metropolis-Hastings algorithms. We present various theoretical results for the high-dimensional limit. We also present simulation studies which confirm the theoretical results in finite-dimensional contexts. ..."
Cited by 174 (28 self)
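A well-known result in this line of work is that, for random-walk Metropolis on suitable high-dimensional targets, the optimal proposal scale shrinks like 2.38/√d and the corresponding acceptance rate tends to roughly 0.234. A sketch that measures the empirical acceptance rate at that scaling, on an illustrative standard-normal target (not code from the paper):

```python
import numpy as np

def rw_metropolis_accept_rate(d, n_steps, rng):
    # Random-walk Metropolis on a d-dimensional standard normal target,
    # using the asymptotically optimal proposal scale 2.38 / sqrt(d).
    scale = 2.38 / np.sqrt(d)
    log_pi = lambda v: -0.5 * v @ v
    x = np.zeros(d)
    accepts = 0
    for _ in range(n_steps):
        x_new = x + scale * rng.normal(size=d)
        if np.log(rng.uniform()) < log_pi(x_new) - log_pi(x):
            x = x_new
            accepts += 1
    return accepts / n_steps

rng = np.random.default_rng(2)
rate = rw_metropolis_accept_rate(d=50, n_steps=20000, rng=rng)
# For moderately large d the empirical rate sits near the 0.234 limit.
```

In practice this gives a simple tuning heuristic: adjust the proposal scale until the observed acceptance rate is in the vicinity of a quarter.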
Improving on the independent Metropolis-Hastings algorithm
, 2005
"... This paper proposes methods to improve Monte Carlo estimates when the Independent Metropolis-Hastings Algorithm (IMHA) is used. Our first approach uses a control variate based on the sample generated by the proposal distribution. We derive the variance of our estimator for a fixed sample size n an ..."
Cited by 12 (0 self)
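The control-variate idea behind the first approach can be illustrated generically: given samples from a distribution and a second function whose expectation under that distribution is known exactly, subtracting the scaled deviation of that function reduces the estimator's variance. This is a plain Monte Carlo sketch of the generic construction, not the paper's IMHA-specific estimator:

```python
import numpy as np

def control_variate_estimate(h, g, g_mean, xs):
    # Control-variate estimator of E[h(X)] from samples xs:
    # average h(X) - c * (g(X) - E[g(X)]), with the coefficient c
    # chosen to (approximately) minimize the variance.
    hx, gx = h(xs), g(xs)
    c = np.cov(hx, gx)[0, 1] / np.var(gx)
    return float(np.mean(hx - c * (gx - g_mean)))

rng = np.random.default_rng(3)
xs = rng.normal(size=5000)
# Estimate E[X^2] = 1 for a standard normal, using g(X) = X^4 as the
# control variate (its exact mean is E[X^4] = 3).
est = control_variate_estimate(lambda x: x**2, lambda x: x**4, 3.0, xs)
```

Because X^2 and X^4 are strongly correlated, the corrected estimate has noticeably lower variance than the plain sample mean of X^2.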