Bayesian state-space modelling on high-performance hardware using LibBi (2013)
"... LibBi is a software package for state-space modelling and Bayesian inference on modern computer hardware, including multi-core central processing units (CPUs), many-core graphics processing units (GPUs) and distributed-memory clusters of such devices. The software parses a domain-specific language f ..."
Abstract
-
Cited by 5 (1 self)
- Add to MetaCart
(Show Context)
LibBi is a software package for state-space modelling and Bayesian inference on modern computer hardware, including multi-core central processing units (CPUs), many-core graphics processing units (GPUs) and distributed-memory clusters of such devices. The software parses a domain-specific language for model specification, then optimises, generates, compiles and runs code for the given model, inference method and hardware platform. In presenting the software, this work serves as an introduction to state-space models and the specialised methods developed for Bayesian inference with them. The focus is on sequential Monte Carlo (SMC) methods such as the particle filter for state estimation, and the particle Markov chain Monte Carlo (PMCMC) and SMC2 methods for parameter estimation. All are well suited to current computer hardware. Two examples are given and developed throughout: one a linear three-element windkessel model of the human arterial system, the other a nonlinear Lorenz '96 model. These are specified in the prescribed modelling language, and LibBi is demonstrated by performing inference with them. Empirical results are presented, including a performance comparison of the software across different hardware configurations.
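The bootstrap particle filter at the heart of these methods is compact enough to sketch. The following minimal Python illustration shows SMC state estimation yielding an unbiased log-likelihood estimate, the quantity that PMCMC and SMC2 build on; the model callables (init_sample, transition_sample, observation_logpdf) are hypothetical placeholders standing in for what LibBi would generate from a model specification, not part of its actual interface.

    import numpy as np

    def bootstrap_particle_filter(y, n_particles, init_sample,
                                  transition_sample, observation_logpdf,
                                  rng=np.random.default_rng()):
        """Bootstrap particle filter for a generic state-space model.
        Returns an unbiased estimate of the log-likelihood log p(y_{1:T}).

        init_sample(n, rng)           -> (n, d) array of initial particles
        transition_sample(x, t, rng)  -> (n, d) array of propagated particles
        observation_logpdf(y_t, x, t) -> (n,) array of log p(y_t | x_t)
        """
        x = init_sample(n_particles, rng)
        log_lik = 0.0
        for t, y_t in enumerate(y):
            if t > 0:
                x = transition_sample(x, t, rng)   # propagate particles
            logw = observation_logpdf(y_t, x, t)   # reweight by the observation
            m = logw.max()
            w = np.exp(logw - m)
            log_lik += m + np.log(w.mean())        # adds log p(y_t | y_{1:t-1}) estimate
            idx = rng.choice(n_particles, size=n_particles, p=w / w.sum())
            x = x[idx]                             # multinomial resampling
        return log_lik

The propagation and reweighting steps are independent across particles, which is what makes the mapping to GPUs and clusters described in the paper effective.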
Towards Automatic Model Comparison: An Adaptive Sequential Monte Carlo Approach (2013)
"... Model comparison for the purposes of selection, averaging and validation is a problem found throughout statistics and related disciplines. Within the Bayesian paradigm, these problems all require the calculation of the posterior probabilities of models within a particular class. Substantial progress ..."
Abstract
-
Cited by 4 (0 self)
- Add to MetaCart
Model comparison for the purposes of selection, averaging and validation is a problem found throughout statistics and related disciplines. Within the Bayesian paradigm, these problems all require the calculation of the posterior probabilities of the models within a particular class. Substantial progress has been made in recent years, but numerous difficulties remain in the practical implementation of existing schemes. This paper presents adaptive sequential Monte Carlo (SMC) sampling strategies to characterise the posterior distribution of a collection of models, as well as the parameters of those models. Both a simple product estimator and a combination of SMC and a path sampling estimator are considered, and existing theoretical results are extended to cover the path sampling variant. A novel approach to the automatic specification of distributions within SMC algorithms is presented and shown to outperform the state of the art in this area. The performance of the proposed strategies is demonstrated via an extensive simulation study making use of a Gaussian mixture model and two challenging realistic examples. Comparisons with state-of-the-art algorithms show that the proposed algorithms are always competitive, and often substantially superior to alternative techniques, at equal computational cost and with considerably less application-specific implementation effort.
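The "automatic specification of distributions" can be illustrated with the standard adaptive tempering device: choose each successive inverse temperature so that the effective sample size (ESS) of the incremental weights hits a target, accumulating the product estimator of the normalising constant along the way. This is a generic sketch of that idea under the usual tempered-likelihood setup, not the paper's exact scheme; all names are illustrative.

    import numpy as np

    def next_temperature(log_lik, beta, target_ess):
        """Bisect for the next inverse temperature beta' > beta such that the
        ESS of the incremental weights exp((beta' - beta) * log_lik) matches
        target_ess, i.e. an automatic choice of the next distribution.
        log_lik holds the log-likelihood values of the current particles."""
        def ess(b):
            logw = (b - beta) * log_lik
            w = np.exp(logw - logw.max())
            return w.sum() ** 2 / (w ** 2).sum()
        if ess(1.0) >= target_ess:
            return 1.0                      # can jump straight to the posterior
        lo, hi = beta, 1.0
        for _ in range(50):                 # bisection on the ESS curve
            mid = 0.5 * (lo + hi)
            lo, hi = (mid, hi) if ess(mid) >= target_ess else (lo, mid)
        return lo

At each accepted temperature the product estimator accumulates the log-mean-exp of the incremental weights, so an evidence estimate comes out of the same sweep at no extra cost.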
On the particle Gibbs sampler (2013)
"... Abstract. The particle Gibbs sampler is a Markov chain Monte Carlo (MCMC) algorithm which operates on the extended space of the auxiliary variables generated by an interacting particle system. In particular, it samples the discrete variables that determine the particle genealogy. We propose a coupli ..."
Abstract
-
Cited by 3 (0 self)
- Add to MetaCart
(Show Context)
The particle Gibbs sampler is a Markov chain Monte Carlo (MCMC) algorithm which operates on the extended space of the auxiliary variables generated by an interacting particle system. In particular, it samples the discrete variables that determine the particle genealogy. We propose a coupling construction between two particle Gibbs updates from different starting points, such that the coupling probability may be made arbitrarily large by taking the particle system large enough. A direct consequence of this result is the uniform ergodicity of the particle Gibbs Markov kernel. We discuss several algorithmic variations of particle Gibbs, either proposed in the literature or original. For some of these variants we are able to prove that they dominate the original algorithm in asymptotic efficiency, as measured by the variance of the limiting distribution in the central limit theorem. A detailed numerical study is provided to demonstrate the efficacy of particle Gibbs and the proposed variants.
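For concreteness, the particle Gibbs kernel can be sketched as a conditional SMC sweep in which one particle is pinned to the reference trajectory, followed by sampling a new trajectory through the genealogy. This is a minimal illustration of the basic algorithm (without the variants discussed in the paper), with hypothetical model callables as in the particle filter sketch above.

    import numpy as np

    def particle_gibbs_update(y, x_ref, n, init_sample, transition_sample,
                              observation_logpdf, rng=np.random.default_rng()):
        """One conditional SMC sweep: run SMC with particle 0 pinned to the
        reference trajectory x_ref (shape (T, d)), then draw a new trajectory
        by tracing the sampled genealogy backwards."""
        T = len(y)
        xs, A = [], []                      # particle arrays and ancestor indices
        x = init_sample(n, rng)
        x[0] = x_ref[0]                     # pin the reference particle
        xs.append(x)
        logw = observation_logpdf(y[0], x, 0)
        for t in range(1, T):
            w = np.exp(logw - logw.max()); w /= w.sum()
            a = rng.choice(n, size=n, p=w)  # resample ancestor indices
            a[0] = 0                        # the reference keeps its own lineage
            x = transition_sample(xs[t - 1][a], t, rng)
            x[0] = x_ref[t]
            xs.append(x); A.append(a)
            logw = observation_logpdf(y[t], x, t)
        w = np.exp(logw - logw.max()); w /= w.sum()
        k = rng.choice(n, p=w)              # sample a terminal particle ...
        traj = [xs[T - 1][k]]
        for t in range(T - 2, -1, -1):      # ... and trace its ancestry back
            k = A[t][k]
            traj.append(xs[t][k])
        return np.stack(traj[::-1])         # the new reference trajectory

Iterating this kernel, alternated with parameter updates, yields the particle Gibbs sampler whose coupling and ergodicity properties the paper studies.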
Particle Metropolis-Hastings using gradient and Hessian information (2014)
"... Particle Metropolis-Hastings (PMH) allows for Bayesian parameter in-ference in nonlinear state space models by combining Markov chain Monte Carlo (MCMC) and particle filtering. The latter is used to estimate the intractable likelihood. In its original formulation, PMH makes use of a marginal MCMC pr ..."
Abstract
-
Cited by 2 (1 self)
- Add to MetaCart
(Show Context)
Particle Metropolis-Hastings (PMH) allows for Bayesian parameter inference in nonlinear state-space models by combining Markov chain Monte Carlo (MCMC) and particle filtering. The latter is used to estimate the intractable likelihood. In its original formulation, PMH makes use of a marginal MCMC proposal for the parameters, typically a Gaussian random walk. However, this can lead to poor exploration of the parameter space and inefficient use of the generated particles. We propose a number of alternative versions of PMH that incorporate gradient and Hessian information about the posterior into the proposal. This information is largely obtained as a byproduct of the likelihood estimation. Indeed, we show how to estimate the required information using a fixed-lag particle smoother, with a computational cost growing linearly in the number of particles. We conclude that the proposed methods can: (i) decrease the length of the burn-in phase, (ii) improve the mixing of the Markov chain in the stationary phase, and (iii) make the proposal distribution scale invariant, which simplifies tuning.
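As a sketch of the first-order idea, one PMH step with a Langevin-style proposal might look like the following. It assumes a helper estimate_posterior_grad that runs the particle filter and fixed-lag smoother to return a log-likelihood estimate and an estimated gradient of the log-posterior; both names are hypothetical.

    import numpy as np

    def pmh1_step(theta, ll, grad, log_prior, estimate_posterior_grad,
                  eps, rng=np.random.default_rng()):
        """One gradient-informed particle Metropolis-Hastings step: the
        proposal drifts along the estimated gradient of the log-posterior
        instead of taking a blind random-walk step."""
        drift = lambda g: 0.5 * eps ** 2 * g
        theta_p = theta + drift(grad) + eps * rng.standard_normal(len(theta))
        ll_p, grad_p = estimate_posterior_grad(theta_p)
        def log_q(to, frm, g):              # Gaussian proposal log-density (unnormalised)
            diff = to - frm - drift(g)
            return -0.5 * diff @ diff / eps ** 2
        log_alpha = (ll_p + log_prior(theta_p) + log_q(theta, theta_p, grad_p)
                     - ll - log_prior(theta) - log_q(theta_p, theta, grad))
        if np.log(rng.uniform()) < log_alpha:
            return theta_p, ll_p, grad_p, True   # accept
        return theta, ll, grad, False            # reject

The second-order variants replace the isotropic step with one scaled by an estimated negative inverse Hessian, which is what delivers the scale invariance mentioned in point (iii).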
Discussion of “Riemann manifold Langevin and Hamiltonian Monte Carlo methods” by …
"... The authors are to be congratulated for a novel design of an efficient and automatic choice of the preconditioning matrix for MALA or mass matrix for HMC schemes. The clever use of local curvature information results in possible improvements in the relative speed of convergence to the high dimension ..."
Abstract
- Add to MetaCart
The authors are to be congratulated for a novel design of an efficient and automatic choice of the preconditioning matrix for MALA, or mass matrix for HMC schemes. The clever use of local curvature information results in possible improvements in the relative speed of convergence to the high-dimensional target distribution, as demonstrated by the authors using various illustrative examples. The full MMALA and RMHMC schemes described by the authors require (a) evaluation of the partial derivatives of the log-likelihood function up to third order and (b) inversion of the position-specific metric tensor of the Riemann manifold formed by the parameter space. In the general case, absent analytical properties that induce sparsity in the covariance matrix, these two steps are computationally intensive (as pointed out by the authors), and in many of the examples the authors are forced to resort to a simplified version of the MMALA scheme. One interesting application would be to use the authors' approach for an …
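For reference, the simplified MMALA scheme mentioned here drops the terms involving derivatives of the metric tensor G(θ), leaving a position-dependent preconditioned Langevin proposal; as a sketch in standard notation:

    \theta' = \theta + \frac{\varepsilon^2}{2}\, G(\theta)^{-1} \nabla_\theta \log \pi(\theta)
              + \varepsilon\, G(\theta)^{-1/2}\, \xi, \qquad \xi \sim \mathcal{N}(0, I),

which avoids the third-order derivatives (these enter the full scheme only through the derivatives of G) while retaining the curvature-informed preconditioning, at the cost of one factorisation of G(θ) per step.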
SMC2: A sequential Monte Carlo algorithm with particle Markov chain Monte Carlo updates: Supplement
"... The aim of this section is to illustrate the good performance of SMC2 in two additional examples, and compare it to alternative methods. Unless specified otherwise, the ESS criterion is set to 80%. The PMCMC moves are Particle Marginal Metropolis–Hastings moves with a random walk proposal. The rando ..."
Abstract
- Add to MetaCart
The aim of this section is to illustrate the good performance of SMC2 in two additional examples, and to compare it to alternative methods. Unless specified otherwise, the ESS criterion is set to 80%. The PMCMC moves are particle marginal Metropolis–Hastings (PMMH) moves with a random walk proposal. The random walk is Gaussian with variance equal to cΣ, where Σ is the variance of the current particles and c is set to 10%. We use a reflective random walk: if the prior distribution is defined on [a, ∞) for instance, and if the proposed value y is lower than a, we propose y* = a + |y − a| = 2a − y instead of y. This results in a cost-free improvement of the acceptance ratio. Only one PMMH move per θ-particle is performed at each resample-move step, except where specified otherwise in Section 3. These simulation results can be reproduced using the generic software package py-smc2, written in Python and C, available at …
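The reflective random walk described above is simple to state in code; a minimal sketch follows (names are illustrative, not taken from py-smc2).

    import numpy as np

    def reflective_rw_proposal(theta, c, Sigma, lower=None,
                               rng=np.random.default_rng()):
        """Gaussian random-walk proposal with covariance c * Sigma, folded back
        at a lower bound a: a proposed value y < a becomes 2a - y. The
        supplement notes this yields a cost-free improvement of the acceptance
        ratio, since proposals stay inside the support."""
        y = rng.multivariate_normal(theta, c * Sigma)
        if lower is not None:
            y = np.where(y < lower, 2.0 * lower - y, y)   # reflect at the bound
        return y

Here Σ would be the empirical covariance of the current θ-particles and c = 0.1, matching the settings quoted above.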