Ecological non-linear state space model selection via adaptive particle Markov chain Monte Carlo

by G W Peters, G R Hosack, K R Hayes
Results 1 - 10 of 10

Bayesian state-space modelling on high-performance hardware using LibBi

by Lawrence M. Murray, 2013
"... LibBi is a software package for state-space modelling and Bayesian inference on modern computer hardware, including multi-core central processing units (CPUs), many-core graphics processing units (GPUs) and distributed-memory clusters of such devices. The software parses a domain-specific language f ..."
Abstract - Cited by 5 (1 self) - Add to MetaCart
LibBi is a software package for state-space modelling and Bayesian inference on modern computer hardware, including multi-core central processing units (CPUs), many-core graphics processing units (GPUs) and distributed-memory clusters of such devices. The software parses a domain-specific language for model specification, then optimises, generates, compiles and runs code for the given model, inference method and hardware platform. In presenting the software, this work serves as an introduction to state-space models and the specialised methods developed for Bayesian inference with them. The focus is on sequential Monte Carlo (SMC) methods such as the particle filter for state estimation, and the particle Markov chain Monte Carlo (PMCMC) and SMC2 methods for parameter estimation. All are well-suited to current computer hardware. Two examples are given and developed throughout, one a linear three-element windkessel model of the human arterial system, the other a nonlinear Lorenz ’96 model. These are specified in the prescribed modelling language, and LibBi demonstrated by performing inference with them. Empirical results are presented, including a performance comparison of the software with different hardware configurations.
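For readers unfamiliar with the particle filter that underlies these methods, the following is a minimal NumPy sketch of a bootstrap particle filter returning a log-likelihood estimate. It is a generic illustration, not LibBi's modelling language or implementation; the toy random-walk model, function names and parameters are assumptions made for the example.

import numpy as np

def bootstrap_particle_filter(y, n_particles, init_fn, transition_fn, obs_logpdf, rng=None):
    """Bootstrap particle filter returning an estimate of log p(y_{1:T}).

    init_fn(rng, n)        -> initial particles (one entry per particle)
    transition_fn(rng, x)  -> particles sampled from the transition density
    obs_logpdf(y_t, x)     -> log observation density, evaluated per particle
    """
    rng = np.random.default_rng() if rng is None else rng
    particles = init_fn(rng, n_particles)
    log_lik = 0.0
    for y_t in y:
        particles = transition_fn(rng, particles)        # propagate
        log_w = obs_logpdf(y_t, particles)               # weight by observation density
        m = log_w.max()
        w = np.exp(log_w - m)
        log_lik += m + np.log(w.mean())                  # incremental likelihood estimate
        idx = rng.choice(n_particles, size=n_particles, p=w / w.sum())
        particles = particles[idx]                       # multinomial resampling
    return log_lik

# Toy usage: a Gaussian random-walk state observed with unit noise (illustrative only).
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    T, sigma_x = 100, 0.5
    x = np.cumsum(rng.normal(0, sigma_x, T))
    y = x + rng.normal(0, 1.0, T)
    ll = bootstrap_particle_filter(
        y, 1000,
        init_fn=lambda r, n: r.normal(0, 1, n),
        transition_fn=lambda r, p: p + r.normal(0, sigma_x, p.shape),
        obs_logpdf=lambda yt, p: -0.5 * (yt - p) ** 2 - 0.5 * np.log(2 * np.pi),
    )
    print("estimated log-likelihood:", ll)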

Citation Context

...(SSMs) have important applications in the study of physical, chemical and biological processes. Examples are numerous, but include marine biogeochemistry [16, 37, 17, 59], ecological population dynamics [73, 57, 60, 35], Functional Magnetic Resonance Imaging [64, 52, 53], biochemistry [29, 30] and object tracking [71]. They are particularly useful for modelling uncertainties in the parameters, states and observation...

Towards Automatic Model Comparison: An Adaptive Sequential Monte Carlo Approach

by Yan Zhou, Adam M. Johansen, John A. D. Aston, 2013
"... Model comparison for the purposes of selection, averaging and validation is a problem found throughout statistics and related disciplines. Within the Bayesian paradigm, these problems all require the calculation of the posterior probabilities of models within a particular class. Substantial progress ..."
Abstract - Cited by 4 (0 self) - Add to MetaCart
Model comparison for the purposes of selection, averaging and validation is a problem found throughout statistics and related disciplines. Within the Bayesian paradigm, these problems all require the calculation of the posterior probabilities of models within a particular class. Substantial progress has been made in recent years, but there are numerous difficulties in the practical implementation of existing schemes. This paper presents adaptive sequential Monte Carlo (SMC) sampling strategies to characterise the posterior distribution of a collection of models, as well as the parameters of those models. Both a simple product estimator and a combination of SMC and a path sampling estimator are considered and existing theoretical results are extended to include the path sampling variant. A novel approach to the automatic specification of distributions within SMC algorithms is presented and shown to outperform the state of the art in this area. The performance of the proposed strategies is demonstrated via an extensive simulation study making use of the Gaussian mixture model and two challenging realistic examples. Comparisons with state of the art algorithms show that the proposed algorithms are always competitive, and often substantially superior to alternative techniques, at equal computational cost and considerably less application-specific implementation effort.
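As a rough illustration of how SMC can produce the normalizing constants needed for Bayesian model comparison, the sketch below estimates a single model's log-evidence with a tempered SMC sampler and the product of incremental-weight averages. It is a generic sketch under a fixed temperature ladder, not the authors' adaptive scheme or their path sampling estimator; the toy Gaussian target and all parameter values are assumptions.

import numpy as np

def tempered_smc_evidence(log_prior, log_lik, prior_sample, n_particles=2000,
                          temperatures=np.linspace(0.0, 1.0, 21), mh_steps=5,
                          rw_scale=0.5, rng=None):
    """Estimate log Z = log \\int prior(t) lik(t) dt with a tempered SMC sampler.

    The product of the incremental-weight averages over the temperature ladder
    is a standard SMC estimate of the normalizing constant (returned on the log scale).
    """
    rng = np.random.default_rng() if rng is None else rng
    theta = prior_sample(rng, n_particles)
    log_z = 0.0
    for gamma_prev, gamma in zip(temperatures[:-1], temperatures[1:]):
        # Incremental weights: likelihood raised to the temperature increment.
        log_w = (gamma - gamma_prev) * log_lik(theta)
        m = log_w.max()
        w = np.exp(log_w - m)
        log_z += m + np.log(w.mean())
        # Multinomial resampling.
        idx = rng.choice(n_particles, size=n_particles, p=w / w.sum())
        theta = theta[idx]
        # A few random-walk Metropolis moves targeting the current tempered posterior.
        for _ in range(mh_steps):
            prop = theta + rng.normal(0, rw_scale, theta.shape)
            log_acc = (log_prior(prop) + gamma * log_lik(prop)
                       - log_prior(theta) - gamma * log_lik(theta))
            accept = np.log(rng.uniform(size=theta.shape[0])) < log_acc
            theta = np.where(accept, prop, theta)
    return log_z

# Toy check: theta ~ N(0,1) prior, y | theta ~ N(theta,1) with one observation y = 1,
# so the exact evidence is N(y; 0, 2).
if __name__ == "__main__":
    y_obs = 1.0
    lp = lambda t: -0.5 * t ** 2 - 0.5 * np.log(2 * np.pi)
    ll = lambda t: -0.5 * (y_obs - t) ** 2 - 0.5 * np.log(2 * np.pi)
    est = tempered_smc_evidence(lp, ll, lambda r, n: r.normal(0, 1, n))
    exact = -0.5 * y_obs ** 2 / 2 - 0.5 * np.log(2 * np.pi * 2)
    print("SMC estimate:", est, "exact:", exact)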

On the particle Gibbs sampler

by N. Chopin, S. S. Singh, 2013
"... Abstract. The particle Gibbs sampler is a Markov chain Monte Carlo (MCMC) algorithm which operates on the extended space of the auxiliary variables generated by an interacting particle system. In particular, it samples the discrete variables that determine the particle genealogy. We propose a coupli ..."
Abstract - Cited by 3 (0 self) - Add to MetaCart
The particle Gibbs sampler is a Markov chain Monte Carlo (MCMC) algorithm which operates on the extended space of the auxiliary variables generated by an interacting particle system. In particular, it samples the discrete variables that determine the particle genealogy. We propose a coupling construction between two particle Gibbs updates from different starting points, which is such that the coupling probability may be made arbitrarily large by taking the particle system large enough. A direct consequence of this result is the uniform ergodicity of the Particle Gibbs Markov kernel. We discuss several algorithmic variations of Particle Gibbs, either proposed in the literature or original. For some of these variants we are able to prove that they dominate the original algorithm in asymptotic efficiency as measured by the variance of the central limit theorem’s limiting distribution. A detailed numerical study is provided to demonstrate the efficacy of Particle Gibbs and the proposed variants.
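The conditional SMC update at the core of particle Gibbs can be sketched as follows. This is a minimal illustration (scalar state, multinomial resampling, no ancestor or backward sampling), not the couplings or variants analysed in the paper; the model and helper functions in the usage example are assumptions.

import numpy as np

def conditional_particle_filter(y, x_ref, n_particles, transition_sample,
                                obs_logpdf, init_sample, rng):
    """One conditional SMC (particle Gibbs) update for a scalar state-space model.

    The reference trajectory x_ref (length T) is retained as particle index 0 at
    every time step; the remaining particles are propagated and resampled as in a
    bootstrap filter. A new trajectory is drawn by sampling a final particle and
    tracing its ancestry back to time 0.
    """
    T = len(y)
    X = np.empty((T, n_particles))
    A = np.empty((T, n_particles), dtype=int)    # ancestor indices
    X[0] = init_sample(rng, n_particles)
    X[0, 0] = x_ref[0]                           # condition on the reference path
    log_w = obs_logpdf(y[0], X[0])
    for t in range(1, T):
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        A[t] = rng.choice(n_particles, size=n_particles, p=w)
        A[t, 0] = 0                              # reference particle keeps its ancestor
        X[t] = transition_sample(rng, X[t - 1, A[t]])
        X[t, 0] = x_ref[t]
        log_w = obs_logpdf(y[t], X[t])
    # Sample a trajectory according to the final weights and trace it back.
    w = np.exp(log_w - log_w.max()); w /= w.sum()
    k = rng.choice(n_particles, p=w)
    traj = np.empty(T)
    for t in range(T - 1, -1, -1):
        traj[t] = X[t, k]
        k = A[t, k] if t > 0 else k
    return traj

# Usage sketch: iterating the kernel gives a particle Gibbs chain over the state path
# (parameters held fixed here for simplicity).
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    T = 50
    x_true = np.cumsum(rng.normal(0, 0.5, T))
    y = x_true + rng.normal(0, 1.0, T)
    traj = x_true.copy()                          # any valid trajectory initialises the chain
    for _ in range(100):
        traj = conditional_particle_filter(
            y, traj, n_particles=100,
            transition_sample=lambda r, xp: xp + r.normal(0, 0.5, xp.shape),
            obs_logpdf=lambda yt, x: -0.5 * (yt - x) ** 2,
            init_sample=lambda r, n: r.normal(0, 1, n), rng=rng)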

Citation Context

...diverse research contributions, whether methodological (Silva et al., 2009; Whiteley et al., 2010; Chopin et al., 2012; Lindsten et al., 2012) or applied, the latter in domains as diverse as Ecology (Peters et al., 2010), Electricity Forecasting (Launay et al., 2012), Finance (Pitt et al., 2012), systems Biology (Golightly and Wilkinson, 2011), study of social networks (Everitt, 2012), Hydrology (Vrugt et al., 2012)...

Particle Metropolis-Hastings using gradient and Hessian information

by Johan Dahlin, Fredrik Lindsten, Thomas B. Schön, 2014
"... Particle Metropolis-Hastings (PMH) allows for Bayesian parameter in-ference in nonlinear state space models by combining Markov chain Monte Carlo (MCMC) and particle filtering. The latter is used to estimate the intractable likelihood. In its original formulation, PMH makes use of a marginal MCMC pr ..."
Abstract - Cited by 2 (1 self) - Add to MetaCart
Particle Metropolis-Hastings (PMH) allows for Bayesian parameter inference in nonlinear state space models by combining Markov chain Monte Carlo (MCMC) and particle filtering. The latter is used to estimate the intractable likelihood. In its original formulation, PMH makes use of a marginal MCMC proposal for the parameters, typically a Gaussian random walk. However, this can lead to a poor exploration of the parameter space and an inefficient use of the generated particles. We propose a number of alternative versions of PMH that incorporate gradient and Hessian information about the posterior into the proposal. This information is more or less obtained as a byproduct of the likelihood estimation. Indeed, we show how to estimate the required information using a fixed-lag particle smoother, with a computational cost growing linearly in the number of particles. We conclude that the proposed methods can: (i) decrease the length of the burn-in phase, (ii) increase the mixing of the Markov chain at the stationary phase, and (iii) make the proposal distribution scale invariant which simplifies tuning.
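For context, the sketch below implements PMH in its original marginal form, with a Gaussian random-walk proposal and a bootstrap particle-filter likelihood estimate; the gradient- and Hessian-based proposals contributed by this paper are not reproduced. The toy autoregressive model, priors and tuning constants are assumptions for the example.

import numpy as np

def pf_loglik(theta, y, n_particles, rng):
    """Bootstrap particle filter estimate of log p(y | theta) for the toy model
    x_t = phi * x_{t-1} + N(0, sigma^2),  y_t = x_t + N(0, 1)."""
    phi, sigma = theta
    x = rng.normal(0, 1, n_particles)
    ll = 0.0
    for y_t in y:
        x = phi * x + rng.normal(0, sigma, n_particles)
        log_w = -0.5 * (y_t - x) ** 2
        m = log_w.max()
        w = np.exp(log_w - m)
        ll += m + np.log(w.mean()) - 0.5 * np.log(2 * np.pi)
        x = x[rng.choice(n_particles, size=n_particles, p=w / w.sum())]
    return ll

def particle_metropolis_hastings(y, theta0, log_prior, n_iters=2000,
                                 n_particles=200, rw_cov=None, seed=0):
    """PMH with a Gaussian random-walk proposal; the likelihood is replaced by an
    unbiased particle-filter estimate, which leaves the posterior invariant."""
    rng = np.random.default_rng(seed)
    d = len(theta0)
    rw_cov = 0.1 * np.eye(d) if rw_cov is None else rw_cov
    theta = np.asarray(theta0, dtype=float)
    ll = pf_loglik(theta, y, n_particles, rng)
    chain = np.empty((n_iters, d))
    for i in range(n_iters):
        prop = rng.multivariate_normal(theta, rw_cov)
        lp_prop = log_prior(prop)
        if np.isfinite(lp_prop):
            ll_prop = pf_loglik(prop, y, n_particles, rng)
            if np.log(rng.uniform()) < ll_prop + lp_prop - ll - log_prior(theta):
                theta, ll = prop, ll_prop
        chain[i] = theta
    return chain

# Illustrative run on simulated data with flat priors on (phi, sigma).
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    T, phi, sigma = 200, 0.9, 0.5
    x = np.zeros(T)
    for t in range(1, T):
        x[t] = phi * x[t - 1] + rng.normal(0, sigma)
    y = x + rng.normal(0, 1, T)
    log_prior = lambda th: 0.0 if (-1 < th[0] < 1 and 0 < th[1] < 2) else -np.inf
    chain = particle_metropolis_hastings(y, [0.5, 1.0], log_prior)
    print("posterior means:", chain[500:].mean(axis=0))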

Citation Context

... The reverse of these large steps can have very low probability which prevents them from being accepted. One interesting direction for future work is therefore to pursue adaptive algorithms (see e.g. [3, 33, 35]), to automatically tune the step lengths during the different phases of the algorithms. It can also be useful to (either adaptively or non-adaptively) use different step lengths for the drift term an...

Discussion of “Riemann manifold Langevin and Hamiltonian Monte Carlo methods” by M. Girolami and B. Calderhead

by Anindya Bhadra
"... The authors are to be congratulated for a novel design of an efficient and automatic choice of the preconditioning matrix for MALA or mass matrix for HMC schemes. The clever use of local curvature information results in possible improvements in the relative speed of convergence to the high dimension ..."
Abstract - Add to MetaCart
The authors are to be congratulated for a novel design of an efficient and automatic choice of the preconditioning matrix for MALA or mass matrix for HMC schemes. The clever use of local curvature information results in possible improvements in the relative speed of convergence to the high dimensional target distribution, as demonstrated by the authors using various illustrative examples. The full MMALA and RMHMC schemes described by the authors require (a) evaluations of the partial derivatives up to the third order for the log-likelihood function and (b) inversion of the position specific metric tensor of the Riemann manifold formed by the parameter space. In the general case, considering the absence of nice analytical properties inducing sparsity in the covariance matrix etc., these two steps are computationally intensive (as pointed out by the authors) and in many of the examples the authors are forced to resort to a simplified version of the MMALA scheme. One interesting application would be to use the authors’ approach for an ...
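A fixed-metric (position-independent) analogue of the preconditioned Langevin proposal discussed here can be sketched as follows. This is not the full MMALA or RMHMC scheme with position-dependent metric; the choice of metric, step size and the toy Gaussian target are illustrative assumptions.

import numpy as np

def preconditioned_mala(log_target, grad_log_target, theta0, metric,
                        step=0.5, n_iters=5000, seed=0):
    """MALA with a fixed preconditioning (mass) matrix M.

    Proposal mean: theta + (step^2 / 2) * M^{-1} grad log pi(theta);
    proposal covariance: step^2 * M^{-1}; accepted with the usual MH ratio
    for this asymmetric proposal.
    """
    rng = np.random.default_rng(seed)
    M_inv = np.linalg.inv(metric)
    L = np.linalg.cholesky(M_inv)               # L @ z has covariance M^{-1}
    theta = np.asarray(theta0, dtype=float)
    d = theta.size

    def log_q(to, frm):
        # Log proposal density up to a constant (covariance step^2 * M^{-1}).
        mean = frm + 0.5 * step ** 2 * M_inv @ grad_log_target(frm)
        diff = to - mean
        return -0.5 / step ** 2 * diff @ metric @ diff

    chain = np.empty((n_iters, d))
    for i in range(n_iters):
        mean = theta + 0.5 * step ** 2 * M_inv @ grad_log_target(theta)
        prop = mean + step * L @ rng.standard_normal(d)
        log_acc = (log_target(prop) + log_q(theta, prop)
                   - log_target(theta) - log_q(prop, theta))
        if np.log(rng.uniform()) < log_acc:
            theta = prop
        chain[i] = theta
    return chain

# Toy target: a correlated bivariate Gaussian; using its precision matrix as the
# metric makes the proposal roughly isotropic in the transformed space.
if __name__ == "__main__":
    cov = np.array([[1.0, 0.9], [0.9, 1.0]])
    prec = np.linalg.inv(cov)
    lt = lambda th: -0.5 * th @ prec @ th
    glt = lambda th: -prec @ th
    chain = preconditioned_mala(lt, glt, np.zeros(2), metric=prec, step=1.0)
    print("sample covariance:\n", np.cov(chain[1000:].T))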

(Untitled)

by Salima El Kolei
Abstract not found

Citation Context

...result include epidemiology, meteorology, neuroscience, ecology (see [IBAK11]) and finance (see [JPS09]). For example, our result can be applied to the five ecological state space models described in [PHH10]. Although the scope of our method is general, we have chosen to focus on the so-called autoregressive process AR(1) with measurement noise which has been widely studied and on which our main result c...

(Untitled)

by N. Chopin (CREST-ENSAE), P. E. Jacob (Université Paris Dauphine), O. Papaspiliopoulos
Abstract not found

Citation Context

...lity, at time t = 500, which is about 100 time steps after the large jump in volatility at time t = 407. The results for both algorithms are compared to those from a long PMCMC run (implemented as in Peters et al., 2010, and detailed in the Supplement) with Nx = 500 and 10^5 iterations. Figure 2(c) reports on the estimation of the log evidence log p(y1:t)...

On Feynman-Kac and particle Markov chain Monte Carlo

by unknown authors, 2014
"... iv ..."
Abstract - Add to MetaCart
Abstract not found

Citation Context

...of application domains, including in statistical machine learning [5, 33, 38, 48], finance and econometrics [13, 25, 30, 40, 43], biology [31, 36, 44], computer sciences [32], environmental statistics [26, 27, 42], social networks analysis [29], signal processing [39, 41], forecasting and data assimilation [37, 35, 47], among other fields. The convergence analysis of the PMCMC models has also been started in a...

Stochastic Volatility Model

by Salima El Kolei
"... ar ..."
Abstract - Add to MetaCart
Abstract not found

Citation Context

...particle filtering methods and MCMC methods to estimate the vector of parameters θ0. From a computational point of view, this approach is expensive and we refer the reader to [ADH10] for more details. In [PHH10], they propose an adaptive PMCMC method to estimate ecological hidden stochastic models. We propose here an approach based on M-estimation: It consists in the optimisation of a well-chosen contrast fu...

SMC2: A sequential Monte Carlo algorithm with particle Markov chain Monte Carlo updates: Supplement

by unknown authors
"... The aim of this section is to illustrate the good performance of SMC2 in two additional examples, and compare it to alternative methods. Unless specified otherwise, the ESS criterion is set to 80%. The PMCMC moves are Particle Marginal Metropolis–Hastings moves with a random walk proposal. The rando ..."
Abstract - Add to MetaCart
The aim of this section is to illustrate the good performance of SMC2 in two additional examples, and compare it to alternative methods. Unless specified otherwise, the ESS criterion is set to 80%. The PMCMC moves are Particle Marginal Metropolis–Hastings moves with a random walk proposal. The random walk is Gaussian with variance equal to cΣ, where Σ is the variance of the current particles and c is set to 10%. We use a reflective random walk: if the prior distribution is defined on [a, ∞[ for instance, and if the proposed value y is lower than a, we propose y* = a + |y − a| = 2a − y instead of y. This results in a cost-free improvement of the acceptance ratio. Only one PMMH move for each θ-particle is performed at each resample-move step, except when specified in section 3. These simulation results can be reproduced using the generic software package py-smc2, written in Python and C, available at
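The reflective random-walk proposal described above is straightforward to implement. The sketch below is an illustrative NumPy version, not code taken from py-smc2; the bounds, covariance and variable names are assumptions for the example.

import numpy as np

def reflective_rw_proposal(theta, scale_tril, lower=None, upper=None, rng=None):
    """Gaussian random-walk proposal with reflection at the boundaries of the
    prior support: a value proposed below a lower bound a is mapped to 2a - y
    (and symmetrically at an upper bound). For moderate step sizes (at most one
    reflection) the proposal stays symmetric, so the MH ratio is unchanged.
    """
    rng = np.random.default_rng() if rng is None else rng
    prop = theta + scale_tril @ rng.standard_normal(theta.shape)
    if lower is not None:
        below = prop < lower
        prop = np.where(below, 2 * lower - prop, prop)
    if upper is not None:
        above = prop > upper
        prop = np.where(above, 2 * upper - prop, prop)
    return prop

# Example: first parameter constrained to [0, inf), random-walk covariance c * Sigma.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    Sigma = np.array([[0.04, 0.01], [0.01, 0.09]])   # stand-in for the particle covariance
    c = 0.1
    scale_tril = np.linalg.cholesky(c * Sigma)
    theta = np.array([0.05, 1.0])
    print(reflective_rw_proposal(theta, scale_tril,
                                 lower=np.array([0.0, -np.inf]), rng=rng))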