Results 1–10 of 10
Particle Gibbs with Ancestor Sampling
Abstract

Cited by 8 (6 self)
Particle Markov chain Monte Carlo (PMCMC) is a systematic way of combining the two main tools used for Monte Carlo statistical inference: sequential Monte Carlo (SMC) and Markov chain Monte Carlo (MCMC). We present a new PMCMC algorithm that we refer to as particle Gibbs with ancestor sampling (PGAS). PGAS provides the data analyst with an off-the-shelf class of Markov kernels that can be used to simulate, for instance, the typically high-dimensional and highly autocorrelated state trajectory in a state-space model. The ancestor sampling procedure enables fast mixing of the PGAS kernel even when using seemingly few particles in the underlying SMC sampler. This is important as it can significantly reduce the computational burden that is typically associated with using SMC. PGAS is conceptually similar to the existing PG with backward simulation (PGBS) procedure. Instead of using separate forward and backward sweeps as in PGBS, however, we achieve the same effect in a single forward sweep. This makes PGAS well suited for addressing inference problems not only in state-space models, but also in models with more complex dependencies, such as non-Markovian, Bayesian nonparametric, and general probabilistic graphical models.
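As a rough illustration of the single forward sweep with ancestor sampling, the sketch below implements a conditional particle filter with ancestor sampling for a scalar linear-Gaussian model. The model, the parameter names (`phi`, `q`, `r`), and the plain multinomial resampling are assumptions of this sketch, not details taken from the paper.

```python
import numpy as np

def pgas_kernel(y, x_ref, n_particles, phi, q, r, rng):
    """One sweep of the conditional particle filter with ancestor
    sampling for the scalar model
        x_t = phi * x_{t-1} + v_t,  v_t ~ N(0, q)
        y_t = x_t + e_t,            e_t ~ N(0, r).
    Returns a newly sampled state trajectory."""
    T, N = len(y), n_particles
    x = np.empty((T, N))
    a = np.zeros((T, N), dtype=int)              # ancestor indices
    x[0] = rng.normal(0.0, np.sqrt(q), N)
    x[0, N - 1] = x_ref[0]                       # condition on the reference path
    logw = -0.5 * (y[0] - x[0]) ** 2 / r
    for t in range(1, T):
        w = np.exp(logw - logw.max()); w /= w.sum()
        a[t, :N - 1] = rng.choice(N, size=N - 1, p=w)
        # Ancestor sampling: draw a new history for the reference particle,
        # weighting each particle by how well it connects to x_ref[t].
        logw_as = np.log(w) - 0.5 * (x_ref[t] - phi * x[t - 1]) ** 2 / q
        w_as = np.exp(logw_as - logw_as.max()); w_as /= w_as.sum()
        a[t, N - 1] = rng.choice(N, p=w_as)
        x[t] = phi * x[t - 1, a[t]] + rng.normal(0.0, np.sqrt(q), N)
        x[t, N - 1] = x_ref[t]
        logw = -0.5 * (y[t] - x[t]) ** 2 / r
    # Sample one trajectory by tracing the ancestor lineage backwards.
    w = np.exp(logw - logw.max()); w /= w.sum()
    b = rng.choice(N, p=w)
    traj = np.empty(T)
    for t in reversed(range(T)):
        traj[t] = x[t, b]
        b = a[t, b]
    return traj
```

Iterating this kernel, each time feeding the returned trajectory back in as the new reference, yields the PGAS Markov chain on state trajectories.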
Identification of Gaussian Process State-Space Models with Particle Stochastic Approximation EM
Abstract

Cited by 3 (1 self)
Abstract: Gaussian process state-space models (GP-SSMs) are a very flexible family of models of nonlinear dynamical systems. They comprise a Bayesian nonparametric representation of the dynamics of the system and additional (hyper)parameters governing the properties of this nonparametric representation. The Bayesian formalism enables systematic reasoning about the uncertainty in the system dynamics. We present an approach to maximum likelihood identification of the parameters in GP-SSMs, while retaining the full nonparametric description of the dynamics. The method is based on a stochastic approximation version of the EM algorithm that employs recent developments in particle Markov chain Monte Carlo for efficient identification.
Particle filter-based Gaussian process optimisation for parameter inference
 In Proceedings of the 19th World Congress of the International Federation of Automatic Control (IFAC). Cape Town, South Africa
, 2014
Abstract

Cited by 1 (1 self)
We propose a novel method for maximum likelihood-based parameter inference in nonlinear and/or non-Gaussian state-space models. The method is an iterative procedure with three steps. At each iteration a particle filter is used to estimate the value of the log-likelihood function at the current parameter iterate. Using these log-likelihood estimates, a surrogate objective function is created by utilizing a Gaussian process model. Finally, we use a heuristic procedure to obtain a revised parameter iterate, providing an automatic trade-off between exploration and exploitation of the surrogate model. The method is profiled on two state-space models and shows good performance in terms of both accuracy and computational cost.
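The three-step loop in this abstract can be sketched as follows. Here a toy noisy objective stands in for the particle-filter log-likelihood estimate, the tiny numpy GP and the upper-confidence-bound acquisition rule are assumptions of this sketch (the paper uses its own heuristic rule), and all names are invented for illustration.

```python
import numpy as np

def gp_fit_predict(X, y, Xs, ell=0.5, sf2=1.0, sn2=0.05):
    """Minimal 1-D GP regression with a squared-exponential kernel."""
    def k(a, b):
        return sf2 * np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)
    K = k(X, X) + sn2 * np.eye(len(X))            # noisy training covariance
    Ks = k(Xs, X)
    mu = Ks @ np.linalg.solve(K, y)               # posterior mean on the grid
    var = sf2 - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)
    return mu, np.maximum(var, 1e-12)

def surrogate_ml_search(noisy_loglik, grid, n_iters=15, rng=None):
    """Iterate: estimate the log-likelihood at the current iterate,
    refit the GP surrogate, pick the next iterate by a UCB rule."""
    rng = rng or np.random.default_rng(1)
    thetas = [float(rng.choice(grid))]
    lls = [noisy_loglik(thetas[0], rng)]
    for _ in range(n_iters):
        mu, var = gp_fit_predict(np.array(thetas), np.array(lls), grid)
        ucb = mu + np.sqrt(var)                   # explore/exploit trade-off
        theta_next = float(grid[np.argmax(ucb)])
        thetas.append(theta_next)
        lls.append(noisy_loglik(theta_next, rng))
    return thetas[int(np.argmax(lls))], thetas, lls
```

In the actual method, `noisy_loglik` would run a particle filter over the data at the given parameter value; the GP surrogate makes each expensive evaluation count toward choosing the next iterate.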
Metropolis-Hastings within partially collapsed Gibbs samplers
Journal of Computational and Graphical Statistics (submitted)
, 2014
Abstract

Cited by 1 (1 self)
The Partially Collapsed Gibbs (PCG) sampler offers a new strategy for improving the convergence of a Gibbs sampler. PCG achieves faster convergence by reducing the conditioning in some of the draws of its parent Gibbs sampler. Although this can significantly improve convergence, care must be taken to ensure that the stationary distribution is preserved. The conditional distributions sampled in a PCG sampler may be incompatible, and permuting their order may upset the stationary distribution of the chain. Extra care must be taken when Metropolis-Hastings (MH) updates are used in some or all of the updates. Reducing the conditioning in an MH within Gibbs sampler can change the stationary distribution, even when the PCG sampler would work perfectly if MH were not used. In fact, a number of samplers of this sort that have been advocated in the literature do not actually have the target stationary distribution. In this article, we illustrate the challenges that may arise when using MH within a PCG sampler and develop a general strategy for using such updates while maintaining the desired stationary distribution. Theoretical arguments provide guidance when choosing between different MH within PCG sampling schemes. Finally, we illustrate the MH within PCG sampler and its computational advantage using several examples from our applied work.
NONLINEAR SYSTEM IDENTIFICATION USING PARTICLE FILTERS
Abstract
Particle filters are computational methods that enable systematic inference in nonlinear/non-Gaussian state-space models, and they constitute the most popular sequential Monte Carlo (SMC) methods. This is a relatively recent development, and the aim here is to provide a brief exposition of these SMC methods and how they are key enabling algorithms in solving nonlinear system identification problems. The particle filters are important for both frequentist (maximum likelihood) and Bayesian nonlinear system identification.

Index Terms — Particle filter, particle smoother, sequential Monte Carlo, maximum likelihood, Bayesian, MCMC, particle MCMC and backward simulation.
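For concreteness, a minimal bootstrap particle filter (the simplest of the SMC methods this abstract refers to) can be sketched as below. The function names and the multinomial resampling scheme are assumptions of this sketch, not details from the paper.

```python
import numpy as np

def bootstrap_pf(y, propagate, loglik_fn, init_fn, n_particles=500, rng=None):
    """Bootstrap particle filter: propagate particles through the dynamics,
    weight them by the observation likelihood, then resample.
    Returns an estimate of the data log-likelihood and the filtered means."""
    rng = rng or np.random.default_rng(0)
    T = len(y)
    x = init_fn(n_particles, rng)                 # particles for the initial state
    loglik = 0.0
    means = np.empty(T)
    for t in range(T):
        if t > 0:
            x = propagate(x, rng)                 # sample from the state dynamics
        logw = loglik_fn(y[t], x)                 # observation log-weights
        m = logw.max()
        w = np.exp(logw - m)
        loglik += m + np.log(w.mean())            # accumulate the likelihood estimate
        w /= w.sum()
        means[t] = np.sum(w * x)                  # weighted filtered mean
        idx = rng.choice(n_particles, size=n_particles, p=w)  # multinomial resampling
        x = x[idx]
    return loglik, means
```

The log-likelihood estimate returned here is exactly the quantity that maximum likelihood identification methods (and particle MCMC methods) build on.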
Particle filtering based identification for autonomous nonlinear ODE models
Abstract
Abstract: This paper presents a new black-box algorithm for identification of a nonlinear autonomous system in stable periodic motion. The particle filtering based algorithm models the signal as the output of a continuous-time second-order ordinary differential equation (ODE). The model is selected based on previous work which proves that a second-order ODE is sufficient to model a wide class of nonlinear systems with periodic modes of motion, including systems that are described by higher-order ODEs. Such systems are common in systems biology. The proposed algorithm is applied to data from the well-known Hodgkin-Huxley neuron model. This is a challenging problem since the Hodgkin-Huxley model is a fourth-order model, but has a mode of oscillation in a second-order subspace. The numerical experiments show that the proposed algorithm does indeed solve the problem.
Sequential Monte Carlo Methods for System Identification
, 2015
Abstract
One of the key challenges in identifying nonlinear and possibly non-Gaussian state-space models (SSMs) is the intractability of estimating the system state. Sequential Monte Carlo (SMC) methods, such as the particle filter (introduced more than two decades ago), provide numerical solutions to the nonlinear state estimation problems arising in SSMs. When combined with additional identification techniques, these algorithms provide solid solutions to the nonlinear system identification problem. We describe two general strategies for creating such combinations and discuss why SMC is a natural tool for implementing these strategies.
Identification of jump Markov linear models using particle filters
, 2014
Abstract
Jump Markov linear models consist of a finite number of linear state-space models and a discrete variable encoding the jumps (or switches) between the different linear models. Identifying jump Markov linear models makes for a challenging problem lacking an analytical solution. We derive a new expectation maximization (EM) type algorithm that produces maximum likelihood estimates of the model parameters. Our development hinges upon recent progress in combining particle filters with Markov chain Monte Carlo methods in solving the nonlinear state smoothing problem inherent in the EM formulation. Key to our development is that we exploit a conditionally linear Gaussian substructure in the model, allowing for an efficient algorithm.