Stability properties of some particle filters
The Annals of Applied Probability, 2013
On the Use of Backward Simulation in the Particle Gibbs Sampler

Cited by 10 (6 self)

Abstract: The particle Gibbs (PG) sampler was introduced in [1] as a way to incorporate a particle filter (PF) into a Markov chain Monte Carlo (MCMC) sampler. The resulting method was shown to be an efficient tool for joint Bayesian parameter and state inference in nonlinear, non-Gaussian state-space models. However, the mixing of the PG kernel can be very poor when there is severe degeneracy in the PF. Hence, the success of the PG sampler relies heavily on the often unrealistic assumption that we can implement a PF without suffering from any considerable degeneracy. As pointed out by Whiteley [2] in the discussion following [1], however, the mixing can be improved by adding a backward simulation step to the PG sampler. Here we investigate this further, derive an explicit PG sampler with backward simulation (denoted PGBSi) and show that this is indeed a valid MCMC method. Furthermore, we show in a numerical example that backward simulation can lead to a considerable increase in performance over the standard PG sampler.

Index Terms — Particle Markov chain Monte Carlo, particle filter, particle Gibbs, backward simulation, Gibbs sampling.
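The mechanism this abstract describes — a conditional particle filter that retains a reference trajectory, followed by a backward-simulation draw of a new trajectory — can be sketched minimally as below. The scalar AR(1)-plus-noise model, all parameter values, and the particle count are illustrative assumptions, not the paper's setup.

```python
# Sketch of one PG sweep with backward simulation (PGBSi) for the toy
# model x_t = A*x_{t-1} + N(0,Q), y_t = x_t + N(0,R). All settings are
# assumed for illustration.
import math, random

random.seed(0)
A, Q, R, T, N = 0.9, 1.0, 0.5, 25, 100  # model/algorithm parameters (assumed)

def normpdf(x, m, s2):
    return math.exp(-(x - m) ** 2 / (2 * s2)) / math.sqrt(2 * math.pi * s2)

def pgbsi_sweep(y, x_ref):
    """One conditional SMC pass followed by a backward-simulation draw."""
    parts = [[0.0] * N for _ in range(T)]
    wgts = [[0.0] * N for _ in range(T)]
    for t in range(T):
        for i in range(N):
            if i == N - 1:
                parts[t][i] = x_ref[t]  # retained reference trajectory
            elif t == 0:
                parts[t][i] = random.gauss(0.0, math.sqrt(Q))
            else:
                # multinomial resampling of an ancestor, then propagation
                anc = random.choices(range(N), weights=wgts[t - 1])[0]
                parts[t][i] = random.gauss(A * parts[t - 1][anc], math.sqrt(Q))
            wgts[t][i] = normpdf(y[t], parts[t][i], R)
    # backward simulation: draw x_T from the final weights, then sweep back
    j = random.choices(range(N), weights=wgts[T - 1])[0]
    traj = [parts[T - 1][j]]
    for t in range(T - 2, -1, -1):
        bw = [wgts[t][i] * normpdf(traj[0], A * parts[t][i], Q) for i in range(N)]
        j = random.choices(range(N), weights=bw)[0]
        traj.insert(0, parts[t][j])
    return traj

# simulate data and run two sweeps starting from a zero reference trajectory
x, y = 0.0, []
for t in range(T):
    x = A * x + random.gauss(0, math.sqrt(Q))
    y.append(x + random.gauss(0, math.sqrt(R)))
ref = [0.0] * T
for _ in range(2):
    ref = pgbsi_sweep(y, ref)
print(len(ref))  # -> 25, one full state trajectory per sweep
```

Each sweep returns a complete trajectory that serves as the reference for the next sweep; the backward pass is what breaks the path degeneracy the abstract warns about.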
Uniform stability of a particle approximation of the optimal filter derivative
SIAM J. Control Optim., 2015

Cited by 9 (2 self)

Abstract: Particle methods, also known as Sequential Monte Carlo methods, are a principled set of algorithms used to numerically approximate the optimal filter in nonlinear, non-Gaussian state-space models. However, when performing maximum likelihood parameter inference in state-space models, it is also necessary to approximate the derivative of the optimal filter with respect to the parameter of the model. References [G. Poyiadjis, A. Doucet, and S. S. Singh, Particle methods for ...
On Particle Methods for Parameter Estimation in State-Space Models
arXiv Working Paper, 2014

Cited by 6 (1 self)

Abstract: Nonlinear, non-Gaussian state-space models are ubiquitous in statistics, econometrics, information engineering and signal processing. Particle methods, also known as Sequential Monte Carlo (SMC) methods, provide reliable numerical approximations to the associated state inference problems. However, in most applications, the state-space model of interest also depends on unknown static parameters that need to be estimated from the data. In this context, standard particle methods fail and it is necessary to rely on more sophisticated algorithms. The aim of this paper is to present a comprehensive review of particle methods that have been proposed to perform static parameter estimation in state-space models. We discuss the advantages and limitations of these methods and illustrate their performance on simple models.

Key words and phrases: Bayesian inference, maximum likelihood inference, particle filtering, Sequential Monte Carlo, state-space models.
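The basic building block underlying the parameter-estimation methods this review surveys is the bootstrap particle filter's unbiased likelihood estimate. A minimal sketch for an assumed scalar linear-Gaussian model (all settings here are illustrative, not from the paper):

```python
# Bootstrap particle filter returning an estimate of log p(y_{1:T} | a, q, r)
# for the assumed toy model x_t = a*x_{t-1} + N(0,q), y_t = x_t + N(0,r).
import math, random

random.seed(1)

def bootstrap_pf_loglik(y, a, q, r, n=200):
    xs = [random.gauss(0.0, math.sqrt(q)) for _ in range(n)]
    ll = 0.0
    for yt in y:
        # weight by the observation density
        ws = [math.exp(-(yt - x) ** 2 / (2 * r)) / math.sqrt(2 * math.pi * r)
              for x in xs]
        ll += math.log(sum(ws) / n)                # incremental likelihood
        xs = random.choices(xs, weights=ws, k=n)   # multinomial resampling
        xs = [random.gauss(a * x, math.sqrt(q)) for x in xs]  # propagate
    return ll

# simulate a short series; the true parameter should fit better than a wrong one
a_true, q, r = 0.8, 1.0, 0.5
x, y = 0.0, []
for _ in range(50):
    x = a_true * x + random.gauss(0, math.sqrt(q))
    y.append(x + random.gauss(0, math.sqrt(r)))
print(bootstrap_pf_loglik(y, 0.8, q, r) > bootstrap_pf_loglik(y, -0.8, q, r))
```

Plugging such an estimate into a Metropolis-Hastings acceptance ratio gives the particle marginal MH samplers discussed in this literature.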
Identification of Hammerstein–Wiener Models
2012

Cited by 5 (3 self)

Abstract: This paper develops and illustrates a new maximum-likelihood based method for the identification of Hammerstein–Wiener model structures. A central aspect is that a very general situation is considered, wherein multivariable data, non-invertible Hammerstein and Wiener nonlinearities, and coloured stochastic disturbances both before and after the Wiener nonlinearity are all catered for. The method developed here addresses the blind Wiener estimation problem as a special case.
Rao-Blackwellized particle smoothers for mixed linear/nonlinear state-space models
2011
Long-term stability of sequential Monte Carlo methods under verifiable conditions
2013
Approximate Bayesian computation for smoothing
Stoch. Anal. Appl., 2014

Cited by 3 (3 self)

Abstract: We consider a method for approximate inference in hidden Markov models (HMMs). The method circumvents the need to evaluate conditional densities of observations given the hidden states. It may be considered an instance of Approximate Bayesian Computation (ABC) and it involves the introduction of auxiliary variables valued in the same space as the observations. The quality of the approximation may be controlled to arbitrary precision through a parameter ε > 0. We provide theoretical results which quantify, in terms of ε, the ABC error in approximation of expectations of additive functionals with respect to the smoothing distributions. Under regularity assumptions, this error is O(nε), where n is the number of time steps over which smoothing is performed. For numerical implementation, we adopt the forward-only sequential Monte Carlo (SMC) scheme of [16] and quantify the combined error from the ABC and SMC approximations. This provides some of the first quantitative results for ABC methods which jointly treat the ABC and simulation errors, with a finite number of data and simulated samples. When the HMM has unknown static parameters, we consider particle Markov chain Monte Carlo [2] (PMCMC) methods for batch statistical inference.
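The core ABC device in this abstract — replacing an intractable observation density with simulation of auxiliary pseudo-observations, weighted by a kernel of width ε — can be illustrated with one filter step. The model, kernel, and tolerance below are illustrative assumptions, not the paper's construction.

```python
# One ABC weight/resample/propagate step for the assumed toy model
# x_t = 0.9*x_{t-1} + N(0,1), y_t = x_t + N(0,1). Only *simulation* from
# the observation model is used, never its density.
import math, random

random.seed(2)
EPS, N = 0.5, 500  # ABC tolerance and particle count (assumed)

def abc_filter_step(xs, y):
    us = [x + random.gauss(0, 1.0) for x in xs]          # pseudo-observations
    ws = [1.0 if abs(y - u) < EPS else 0.0 for u in us]  # uniform ABC kernel
    if sum(ws) == 0:                                     # guard a degenerate step
        ws = [1.0] * len(xs)
    xs = random.choices(xs, weights=ws, k=len(xs))       # resample
    return [0.9 * x + random.gauss(0, 1.0) for x in xs]  # propagate

xs = [random.gauss(0, 1.0) for _ in range(N)]
for y in [0.5, 1.0, 0.3]:
    xs = abc_filter_step(xs, y)
print(len(xs))  # -> 500
```

Shrinking EPS tightens the approximation (the O(nε) error above) at the cost of discarding more pseudo-observations per step.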
Particle Metropolis–Hastings Using Langevin Dynamics

Cited by 3 (2 self)

Abstract: Particle Markov chain Monte Carlo (PMCMC) samplers allow for routine inference of parameters and states in challenging nonlinear problems. A common choice for the parameter proposal is a simple random walk sampler, which can scale poorly with the number of parameters. In this paper, we propose to use log-likelihood gradients, i.e. the score, in the construction of the proposal, akin to the Langevin Monte Carlo method, but adapted to the PMCMC framework. This can be thought of as a way to guide a random walk proposal by using drift terms that are proportional to the score function. The method is successfully applied to a stochastic volatility model and the drift term exhibits intuitive behaviour.
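The proposal described here — a random walk whose drift is proportional to the score — is the Langevin (MALA) construction. A minimal self-contained sketch on a toy Gaussian target, where the exact gradient stands in for the particle-filter score estimate the paper would use (target, step size, and chain length are all illustrative assumptions):

```python
# Metropolis-Hastings with a Langevin proposal: drift = (h/2) * score.
# The quadratic toy target N(2, 1) and step size are assumed.
import math, random

random.seed(3)
H = 0.1  # step size (assumed)

def score(theta):
    # stand-in for a particle-filter score estimate; here the exact
    # gradient of log N(theta | 2, 1), i.e. -(theta - 2)
    return -(theta - 2.0)

def log_target(t):
    return -(t - 2.0) ** 2 / 2

def langevin_propose(theta):
    return theta + 0.5 * H * score(theta) + math.sqrt(H) * random.gauss(0, 1)

def log_q(to, frm):
    # proposal log-density (up to a constant), needed for the MH correction
    m = frm + 0.5 * H * score(frm)
    return -(to - m) ** 2 / (2 * H)

theta, chain = 0.0, []
for _ in range(5000):
    prop = langevin_propose(theta)
    logr = (log_target(prop) - log_target(theta)
            + log_q(theta, prop) - log_q(prop, theta))
    if math.log(random.random()) < logr:
        theta = prop
    chain.append(theta)
mean = sum(chain[1000:]) / len(chain[1000:])
print(round(mean))  # chain centres near the target mean of 2
```

Because the proposal is asymmetric, the MH ratio must include the forward and reverse proposal densities; in the PMCMC setting both the likelihood and the score would be noisy particle estimates.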
Identification of Mixed Linear/Nonlinear State-Space Models

Cited by 3 (2 self)

Abstract: The primary contribution of this paper is an algorithm capable of identifying parameters in certain mixed linear/nonlinear state-space models containing conditionally linear Gaussian substructures. More specifically, we employ the standard maximum likelihood framework and derive an expectation maximization (EM) type algorithm. This involves a nonlinear smoothing problem for the state variables, which for the conditionally linear Gaussian system can be efficiently solved using a so-called Rao-Blackwellized particle smoother (RBPS). As a secondary contribution of this paper, we extend an existing RBPS to handle the fully interconnected model under study.