Results 1–10 of 27
A survey of sequential Monte Carlo methods for economics and finance
, 2009
Abstract

Cited by 34 (7 self)
This paper serves as an introduction and survey for economists to the field of sequential Monte Carlo methods, which are also known as particle filters. Sequential Monte Carlo methods are simulation-based algorithms used to compute the high-dimensional and/or complex integrals that arise regularly in applied work. These methods are becoming increasingly popular in economics and finance, from dynamic stochastic general equilibrium models in macroeconomics to option pricing. The objective of this paper is to explain the basics of the methodology, provide references to the literature, and cover some of the theoretical results that justify the methods in practice.
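The simulation-based approximation the survey describes can be made concrete with a minimal bootstrap particle filter. The local level model, parameter values, and function names below are my own illustrative choices, not from the paper; this is a sketch of the generic propagate–weight–resample recursion, not a definitive implementation:

```python
import numpy as np

def bootstrap_particle_filter(y, n_particles=1000, sigma_x=1.0, sigma_y=1.0, seed=0):
    """Bootstrap particle filter for an illustrative local level model:
        x_t = x_{t-1} + sigma_x * v_t,   y_t = x_t + sigma_y * w_t,
    with standard normal noise. Returns the filtered state means and an
    estimate of the log-likelihood (the kind of integral SMC approximates).
    """
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, sigma_x, n_particles)  # initial particle cloud
    log_lik = 0.0
    means = []
    for obs in y:
        # 1. propagate each particle through the state transition
        x = x + rng.normal(0.0, sigma_x, n_particles)
        # 2. weight particles by the Gaussian observation density
        log_w = -0.5 * ((obs - x) / sigma_y) ** 2
        m = log_w.max()
        w = np.exp(log_w - m)                  # stabilised weights
        # accumulate the log-likelihood increment log( (1/N) sum_i p(y_t | x_t^i) )
        log_lik += m + np.log(w.mean()) - 0.5 * np.log(2 * np.pi * sigma_y**2)
        w /= w.sum()
        means.append(float(np.sum(w * x)))     # filtered posterior mean
        # 3. multinomial resampling to avoid weight degeneracy
        x = rng.choice(x, size=n_particles, p=w, replace=True)
    return np.array(means), log_lik
```

As the number of particles grows, the filtered means and the log-likelihood estimate converge to the exact quantities, which for this toy model could be cross-checked against a Kalman filter.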
Efficient Bayesian Inference for Switching State-Space Models using Particle Markov Chain Monte Carlo Methods
, 2010
Abstract

Cited by 14 (1 self)
Switching state-space models (SSSM) are a popular class of time series models that have found many applications in statistics, econometrics and advanced signal processing. Bayesian inference for these models typically relies on Markov chain Monte Carlo (MCMC) techniques. However, even sophisticated MCMC methods dedicated to SSSM can prove quite inefficient as they update potentially strongly correlated variables one at a time. Particle Markov chain Monte Carlo (PMCMC) methods are a recently developed class of MCMC algorithms which use particle filters to build efficient proposal distributions in high dimensions [1]. The existing PMCMC methods of [1] are applicable to SSSM, but are restricted to employing standard particle filtering techniques. Yet, in the context of SSSM, much more efficient particle techniques have been developed [22, 23, 24]. In this paper, we extend the PMCMC framework to enable the use of these efficient particle methods within MCMC. We demonstrate the resulting generic methodology on a variety of examples, including a multiple change-point model for well-log data and a model for U.S./U.K. exchange rate data. These new PMCMC algorithms are shown experimentally to outperform state-of-the-art MCMC techniques for a fixed computational complexity. Additionally, they can be easily parallelized [39], which allows further substantial gains.
On Particle Methods for Parameter Estimation in State-Space Models, arXiv Working Paper
, 2014
Abstract

Cited by 6 (1 self)
Nonlinear non-Gaussian state-space models are ubiquitous in statistics, econometrics, information engineering and signal processing. Particle methods, also known as Sequential Monte Carlo (SMC) methods, provide reliable numerical approximations to the associated state inference problems. However, in most applications, the state-space model of interest also depends on unknown static parameters that need to be estimated from the data. In this context, standard particle methods fail and it is necessary to rely on more sophisticated algorithms. The aim of this paper is to present a comprehensive review of particle methods that have been proposed to perform static parameter estimation in state-space models. We discuss the advantages and limitations of these methods and illustrate their performance on simple models. Key words and phrases: Bayesian inference, maximum likelihood inference, particle filtering, Sequential Monte Carlo, state-space models.
Marginal likelihood for Markov-switching and change-point GARCH models
, 2011
Abstract

Cited by 3 (0 self)
GARCH volatility models with fixed parameters are too restrictive for long time series due to breaks in the volatility process. Flexible alternatives are Markov-switching GARCH and change-point GARCH models. They require estimation by MCMC methods due to the path dependence problem. An unsolved issue is the computation of their marginal likelihood, which is essential for determining the number of regimes or change-points. We solve the problem by using particle MCMC, a technique proposed by Andrieu, Doucet, and Holenstein (2010). We examine the performance of this new method on simulated data, and we illustrate its use on several return series.
Martingale unobserved component models
, 2013
Abstract

Cited by 3 (0 self)
I discuss models which allow the local level model, which rationalised exponentially weighted moving averages, to have a time-varying signal/noise ratio. I call this a martingale component model. This makes the rate of discounting of data local. I show how to handle such models effectively using an auxiliary particle filter which deploys M Kalman filters run in parallel, competing against one another. Here one thinks of M as being 1,000 or more. The model is applied to inflation forecasting. The model generalises to unobserved component models where Gaussian shocks are replaced by martingale difference sequences.
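The building block behind this abstract is the Kalman filter for the local level model with a fixed signal/noise ratio, whose steady-state gain is exactly an exponentially weighted moving average. The sketch below shows that fixed-ratio filter only; the parameter names are mine, and the paper's contribution — making q/r time-varying via M competing filters — is not reproduced here:

```python
import numpy as np

def kalman_local_level(y, q, r):
    """Kalman filter for the local level model
        x_t = x_{t-1} + eta_t,  eta_t ~ N(0, q),
        y_t = x_t + eps_t,      eps_t ~ N(0, r).
    Returns filtered state means and the exact log-likelihood.
    q/r is the (fixed) signal/noise ratio the abstract refers to.
    """
    m, p = 0.0, 1e6  # roughly diffuse initial state
    means, log_lik = [], 0.0
    for obs in y:
        # predict step
        p_pred = p + q
        # update step
        s = p_pred + r                      # innovation variance
        k = p_pred / s                      # Kalman gain
        innov = obs - m
        m = m + k * innov
        p = (1.0 - k) * p_pred
        log_lik += -0.5 * (np.log(2 * np.pi * s) + innov**2 / s)
        means.append(m)
    return np.array(means), log_lik
```

In steady state the gain k is constant, so the filtered mean is an EWMA of past observations with discount 1 - k; letting the ratio q/r vary over time, as the paper does, makes that discount rate local.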
Particle Markov Chain Monte Carlo
, 2009
Abstract

Cited by 3 (0 self)
... have emerged as the two main tools to sample from high-dimensional probability distributions. Although asymptotic convergence of MCMC algorithms is ensured under weak assumptions, their performance is unreliable when the proposal distributions used to explore the space are poorly chosen and/or if highly correlated variables are updated independently. In this thesis we propose a new Monte Carlo framework in which we build efficient high-dimensional proposal distributions using SMC methods. This allows us to design effective MCMC algorithms in complex scenarios where standard strategies fail. We demonstrate these algorithms on a number of example problems, including simulated tempering, a nonlinear non-Gaussian state-space model, and protein folding.
Particle Metropolis-Hastings using gradient and Hessian information
, 2014
Abstract

Cited by 2 (1 self)
Particle Metropolis-Hastings (PMH) allows for Bayesian parameter inference in nonlinear state-space models by combining Markov chain Monte Carlo (MCMC) and particle filtering. The latter is used to estimate the intractable likelihood. In its original formulation, PMH makes use of a marginal MCMC proposal for the parameters, typically a Gaussian random walk. However, this can lead to a poor exploration of the parameter space and an inefficient use of the generated particles. We propose a number of alternative versions of PMH that incorporate gradient and Hessian information about the posterior into the proposal. This information is obtained more or less as a by-product of the likelihood estimation. Indeed, we show how to estimate the required information using a fixed-lag particle smoother, with a computational cost growing linearly in the number of particles. We conclude that the proposed methods can: (i) decrease the length of the burn-in phase, (ii) increase the mixing of the Markov chain at the stationary phase, and (iii) make the proposal distribution scale invariant, which simplifies tuning.
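The baseline this abstract improves on — marginal Metropolis-Hastings with a Gaussian random walk proposal and a particle estimate of the likelihood — can be sketched in a few lines. The model (a local level model with unknown observation noise), the flat prior on the log-parameter, and all names are my own illustrative assumptions; the paper's gradient/Hessian proposals and fixed-lag smoother are not implemented here:

```python
import numpy as np

def pf_loglik(y, sigma_y, n=200, rng=None):
    """Bootstrap particle filter log-likelihood estimate for a local level
    model with unit state noise (illustrative stand-in for the intractable
    likelihood)."""
    rng = rng or np.random.default_rng()
    x = rng.normal(0.0, 1.0, n)
    ll = 0.0
    for obs in y:
        x = x + rng.normal(0.0, 1.0, n)                        # propagate
        lw = -0.5 * ((obs - x) / sigma_y) ** 2 - np.log(sigma_y)
        m = lw.max()
        w = np.exp(lw - m)
        ll += m + np.log(w.mean()) - 0.5 * np.log(2 * np.pi)   # likelihood increment
        x = rng.choice(x, size=n, p=w / w.sum())               # resample
    return ll

def particle_mh(y, n_iters=200, step=0.2, seed=0):
    """Marginal MH with a Gaussian random walk on theta = log(sigma_y),
    a flat prior on theta (assumption), and the particle likelihood
    estimate standing in for the exact likelihood."""
    rng = np.random.default_rng(seed)
    theta = 0.0
    ll = pf_loglik(y, np.exp(theta), rng=rng)
    chain = []
    for _ in range(n_iters):
        prop = theta + step * rng.normal()
        ll_prop = pf_loglik(y, np.exp(prop), rng=rng)
        # accept/reject with the usual MH ratio on the estimated likelihoods
        if np.log(rng.uniform()) < ll_prop - ll:
            theta, ll = prop, ll_prop
        chain.append(np.exp(theta))
    return np.array(chain)
```

Because the likelihood estimate is unbiased, this chain still targets the exact posterior; the paper's point is that the blind random-walk proposal above wastes the particles, whereas gradient and Hessian information extracted from the same particle system can shape the proposal at little extra cost.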
Generalized Method of Moments with Latent Variables
, 2012
Abstract

Cited by 2 (0 self)
The contribution of generalized method of moments (Hansen and Singleton, 1982) was to allow frequentist inference regarding the parameters of a nonlinear structural model without having to solve the model. Provided there were no latent variables. The contribution of this paper is the same. With latent variables.
Parallel Implementation of Particle MCMC Methods on a GPU
Abstract

Cited by 2 (0 self)
This paper examines the problem of estimating the parameters describing system models of quite general nonlinear and multivariable form. The approach is a computational one in which quantities that are intractable to evaluate exactly are approximated by sample averages from randomized algorithms. The main contribution is to illustrate the viability and utility of this approach by examining how high computational loads can be simply managed using commodity hardware. The proposed algorithms and solution architectures are profiled on concrete examples.