Results 1–10 of 40
Markov chain Monte Carlo convergence diagnostics
 JASA
, 1996
Abstract

Cited by 371 (6 self)
A critical issue for users of Markov Chain Monte Carlo (MCMC) methods in applications is how to determine when it is safe to stop sampling and use the samples to estimate characteristics of the distribution of interest. Research into methods of computing theoretical convergence bounds holds promise for the future but currently has yielded relatively little that is of practical use in applied work. Consequently, most MCMC users address the convergence problem by applying diagnostic tools to the output produced by running their samplers. After giving a brief overview of the area, we provide an expository review of thirteen convergence diagnostics, describing the theoretical basis and practical implementation of each. We then compare their performance in two simple models and conclude that all the methods can fail to detect the sorts of convergence failure they were designed to identify. We thus recommend a combination of strategies aimed at evaluating and accelerating MCMC sampler convergence, including applying diagnostic procedures to a small number of parallel chains, monitoring autocorrelations and cross-correlations, and modifying parameterizations or sampling algorithms appropriately. We emphasize, however, that it is not possible to say with certainty that a finite sample from an MCMC algorithm is representative of an underlying stationary distribution.
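The parallel-chain strategy this abstract recommends can be illustrated with a minimal sketch of the Gelman–Rubin potential scale reduction factor (one of the diagnostics commonly applied to parallel chains); the chain data below are synthetic stand-ins, not output of a real sampler.

```python
import numpy as np

def gelman_rubin(chains):
    """Potential scale reduction factor (R-hat) for parallel chains.

    chains: array of shape (m, n) -- m parallel chains of length n.
    Values near 1 suggest (but, as the paper stresses, never guarantee)
    convergence; values well above 1 indicate the chains disagree.
    """
    chains = np.asarray(chains, dtype=float)
    m, n = chains.shape
    chain_means = chains.mean(axis=1)
    B = n * chain_means.var(ddof=1)           # between-chain variance
    W = chains.var(axis=1, ddof=1).mean()     # within-chain variance
    var_hat = (n - 1) / n * W + B / n         # pooled variance estimate
    return np.sqrt(var_hat / W)

rng = np.random.default_rng(0)
# Chains that have mixed: all draws from the same N(0, 1) target.
mixed = rng.normal(0.0, 1.0, size=(4, 1000))
# Chains stuck in different modes: R-hat is well above 1.
stuck = rng.normal(0.0, 1.0, size=(4, 1000)) + np.array([[0.0], [0.0], [5.0], [5.0]])
print(gelman_rubin(mixed))  # close to 1
print(gelman_rubin(stuck))  # substantially larger than 1
```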
MCMC methods for continuous-time financial econometrics

, 2003
Abstract

Cited by 41 (1 self)
This chapter develops Markov Chain Monte Carlo (MCMC) methods for Bayesian inference in continuous-time asset pricing models. The Bayesian solution to the inference problem is the distribution of parameters and latent variables conditional on observed data, and MCMC methods provide a tool for exploring these high-dimensional, complex distributions. We first provide a description of the foundations and mechanics of MCMC algorithms. This includes a discussion of the Clifford-Hammersley theorem, the Gibbs sampler, the Metropolis-Hastings algorithm, and theoretical convergence properties of MCMC algorithms. We next provide a tutorial on building MCMC algorithms for a range of continuous-time asset pricing models. We include detailed examples for equity price models, option pricing models, term structure models, and regime-switching models. Finally, we discuss the issue of sequential Bayesian inference, both for parameters and state variables.
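One of the building blocks the chapter describes, the Metropolis-Hastings algorithm, can be sketched in a few lines; the standard-normal target below is a hypothetical stand-in for a posterior, not a model from the chapter.

```python
import numpy as np

def rw_metropolis(log_target, x0, n_samples, step=1.0, rng=None):
    """Random-walk Metropolis-Hastings: propose x' = x + step * z with
    z ~ N(0, 1) and accept with probability min(1, pi(x') / pi(x)).
    log_target is the unnormalized log density of the target pi."""
    if rng is None:
        rng = np.random.default_rng()
    x = x0
    lp = log_target(x)
    out = np.empty(n_samples)
    for i in range(n_samples):
        prop = x + step * rng.standard_normal()
        lp_prop = log_target(prop)
        # Symmetric proposal, so the Hastings ratio is just pi(x') / pi(x).
        if np.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
        out[i] = x
    return out

draws = rw_metropolis(lambda x: -0.5 * x * x, x0=0.0, n_samples=20000,
                      step=2.0, rng=np.random.default_rng(1))
burn = draws[2000:]                  # discard burn-in
print(burn.mean(), burn.std())       # near 0 and 1 for a N(0, 1) target
```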
Geometric Ergodicity of Gibbs and Block Gibbs Samplers for a Hierarchical Random Effects Model
, 1998
Abstract

Cited by 40 (11 self)
We consider fixed scan Gibbs and block Gibbs samplers for a Bayesian hierarchical random effects model with proper conjugate priors. A drift condition given in Meyn and Tweedie (1993, Chapter 15) is used to show that these Markov chains are geometrically ergodic. Showing that a Gibbs sampler is geometrically ergodic is the first step towards establishing central limit theorems, which can be used to approximate the error associated with Monte Carlo estimates of posterior quantities of interest. Thus, our results will be of practical interest to researchers using these Gibbs samplers for Bayesian data analysis.
Key words and phrases: Bayesian model, Central limit theorem, Drift condition, Markov chain, Monte Carlo, Rate of convergence, Variance components
AMS 1991 subject classifications: Primary 60J27, secondary 62F15
1 Introduction
Gelfand and Smith (1990, Section 3.4) introduced the Gibbs sampler for the hierarchical one-way random effects model with proper conjugate priors. Rosen...
MCMC Methods for Financial Econometrics
 Handbook of Financial Econometrics
, 2002
Abstract

Cited by 36 (4 self)
This chapter discusses Markov Chain Monte Carlo (MCMC) based methods for estimating continuous-time asset pricing models. We describe the Bayesian approach to empirical asset pricing, the mechanics of MCMC algorithms and the strong theoretical underpinnings of MCMC algorithms. We provide a tutorial on building MCMC algorithms and show how to estimate equity price models with factors such as stochastic expected returns, stochastic volatility and jumps, multifactor term structure models with stochastic volatility, time-varying central tendency or jumps and regime-switching models.
Estimation and Inference via Bayesian Simulation: An Introduction to Markov Chain Monte Carlo
, 2000
Gibbs Sampling
 Journal of the American Statistical Association
, 1995
Abstract

Cited by 28 (0 self)
... ∫ f(θ) dθ. To marginalize, say for θ_i, requires h(θ_i) = ∫ f(θ) dθ_(i), where θ_(i) denotes all components of θ save θ_i. To obtain E g(θ_i) requires similar integration; to obtain the marginal distribution of, say, g(θ) or its expectation requires similar integration. When p is large (as it will be in the applications we envision) such integration is analytically infeasible (the so-called "curse of dimensionality"). Gibbs sampling provides a Monte Carlo approach for carrying out such integrations. In what sorts of settings would we have need to mar...
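The Monte Carlo integration this snippet motivates can be sketched with a toy Gibbs sampler: a bivariate normal with correlation ρ, where each full conditional is θ_i | θ_j ~ N(ρ θ_j, 1 − ρ²). The draws of θ_1 alone approximate its marginal h(θ_1), so E g(θ_1) is estimated by an average rather than an analytic integral. The target and ρ here are illustrative choices, not an example from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
rho, n = 0.8, 50000
s = np.sqrt(1 - rho ** 2)    # conditional standard deviation

theta1, theta2 = 0.0, 0.0
draws = np.empty(n)
for i in range(n):
    # Alternate draws from the two full conditionals.
    theta1 = rng.normal(rho * theta2, s)
    theta2 = rng.normal(rho * theta1, s)
    draws[i] = theta1        # keep theta_1 only: a sample from h(theta_1)

# Marginal of theta_1 is N(0, 1), so these averages replace the integrals.
print(draws.mean(), draws.var())  # near 0 and 1
```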
Rates of Convergence for Data Augmentation on Finite Sample Spaces
 Ann. Appl. Prob
, 1993
Abstract

Cited by 24 (11 self)
this paper, we examine this rate of convergence more carefully. We restrict our attention to the case where
A simulation approach to convergence rates for Markov chain Monte Carlo algorithms
 Stat. and Comput
, 1996
Abstract

Cited by 22 (10 self)
Markov chain Monte Carlo (MCMC) methods, including the Gibbs sampler and the Metropolis-Hastings algorithm, are very commonly used in Bayesian statistics for sampling from complicated, high-dimensional posterior distributions. A continuing source of uncertainty is how long such a sampler must be run in order to converge approximately to its target stationary distribution. Rosenthal (1995b) presents a method to compute rigorous theoretical upper bounds on the number of iterations required to achieve a specified degree of convergence in total variation distance by verifying drift and minorization conditions. We propose the use of auxiliary simulations to estimate the numerical values needed in Rosenthal's theorem. Our simulation method makes it possible to compute quantitative convergence bounds for models for which the requisite analytical computations would be prohibitively difficult or impossible. On the other hand, although our method appears to perform well in our example problems...
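The idea of estimating drift quantities by auxiliary simulation can be sketched on a toy chain (not one of the paper's examples): for the AR(1) chain X_{n+1} = aX_n + ε with ε ~ N(0, 1) and drift function V(x) = 1 + x², simulate E[V(X_1) | X_0 = x] on a grid of states and fit the constants λ, b in the drift inequality E[V(X_1) | x] ≤ λV(x) + b.

```python
import numpy as np

rng = np.random.default_rng(3)
a = 0.5                                   # AR(1) coefficient
grid = np.linspace(-10.0, 10.0, 21)       # starting states x
reps = 20000                              # auxiliary simulations per state

est = np.empty_like(grid)
for i, x in enumerate(grid):
    nxt = a * x + rng.standard_normal(reps)     # draws of X_1 given X_0 = x
    est[i] = np.mean(1.0 + nxt ** 2)            # Monte Carlo E[V(X_1) | x]

V = 1.0 + grid ** 2
lam, b = np.polyfit(V, est, 1)   # fit E[V(X_1) | x] ~ lam * V(x) + b
# Analytically E[V(X_1) | x] = a^2 V(x) + (2 - a^2), i.e. lam = 0.25, b = 1.75.
print(lam, b)
```

For this linear-Gaussian chain the fit recovers the exact drift constants; the point of the paper is that the same simulation recipe applies when no closed form exists.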
Possible biases induced by MCMC convergence diagnostics
, 1997
Abstract

Cited by 22 (2 self)
This paper is organised as follows. In Section 2, we present an oversimplified version of a convergence diagnostic, and study analytically its performance on certain simple Markov chains. We restrict ourselves primarily to chains which in fact produce i.i.d. samples from
Conditions for rapid mixing of parallel and simulated tempering on multimodal distributions
 Submitted, Annals of Applied Probability
, 2007
Abstract

Cited by 17 (4 self)
We give conditions under which a Markov chain constructed via parallel or simulated tempering is guaranteed to be rapidly mixing, which are applicable to a wide range of multimodal distributions arising in Bayesian statistical inference and statistical mechanics. We provide lower bounds on the spectral gaps of parallel and simulated tempering. These bounds imply a single set of sufficient conditions for rapid mixing of both techniques. A direct consequence of our results is rapid mixing of parallel and simulated tempering for several normal mixture models, and for the mean-field Ising model.
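The parallel tempering construction analyzed here can be sketched on a two-mode normal mixture in one dimension (an illustrative target in the spirit of the mixture models covered, not one taken from the paper): one chain per inverse temperature β, random-walk Metropolis updates within each temperature, and swap moves between adjacent temperatures.

```python
import numpy as np

rng = np.random.default_rng(4)

def log_target(x):
    # Equal-weight normal mixture with modes at -4 and +4.
    return np.logaddexp(-0.5 * (x - 4) ** 2, -0.5 * (x + 4) ** 2)

betas = np.array([1.0, 0.5, 0.2, 0.05])   # inverse temperatures
x = np.zeros(len(betas))                  # one state per temperature
lp = log_target(x)
cold = []
for it in range(40000):
    # Within-temperature random-walk Metropolis on pi(x)^beta.
    prop = x + rng.standard_normal(len(betas))
    lp_prop = log_target(prop)
    acc = np.log(rng.random(len(betas))) < betas * (lp_prop - lp)
    x[acc], lp[acc] = prop[acc], lp_prop[acc]
    # Propose swapping a random adjacent pair of temperatures.
    j = rng.integers(len(betas) - 1)
    if np.log(rng.random()) < (betas[j] - betas[j + 1]) * (lp[j + 1] - lp[j]):
        x[j], x[j + 1] = x[j + 1], x[j]
        lp[j], lp[j + 1] = lp[j + 1], lp[j]
    cold.append(x[0])

cold = np.array(cold[5000:])
# Rapid mixing means the cold chain visits both modes roughly equally.
print((cold > 0).mean())  # near 0.5
```

The hottest chain sees a nearly flat target and crosses between modes freely; the swap moves then transport those crossings down to the cold chain, which is exactly the mechanism the spectral-gap bounds quantify.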