Results 1–10 of 10
Using a Markov chain to construct a tractable approximation of an intractable probability distribution
 SCANDINAVIAN JOURNAL OF STATISTICS
, 2005
A mixture representation of π with applications in Markov chain Monte Carlo and perfect sampling
, 2004
Abstract

Cited by 11 (1 self)
Let X = {Xn:n = 0,1,2,...} be an irreducible, positive recurrent Markov chain with invariant probability measure π. We show that if X satisfies a one-step minorization condition, then π can be represented as an infinite mixture. The distributions in the mixture are associated with the hitting times on an accessible atom introduced via the splitting construction of Athreya and Ney [Trans. Amer. Math. Soc. 245 (1978) 493–501] and Nummelin [Z. Wahrsch. Verw. Gebiete 43 (1978) 309–318]. When the small set in the minorization condition is the entire state space, our mixture representation of π reduces to a simple formula, first derived by Breyer and Roberts [Methodol. Comput. Appl. Probab. 3 (2001) 161–177], from which samples can be easily drawn. Despite the fact that the derivation of this formula involves no coupling or backward simulation arguments, the formula can be used to reconstruct perfect sampling algorithms based on coupling from the past (CFTP) such as Murdoch and Green’s [Scand. J. Statist. 25 (1998) 483–502] Multigamma Coupler and Wilson’s [Random Structures Algorithms 16 (2000) 85–113] Read-Once CFTP algorithm. In the general case where the state space is not necessarily 1-small, under the assumption that X satisfies a geometric drift condition, our mixture representation can be used to construct an arbitrarily accurate approximation to π from which it is straightforward to sample. One potential application of this approximation is as a starting distribution for a Markov chain Monte Carlo algorithm based on X.
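When the whole state space is 1-small, the mixture reduces to a geometric mixture over powers of the residual kernel, which yields a direct exact-sampling recipe. A minimal sketch on a hypothetical 3-state chain (the matrix P and all names below are illustrative assumptions, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 3-state transition matrix; the whole state space is 1-small here
# because every entry of each row dominates eps * nu below.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])

# Minorization P(x, y) >= eps * nu(y): take nu proportional to columnwise minima.
col_min = P.min(axis=0)           # elementwise minimum over starting states
eps = col_min.sum()               # total minorization mass
nu = col_min / eps                # minorizing probability measure
R = (P - eps * nu) / (1 - eps)    # residual kernel; each row is still a distribution

def perfect_sample():
    """Exact draw from pi via the mixture pi = sum_n eps (1-eps)^n nu R^n."""
    n = rng.geometric(eps) - 1         # number of residual steps, support {0, 1, ...}
    x = rng.choice(3, p=nu)            # start from the minorizing measure
    for _ in range(n):
        x = rng.choice(3, p=R[x])      # advance with the residual kernel
    return x
```

Each call returns an independent draw exactly from the invariant distribution; no burn-in or convergence diagnostics are needed, which is the point of the representation.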
Small sets and Markov transition densities
, 2002
Abstract

Cited by 2 (2 self)
The theory of general state-space Markov chains can be strongly related to the case of discrete state-space by use of the notion of small sets and associated minorization conditions. The general theory shows that small sets exist for all Markov chains on state-spaces with countably generated σ-algebras, though the minorization provided by the theory concerns small sets of order n and n-step transition kernels for some unspecified n. Partly motivated by the growing importance of small sets for Markov chain Monte Carlo and Coupling from the Past, we show that in general there need be no small sets of order n = 1 even if the kernel is assumed to have a density function (though of course one can take n = 1 if the kernel density is continuous). However n = 2 will suffice for kernels with densities (integral kernels), and in fact small sets of order 2 abound in the technical sense that the 2-step kernel density can be expressed as a countable sum of nonnegative separable summands based on small sets. This can be exploited to produce a representation using a latent discrete Markov chain; indeed one might say, inside every Markov chain with measurable transition density there is a discrete state-space Markov chain struggling to escape. We conclude by discussing complements to these results, including their relevance to Harris-recurrent Markov chains, and we relate the counterexample to Turán problems for bipartite graphs.
ELECTRONIC COMMUNICATIONS in PROBABILITY: A REGENERATION PROOF OF THE CENTRAL LIMIT THEOREM FOR UNIFORMLY ERGODIC MARKOV CHAINS
, 2007
Abstract
Central limit theorems for functionals of general state space Markov chains are of crucial importance for the sensible implementation of Markov chain Monte Carlo algorithms, as well as of vital theoretical interest. Different approaches to proving this type of result under diverse assumptions have led to a large variety of CLT versions. However, due to the recent development of the regeneration theory of Markov chains, many classical CLTs can be reproved using this intuitive probabilistic approach, avoiding the technicalities of the original proofs. In this paper we provide a characterization of CLTs for ergodic Markov chains via regeneration and then use the result to solve the open problem posed in [17]. We then discuss the difference between one-step and multiple-step small set conditions.
ISSN 0169-2690. Monte Carlo Methods of PageRank Computation
, 2004
Abstract
We describe and analyze an online Monte Carlo method of PageRank computation. The PageRank is estimated based on the results of a large number of short independent simulation runs initiated from each page that contains outgoing hyperlinks. The method does not require any storage of the hyperlink matrix and is highly parallelizable. We study confidence intervals and identify drawbacks of the absolute error criterion and the relative error criterion. We therefore suggest a so-called weighted relative error criterion, which ensures good accuracy in a relatively small number of simulation runs. Moreover, with the weighted relative error measure, the complexity of the algorithm does not depend on the web structure.
Specialty: COMPUTER SCIENCE, by
, 2010
"... Presented to obtain the title of: ..."
London Mathematical Society ISSN 1461-1570. PERFECT POSTERIOR SIMULATION FOR MIXTURE AND HIDDEN MARKOV MODELS
Abstract
In this paper we present an application of read-once coupling from the past to problems in Bayesian inference for latent statistical models. We describe a method of simulating perfectly from the posterior distribution of the unknown mixture weights in a mixture model. Our method is extended to a more general mixture problem where unknown parameters exist for the mixture components, and to a hidden Markov model.
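The read-once CFTP machinery the paper applies can be illustrated on a toy finite chain: random maps are generated in fixed-length blocks, and the state of the chain just before the next coalescent block is an exact stationary draw. The transition matrix, block length, and grand coupling below are illustrative assumptions, not the paper's posterior model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 3-state transition matrix (illustrative only).
P = np.array([[0.6, 0.4, 0.0],
              [0.3, 0.4, 0.3],
              [0.0, 0.5, 0.5]])
cum = np.cumsum(P, axis=1)
K = len(P)
T = 12  # block length, a tuning choice

def apply_block(states):
    """Apply one block of T grand-coupled steps to every state in `states`."""
    states = list(states)
    for _ in range(T):
        u = rng.random()  # a single uniform drives every coupled copy
        states = [int(np.searchsorted(cum[s], u)) for s in states]
    return states

def read_once_sample():
    """Read-once CFTP in the style of Wilson: an exact draw from pi."""
    # Phase 1: generate blocks until one coalesces; keep its common value.
    while True:
        out = apply_block(range(K))
        if len(set(out)) == 1:
            x = out[0]
            break
    # Phase 2: push x through fresh blocks; the state entering the next
    # coalescent block is distributed exactly according to pi.
    while True:
        out = apply_block(range(K))
        if len(set(out)) == 1:
            return x
        x = out[x]  # advance x through the non-coalescent block
```

Unlike classical CFTP, the random blocks are used once and discarded, so no random numbers need to be stored or reread, which is what makes the "read-once" variant attractive for the posterior simulations in the paper.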