Results 1 - 10 of 13,890

Markov chains for exploring posterior distributions

by Luke Tierney - Annals of Statistics, 1994
"... Your use of the JSTOR archive indicates your acceptance of JSTOR's Terms and Conditions of Use, available at ..."
Abstract - Cited by 1136 (6 self) - Add to MetaCart
Your use of the JSTOR archive indicates your acceptance of JSTOR's Terms and Conditions of Use, available at

An introduction to hidden Markov models

by L. R. Rabiner, B. H. Juang - IEEE ASSP Magazine, 1986
"... The basic theory of Markov chains has been known to ..."
Cited by 1132 (2 self)

Reversible Markov chains and random walks on graphs

by David Aldous, James Allen Fill, 2002
Cited by 541 (12 self)
Abstract not found

Exact Sampling with Coupled Markov Chains and Applications to Statistical Mechanics

by James Gary Propp, David Bruce Wilson, 1996
"... For many applications it is useful to sample from a finite set of objects in accordance with some particular distribution. One approach is to run an ergodic (i.e., irreducible aperiodic) Markov chain whose stationary distribution is the desired distribution on this set; after the Markov chain has ..."
Cited by 543 (13 self)
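The approach sketched in this abstract, running an ergodic chain until it is near stationarity and taking the current state as an approximate draw, can be illustrated in a few lines of Python. The sketch below is a generic finite-state Metropolis chain with uniform proposals and a fixed run length, both illustrative choices; it is the baseline the paper improves on, not the exact coupling-from-the-past construction Propp and Wilson introduce.

    # A minimal sketch of sampling by running an ergodic Markov chain whose
    # stationary distribution is the target; the late state is treated as an
    # (approximate) draw. Exact sampling via coupling from the past removes
    # the "how long is long enough?" question; this sketch does not.
    import random

    def approx_sample(weights, n_steps=10_000, seed=0):
        """Metropolis chain on {0, ..., len(weights)-1} with uniform proposals.

        `weights` are unnormalised target probabilities; the chain's
        stationary distribution is weights / sum(weights).
        """
        rng = random.Random(seed)
        n = len(weights)
        state = rng.randrange(n)
        for _ in range(n_steps):
            proposal = rng.randrange(n)                  # symmetric proposal
            accept_prob = min(1.0, weights[proposal] / weights[state])
            if rng.random() < accept_prob:
                state = proposal
        return state                                     # approximate draw

    # Example: draws should land on state 2 roughly half the time.
    draws = [approx_sample([1, 1, 2], seed=s) for s in range(2000)]
    print(draws.count(2) / len(draws))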

Reversible jump Markov chain Monte Carlo computation and Bayesian model determination

by Peter J. Green - Biometrika, 1995
"... Markov chain Monte Carlo methods for Bayesian computation have until recently been restricted to problems where the joint distribution of all variables has a density with respect to some fixed standard underlying measure. They have therefore not been available for application to Bayesian model determination ..."
Cited by 1345 (23 self)

Probabilistic Inference Using Markov Chain Monte Carlo Methods

by Radford M. Neal, 1993
"... Probabilistic inference is an attractive approach to uncertain reasoning and empirical learning in artificial intelligence. Computational difficulties arise, however, because probabilistic models with the necessary realism and flexibility lead to complex distributions over high-dimensional spaces. Related problems in other fields have been tackled using Monte Carlo methods based on sampling using Markov chains, providing a rich array of techniques that can be applied to problems in artificial intelligence. The "Metropolis algorithm" has been used to solve difficult problems in statistical ..."
Cited by 736 (24 self)
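As a concrete illustration of the Metropolis algorithm mentioned in this abstract (a generic random-walk version, not code from the review), the sketch below draws approximate samples from a density known only up to a normalising constant; the double-well target and step size are illustrative choices.

    # Random-walk Metropolis for an unnormalised 1-D target, here the
    # double-well density exp(-(x^2 - 1)^2), standing in for the complex
    # distributions the review discusses.
    import math
    import random

    def log_target(x):
        return -(x * x - 1.0) ** 2          # log of unnormalised density

    def metropolis(n_samples, step=0.5, seed=0):
        rng = random.Random(seed)
        x = 0.0
        samples = []
        for _ in range(n_samples):
            proposal = x + rng.gauss(0.0, step)          # symmetric proposal
            log_accept = log_target(proposal) - log_target(x)
            if rng.random() < math.exp(min(0.0, log_accept)):
                x = proposal                             # accept move
            samples.append(x)                            # keep current state
        return samples

    samples = metropolis(50_000)
    print(sum(samples) / len(samples))   # ~0 by symmetry of the double well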

Markov chain sampling methods for Dirichlet process mixture models

by Radford M. Neal - Journal of Computational and Graphical Statistics, 2000
Cited by 629 (5 self)
Abstract not found

Finite state Markov-chain approximations to univariate and vector autoregressions

by George Tauchen - Economics Letters, 1986
"... The paper develops a procedure for finding a discrete-valued Markov chain whose sample paths approximate well those of a vector autoregression. The procedure has applications in those areas of economics, finance, and econometrics where approximate solutions to integral equations are required. ..."
Cited by 493 (0 self)
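The procedure described in this abstract is commonly presented, for the scalar AR(1) case, in the form sketched below; the grid size and width multiplier are choices of this sketch rather than values taken from the paper.

    # Discretising y_t = rho*y_{t-1} + eps_t, eps_t ~ N(0, sigma^2), into a
    # finite-state Markov chain: an evenly spaced grid over the stationary
    # distribution, with transition probabilities from the normal CDF over
    # the cells around each grid point.
    import numpy as np
    from scipy.stats import norm

    def tauchen_ar1(rho, sigma, n=9, m=3.0):
        """Return grid `y` and transition matrix `P` approximating the AR(1)."""
        std_y = sigma / np.sqrt(1.0 - rho ** 2)     # stationary std of y_t
        y = np.linspace(-m * std_y, m * std_y, n)   # evenly spaced grid
        step = y[1] - y[0]
        P = np.empty((n, n))
        for i in range(n):
            # Probability of moving from y[i] into the cell around each y[j].
            upper = norm.cdf((y + step / 2 - rho * y[i]) / sigma)
            lower = norm.cdf((y - step / 2 - rho * y[i]) / sigma)
            P[i] = upper - lower
            # Endpoint cells absorb the tails.
            P[i, 0] = norm.cdf((y[0] + step / 2 - rho * y[i]) / sigma)
            P[i, -1] = 1.0 - norm.cdf((y[-1] - step / 2 - rho * y[i]) / sigma)
        return y, P

    y, P = tauchen_ar1(rho=0.9, sigma=0.1)
    assert np.allclose(P.sum(axis=1), 1.0)          # rows are distributions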

Monte Carlo Statistical Methods

by Christian P. Robert, George Casella, 1998
"... This paper is also the originator of the Markov Chain Monte Carlo methods developed in the following chapters. The potential of these two simultaneous innovations has been discovered much later by statisticians (Hastings 1970; Geman and Geman 1984) than by physicists (see also Kirkpatrick et al. ..."
Cited by 1498 (30 self)

Sequential Monte Carlo Methods for Dynamic Systems

by Jun S. Liu, Rong Chen - Journal of the American Statistical Association, 1998
"... A general framework for using Monte Carlo methods in dynamic systems is provided and its wide applications indicated. Under this framework, several currently available techniques are studied and generalized to accommodate more complex features. All of these methods are partial combinations of three ingredients: importance sampling and resampling, rejection sampling, and Markov chain iterations. We deliver a guideline on how they should be used and under what circumstance each method is most suitable. Through the analysis of differences and connections, we consolidate these methods into a generic ..."
Cited by 664 (13 self)
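Two of the ingredients named in this abstract, importance sampling and resampling, can be seen together in a minimal bootstrap particle filter; the toy linear-Gaussian model and all parameter values below are illustrative and not taken from the paper.

    # Bootstrap particle filter for a toy state-space model:
    #   x_t = 0.9 x_{t-1} + process noise,  y_t = x_t + observation noise.
    # Particles are propagated, weighted by the observation likelihood
    # (importance sampling), then resampled to combat weight degeneracy.
    import numpy as np

    rng = np.random.default_rng(0)

    T, n_particles = 50, 1000
    true_x = np.zeros(T)
    obs = np.zeros(T)
    for t in range(1, T):
        true_x[t] = 0.9 * true_x[t - 1] + rng.normal(0, 0.5)
        obs[t] = true_x[t] + rng.normal(0, 1.0)

    particles = rng.normal(0, 1, n_particles)
    estimates = []
    for t in range(1, T):
        # Propagate particles through the state equation (the proposal here).
        particles = 0.9 * particles + rng.normal(0, 0.5, n_particles)
        # Importance weights from the Gaussian observation likelihood.
        weights = np.exp(-0.5 * (obs[t] - particles) ** 2)
        weights /= weights.sum()
        estimates.append(np.sum(weights * particles))    # filtered mean
        # Resample particles in proportion to their weights.
        particles = rng.choice(particles, size=n_particles, p=weights)

    print(np.mean(np.abs(np.array(estimates) - true_x[1:])))  # tracking error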