CiteSeerX
Results 1 - 10 of 504,795

Reversible Markov chains and random walks on graphs

by David Aldous, James Allen Fill, 2002
Cited by 549 (13 self). Abstract not found.

Markov chains for exploring posterior distributions

by Luke Tierney - Annals of Statistics, 1994
Cited by 1122 (6 self)

An introduction to hidden Markov models

by L. R. Rabiner, B. H. Juang - IEEE ASSP Magazine, 1986
"... The basic theory of Markov chains has been known to ..."
Cited by 1110 (2 self)
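The snippet cuts off, but the core piece of basic HMM machinery a tutorial like this covers is evaluating how likely an observation sequence is under a model. A minimal sketch of that evaluation step (the scaled forward algorithm), assuming the usual (pi, A, B) notation and discrete emissions; the code is our own illustration, not taken from the paper:

```python
import numpy as np

def forward_loglik(pi, A, B, obs):
    """Scaled forward algorithm: log P(obs | pi, A, B) for a discrete-output HMM.
    pi[i]   - initial state probabilities
    A[i, j] - P(state j at t+1 | state i at t)
    B[i, k] - P(symbol k | state i)
    obs     - sequence of observed symbol indices
    """
    alpha = pi * B[:, obs[0]]            # joint probability of first symbol and each state
    loglik = np.log(alpha.sum())
    alpha /= alpha.sum()                 # rescale so the recursion does not underflow
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]    # propagate one step, then weight by the emission
        loglik += np.log(alpha.sum())
        alpha /= alpha.sum()
    return loglik
```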

Finite state Markov-chain approximations to univariate and vector autoregressions

by George Tauchen - Economics Letters, 1986
"... The paper develops a procedure for finding a discrete-valued Markov chain whose sample paths approximate well those of a vector autoregression. The procedure has applications in those areas of economics, finance, and econometrics where approximate solutions to integral equations are required. ..."
Cited by 472 (0 self)
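As a concrete illustration of the kind of procedure the abstract describes, here is a minimal sketch of discretizing a univariate AR(1) y' = rho*y + eps, eps ~ N(0, sigma^2), onto an evenly spaced grid, with transition probabilities taken from normal CDF differences. Function and parameter names and the defaults are our own, NumPy/SciPy are assumed, and the vector case treated in the paper is omitted:

```python
import numpy as np
from scipy.stats import norm

def discretize_ar1(rho, sigma, n=9, m=3.0):
    """Approximate y' = rho*y + eps, eps ~ N(0, sigma^2), by an n-state Markov chain.
    Returns (grid, P) with P[i, j] = Pr(next state = grid[j] | current state = grid[i])."""
    std_y = sigma / np.sqrt(1.0 - rho**2)          # stationary std of the AR(1)
    grid = np.linspace(-m * std_y, m * std_y, n)   # grid spans +/- m stationary std devs
    step = grid[1] - grid[0]
    P = np.empty((n, n))
    for i in range(n):
        mean = rho * grid[i]                       # conditional mean given current state
        # interior nodes get the mass of a half-step interval around them
        P[i, 1:-1] = (norm.cdf((grid[1:-1] - mean + step / 2) / sigma)
                      - norm.cdf((grid[1:-1] - mean - step / 2) / sigma))
        P[i, 0] = norm.cdf((grid[0] - mean + step / 2) / sigma)          # left tail
        P[i, -1] = 1.0 - norm.cdf((grid[-1] - mean - step / 2) / sigma)  # right tail
    return grid, P
```

Sample paths simulated from (grid, P) can then stand in for the continuous process wherever expectations over the autoregression are needed.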

Exact Sampling with Coupled Markov Chains and Applications to Statistical Mechanics

by James Gary Propp, David Bruce Wilson , 1996
"... For many applications it is useful to sample from a finite set of objects in accordance with some particular distribution. One approach is to run an ergodic (i.e., irreducible aperiodic) Markov chain whose stationary distribution is the desired distribution on this set; after the Markov chain has ..."
Abstract - Cited by 548 (13 self) - Add to MetaCart
For many applications it is useful to sample from a finite set of objects in accordance with some particular distribution. One approach is to run an ergodic (i.e., irreducible aperiodic) Markov chain whose stationary distribution is the desired distribution on this set; after the Markov chain
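The snippet stops just before the paper's main point: instead of running the chain "long enough", one can couple copies of the chain started from every state at some time in the past and read off an exact stationary draw once they coalesce by time zero. A minimal coupling-from-the-past sketch for a small finite chain, under our own function and parameter names:

```python
import numpy as np

def cftp(P, rng=None, max_doublings=20):
    """Coupling-from-the-past sketch for a small finite chain with transition
    matrix P (rows sum to 1). Returns one exact draw from the stationary
    distribution. Names and the doubling budget are our own choices."""
    rng = np.random.default_rng() if rng is None else rng
    n = P.shape[0]
    cum = np.cumsum(P, axis=1)          # cumulative rows for inverse-CDF updates
    seeds = []                          # uniforms for times -1, -2, ... (reused as T grows)
    T = 1
    for _ in range(max_doublings):
        while len(seeds) < T:
            seeds.append(rng.random())
        states = np.arange(n)           # start every state at time -T
        for t in range(T, 0, -1):       # apply the same update at times -T, ..., -1
            u = seeds[t - 1]            # seeds[k-1] is the uniform used at time -k
            states = np.array([np.searchsorted(cum[s], u) for s in states])
        if np.all(states == states[0]): # all trajectories coalesced by time 0
            return int(states[0])
        T *= 2                          # look further back, reusing the same randomness
    raise RuntimeError("no coalescence within the doubling budget")
```

Reusing the randomness for the more recent steps while doubling the look-back is what makes the returned state an exact draw rather than an approximate one.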

Reversible jump Markov chain Monte Carlo computation and Bayesian model determination

by Peter J. Green - Biometrika, 1995
"... Markov chain Monte Carlo methods for Bayesian computation have until recently been restricted to problems where the joint distribution of all variables has a density with respect to some fixed standard underlying measure. They have therefore not been available for application to Bayesian model determination ..."
Cited by 1330 (24 self)
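To make the idea of moving between spaces of different dimension concrete, here is a toy reversible-jump move between a fixed N(0,1) model and a model with one free mean. The models, priors, and tuning constants are our own illustration, not the general construction of the paper; the dimension-matching proposal here has Jacobian 1.

```python
import numpy as np
from scipy.stats import norm

def rj_toy(y, n_iter=10_000, tau=10.0, s=1.0, seed=0):
    """Toy reversible-jump sampler over
       M0: y_i ~ N(0, 1)              (no free parameters)
       M1: y_i ~ N(mu, 1), mu ~ N(0, tau^2).
    An illustrative sketch only, not Green's general scheme."""
    rng = np.random.default_rng(seed)
    loglik = lambda mu: norm.logpdf(y, loc=mu, scale=1.0).sum()
    k, mu = 0, 0.0                       # start in the simpler model
    samples = []
    for _ in range(n_iter):
        if rng.random() < 0.5:           # attempt a dimension-changing move
            if k == 0:
                u = rng.normal(0.0, s)   # propose a value for the new parameter
                log_a = (loglik(u) + norm.logpdf(u, 0.0, tau)
                         - loglik(0.0) - norm.logpdf(u, 0.0, s))   # Jacobian = 1
                if np.log(rng.random()) < log_a:
                    k, mu = 1, u
            else:
                log_a = (loglik(0.0) + norm.logpdf(mu, 0.0, s)
                         - loglik(mu) - norm.logpdf(mu, 0.0, tau))
                if np.log(rng.random()) < log_a:
                    k, mu = 0, 0.0
        elif k == 1:                     # ordinary random-walk update of mu within M1
            prop = mu + rng.normal(0.0, 0.5)
            log_a = (loglik(prop) + norm.logpdf(prop, 0.0, tau)
                     - loglik(mu) - norm.logpdf(mu, 0.0, tau))
            if np.log(rng.random()) < log_a:
                mu = prop
        samples.append((k, mu))
    return samples                       # fraction with k == 1 estimates P(M1 | y)
```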

Probabilistic Inference Using Markov Chain Monte Carlo Methods

by Radford M. Neal , 1993
"... Probabilistic inference is an attractive approach to uncertain reasoning and empirical learning in artificial intelligence. Computational difficulties arise, however, because probabilistic models with the necessary realism and flexibility lead to complex distributions over high-dimensional spaces. R ..."
Abstract - Cited by 738 (24 self) - Add to MetaCart
. Related problems in other fields have been tackled using Monte Carlo methods based on sampling using Markov chains, providing a rich array of techniques that can be applied to problems in artificial intelligence. The "Metropolis algorithm" has been used to solve difficult problems in statistical
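The Metropolis algorithm the snippet mentions is simple to state: propose a symmetric perturbation of the current state and accept it with probability min(1, pi(proposal)/pi(current)). A minimal random-walk Metropolis sketch, with the target, step size, and names chosen by us for illustration:

```python
import numpy as np

def metropolis(log_target, x0, n_iter=10_000, step=1.0, seed=0):
    """Random-walk Metropolis: builds a Markov chain whose stationary
    distribution has (unnormalized) log density `log_target`."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    chain = np.empty((n_iter, x.size))
    lp = log_target(x)
    for t in range(n_iter):
        prop = x + step * rng.standard_normal(x.size)   # symmetric proposal
        lp_prop = log_target(prop)
        if np.log(rng.random()) < lp_prop - lp:          # Metropolis accept test
            x, lp = prop, lp_prop
        chain[t] = x                                     # record state (repeats on rejection)
    return chain

# Example: sample a 2-D correlated Gaussian (illustrative target only).
log_gauss = lambda x: -0.5 * (x[0]**2 + (x[1] - 0.9 * x[0])**2)
draws = metropolis(log_gauss, x0=np.zeros(2), n_iter=5000, step=0.8)
```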

Markov chain sampling methods for Dirichlet process mixture models

by Radford M. Neal - Journal of Computational and Graphical Statistics, 2000
Cited by 620 (5 self). Abstract not found.

Coupled hidden Markov models for complex action recognition

by Matthew Brand, Nuria Oliver, Alex Pentland , 1996
"... We present algorithms for coupling and training hidden Markov models (HMMs) to model interacting processes, and demonstrate their superiority to conventional HMMs in a vision task classifying two-handed actions. HMMs are perhaps the most successful framework in perceptual computing for modeling and ..."
Abstract - Cited by 497 (22 self) - Add to MetaCart
We present algorithms for coupling and training hidden Markov models (HMMs) to model interacting processes, and demonstrate their superiority to conventional HMMs in a vision task classifying two-handed actions. HMMs are perhaps the most successful framework in perceptual computing for modeling

Segmentation of brain MR images through a hidden Markov random field model and the expectation-maximization algorithm

by Yongyue Zhang, Michael Brady, Stephen Smith - IEEE Transactions on Medical Imaging, 2001
"... The finite mixture (FM) model is the most commonly used model for statistical segmentation of brain magnetic resonance (MR) images because of its simple mathematical form and the piecewise constant nature of ideal brain MR images. However, being a histogram-based model, the FM has an intrinsic limitation ..."
"... -based methods produce unreliable results. In this paper, we propose a novel hidden Markov random field (HMRF) model, which is a stochastic process generated by an MRF whose state sequence cannot be observed directly but which can be indirectly estimated through observations. Mathematically, it can be shown ..."
Cited by 619 (14 self)
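As a rough illustration of the hidden-field idea described above (observed intensities generated from hidden class labels with spatial regularity), here is a deliberately simplified sketch that alternates an ICM label update (Gaussian data term plus a Potts smoothness term over 4-neighbours) with re-estimation of class means and variances from the hard labels. This is a hard-assignment approximation with our own parameter choices, not the HMRF-EM algorithm of the paper:

```python
import numpy as np

def hmrf_segment(img, k=3, beta=1.0, n_iter=10, seed=0):
    """Simplified ICM-style segmentation of a 2-D image into k classes,
    coupling a Gaussian data term with a Potts prior on the hidden labels."""
    rng = np.random.default_rng(seed)
    h, w = img.shape
    labels = rng.integers(0, k, size=(h, w))
    mu = np.quantile(img, (np.arange(k) + 0.5) / k)      # crude initial class means
    var = np.full(k, img.var() / k + 1e-6)
    for _ in range(n_iter):
        # data term: negative Gaussian log-likelihood of every class at every pixel
        data = 0.5 * ((img[..., None] - mu) ** 2 / var + np.log(var))
        # smoothness term: count disagreeing 4-neighbours for each candidate class
        pad = np.pad(labels, 1, mode='edge')
        neigh = np.stack([pad[:-2, 1:-1], pad[2:, 1:-1], pad[1:-1, :-2], pad[1:-1, 2:]])
        disagree = (neigh[..., None] != np.arange(k)).sum(axis=0)
        labels = np.argmin(data + beta * disagree, axis=-1)   # ICM: pick the best class
        for c in range(k):                                    # update class statistics
            sel = img[labels == c]
            if sel.size:
                mu[c], var[c] = sel.mean(), sel.var() + 1e-6
    return labels, mu, var
```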