Results 1–10 of 374,613
Markov Chain Monte Carlo Methods in Statistical Physics, 2005
Abstract: In this paper we shall briefly review a few Markov Chain Monte Carlo methods for simulating closed systems described by canonical ensembles. We cover both Boltzmann and non-Boltzmann sampling techniques. The Metropolis algorithm is a typical example of a Boltzmann Monte Carlo method. We discuss the ti ...
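The Metropolis algorithm named in this abstract is simple to sketch. The following is a minimal, hypothetical Python illustration (not taken from the paper) of Boltzmann sampling from a canonical ensemble, using a one-dimensional harmonic well E(x) = x²/2 at temperature T, for which the Boltzmann distribution is Gaussian with variance T:

```python
import math
import random

def metropolis_sample(energy, x0, n_steps, step=1.0, T=1.0, seed=0):
    """Boltzmann sampling from exp(-E(x)/T) via the Metropolis algorithm."""
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_steps):
        x_new = x + rng.uniform(-step, step)        # symmetric proposal
        dE = energy(x_new) - energy(x)
        # Accept with probability min(1, exp(-dE/T)); otherwise keep old x.
        if dE <= 0 or rng.random() < math.exp(-dE / T):
            x = x_new
        samples.append(x)
    return samples

# Harmonic well at T = 1: equilibrium distribution is N(0, 1).
samples = metropolis_sample(lambda x: 0.5 * x * x, 0.0, 200_000)
burn = samples[50_000:]                              # discard burn-in
mean = sum(burn) / len(burn)
var = sum((s - mean) ** 2 for s in burn) / len(burn)
```

After burn-in, the sample mean should be near 0 and the sample variance near T = 1, which is a quick sanity check that the chain has equilibrated.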
Markov chain Monte Carlo methods for statistical inference, 2004
Cited by 8 (0 self)
Abstract: These notes provide an introduction to Markov chain Monte Carlo methods and their applications to both Bayesian and frequentist statistical inference. Such methods have revolutionized what can be achieved computationally, especially in the Bayesian paradigm. The account begins by discussing ordinary ...
Probabilistic Inference Using Markov Chain Monte Carlo Methods, 1993
Cited by 735 (24 self)
Abstract: Probabilistic inference is an attractive approach to uncertain reasoning and empirical learning in artificial intelligence. Computational difficulties arise, however, because probabilistic models with the necessary realism and flexibility lead to complex distributions over high-dimensional spaces. Related problems in other fields have been tackled using Monte Carlo methods based on sampling using Markov chains, providing a rich array of techniques that can be applied to problems in artificial intelligence. The "Metropolis algorithm" has been used to solve difficult problems in statistical ...
Markov-Chain Monte Carlo Methods for Simulations of Biomolecules, 709
Abstract: The computer revolution has been driven by a sustained increase of computational speed of approximately one order of magnitude (a factor of ten) every five years since about 1950. In the natural sciences this has led to a continuous increase in the importance of computer simulations. Major enabling techniques are Markov Chain Monte Carlo (MCMC) and Molecular Dynamics (MD) simulations. This article deals with the MCMC approach. First, basic simulation techniques, as well as methods for their statistical analysis, are reviewed. Afterwards the focus is on generalized ensembles and biased updating, two ...
Mathematical foundations of the Markov chain Monte Carlo method, in Probabilistic Methods for Algorithmic Discrete Mathematics, 1998
Cited by 35 (1 self)
Abstract: ... 7.2 was jointly undertaken with Vivek Gore, and is published here for the first time. I also thank an anonymous referee for carefully reading and providing helpful comments on a draft of this chapter. 1. Introduction. The classical Monte Carlo method is an approach to estimating quantities that a ...
Markov Chain Monte Carlo Methods in Molecular Computing
Abstract: ... core data of molecular biology consists of DNA sequences. Molecular computation focuses on the computational power of molecules, especially that of biological molecules, and attempts to realize information processing which maximally exploits the computational power of molecules. DNA sequences can hold information of arbitrary complexity by freely chaining four natural bases. Similarly, biological molecules such as RNA and proteins are appropriate for molecular computation, because they share this combinatorial complexity. It is worth mentioning that the combinatorial complexity underlies ...
Discussion of Particle Markov chain Monte Carlo methods by
Abstract: The authors present an elegant theory for novel methodology which makes Bayesian inference practical on implicit models. I will use their example, a sophisticated financial model involving a continuous-time stochastic volatility process driven by Lévy noise, to compare their methodology with a state-of-the-art non-Bayesian approach. I applied iterated filtering (Ionides et al., 2006, 2009), implemented via the mif function in the R package pomp (King et al., 2008).
Fig. 1 shows some results from applying the iterated filtering algorithm with 1000 particles to the simulation study described by the authors in Section 3.2. If θ denotes the parameter vector of interest, the algorithm generates a sequence of parameter estimates θ̂1, θ̂2, ... converging to the maximum likelihood estimate θ̂. As a diagnostic, the log-likelihood of θ̂i is plotted against i (Fig. 1(a)); the sequence of log-likelihoods is seen to converge rapidly. On simulation studies like this, a quick check for successful maximization is to observe that the maximized log-likelihood typically exceeds the log-likelihood at the true parameter value by approximately half the number of estimated parameters (Fig. 1(a)). One can also check for successful local maximization with sliced likelihood plots (Fig. 1(b–e)), in which the likelihood surface is explored along one parameter at a time, keeping the other parameters fixed at the estimated local maximum. The likelihood surface is seen to be flat as λ varies, consistent with the authors' observation that some parameter combinations are weakly identified in this model. A profile likelihood analysis could aid the investigation of this identifiability issue. Owing to the quick convergence of iterated filtering with a relatively small number of particles, many profile likelihood plots can be generated at the computational expense of, say, one MCMC run of length 50,000.
The decision about whether to carry out a Bayesian analysis should depend on whether one wishes to impose a prior distribution on unknown parameters. Here, I have shown that likelihood-based non-Bayesian methodology provides a computationally viable alternative to the authors' Bayesian approach for complex dynamic models.
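The diagnostic quoted above — that the maximized log-likelihood typically exceeds the log-likelihood at the true parameter value by roughly half the number of estimated parameters — follows from Wilks' theorem: twice the excess is asymptotically χ² with one degree of freedom per estimated parameter, so its mean is p and the excess averages p/2. A toy Python check (a hypothetical illustration with Gaussian data, not from the discussion itself, where p = 2 for mean and variance) makes this concrete:

```python
import math
import random

def gauss_loglik(xs, mu, var):
    """Log-likelihood of i.i.d. N(mu, var) data."""
    n = len(xs)
    return (-0.5 * n * math.log(2 * math.pi * var)
            - sum((x - mu) ** 2 for x in xs) / (2 * var))

rng = random.Random(42)
true_mu, true_var = 1.0, 4.0
reps, n, p = 500, 100, 2        # p = number of estimated parameters
excesses = []
for _ in range(reps):
    xs = [rng.gauss(true_mu, math.sqrt(true_var)) for _ in range(n)]
    mu_hat = sum(xs) / n                                  # MLE of mean
    var_hat = sum((x - mu_hat) ** 2 for x in xs) / n      # MLE of variance
    # Excess of maximized log-likelihood over log-likelihood at the truth.
    excesses.append(gauss_loglik(xs, mu_hat, var_hat)
                    - gauss_loglik(xs, true_mu, true_var))
avg_excess = sum(excesses) / reps  # expected to be close to p / 2 = 1.0
```

Averaged over replications, the excess settles near p/2 = 1, matching the rule of thumb used as a maximization check in the discussion.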