Results 1 - 3 of 3
Bayesian inference for Markov jump processes with informative observations
Abstract

Cited by 1 (1 self)
In this paper we consider the problem of parameter inference for Markov jump process (MJP) representations of stochastic kinetic models. Since transition probabilities are intractable for most processes of interest yet forward simulation is straightforward, Bayesian inference typically proceeds through computationally intensive methods such as (particle) MCMC. Such methods ostensibly require the ability to simulate trajectories from the conditioned jump process. When observations are highly informative, use of the forward simulator is likely to be inefficient and may even preclude an exact (simulation-based) analysis. We therefore propose three methods for improving the efficiency of simulating conditioned jump processes. A conditioned hazard is derived based on an approximation to the jump process, and used to generate endpoint-conditioned trajectories for use inside an importance sampling algorithm. We also adapt a recently proposed sequential Monte Carlo scheme to our problem. Essentially, trajectories are reweighted at a set of intermediate time points, with more weight assigned to trajectories that are consistent with the next observation. We consider two implementations of this approach, based on two continuous approximations of the MJP. We compare these constructs for a simple tractable jump process before using them to perform inference for a Lotka-Volterra system. The best performing construct is used to infer the parameters governing a simple model of motility regulation in Bacillus subtilis.
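As a minimal illustration of the abstract's premise that "forward simulation is straightforward", the following sketch simulates an (unconditioned) Lotka-Volterra MJP with Gillespie's direct method. The rate constants `C1`-`C3` and the two-species stoichiometry are hypothetical choices for illustration, not values from the paper:

```python
import random

# Lotka-Volterra predator-prey MJP: state x = (prey, predators).
# Rate constants are hypothetical, chosen only for illustration.
C1, C2, C3 = 0.5, 0.0025, 0.3

# Stoichiometry of the three reactions: prey birth, predation, predator death.
STOICH = [(1, 0), (-1, 1), (0, -1)]

def hazards(x):
    prey, pred = x
    return [C1 * prey, C2 * prey * pred, C3 * pred]

def gillespie(x0, t_max, seed=1):
    """Forward-simulate one trajectory with Gillespie's direct method."""
    rng = random.Random(seed)
    t, x = 0.0, list(x0)
    path = [(t, tuple(x))]
    while True:
        h = hazards(x)
        h0 = sum(h)
        if h0 == 0.0:                 # absorbing state: no reaction can fire
            break
        t += rng.expovariate(h0)      # exponential waiting time to next event
        if t > t_max:
            break
        u, acc = rng.random() * h0, 0.0
        for j, hj in enumerate(h):    # pick reaction j with probability h[j]/h0
            acc += hj
            if u <= acc:
                break
        x[0] += STOICH[j][0]
        x[1] += STOICH[j][1]
        path.append((t, tuple(x)))
    return path

traj = gillespie((100, 100), t_max=10.0)
```

The difficulty the paper addresses is precisely that this simulator ignores the data: conditioning such trajectories on informative observations is what motivates the conditioned-hazard and SMC constructions above.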
Efficient Transition Probability Computation for Continuous-Time Branching Processes via Compressed Sensing
Abstract
Branching processes are a class of continuous-time Markov chains (CTMCs) with ubiquitous applications. A general difficulty in statistical inference under partially observed CTMC models arises in computing transition probabilities when the discrete state space is large or uncountable. Classical methods such as matrix exponentiation are infeasible for large or countably infinite state spaces, and sampling-based alternatives are computationally intensive, requiring a large integration step to impute over all possible hidden events. Recent work has successfully applied generating function techniques to computing transition probabilities for linear multi-type branching processes. While these techniques often require significantly fewer computations than matrix exponentiation, they also become prohibitive in applications with large populations. We propose a compressed sensing framework that significantly accelerates the generating function method, decreasing computational cost up to a logarithmic factor by only assuming the probability mass of transitions is sparse. We demonstrate accurate and efficient transition probability computations in branching process models for hematopoiesis and transposable element evolution.
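To make concrete the classical baseline this abstract contrasts against, here is a sketch of computing CTMC transition probabilities P(t) = exp(Qt) by uniformization for a toy birth-death chain. The generator, rates, and state-space size are hypothetical; the point is that the cost scales with the state-space dimension, which is why the approach breaks down for large or countably infinite spaces:

```python
import numpy as np

# Generator Q of a small birth-death CTMC (hypothetical rates, 5 states).
n = 5
birth, death = 1.0, 0.5
Q = np.zeros((n, n))
for i in range(n):
    if i < n - 1:
        Q[i, i + 1] = birth          # birth: i -> i+1
    if i > 0:
        Q[i, i - 1] = death * i      # death: i -> i-1, rate proportional to i
    Q[i, i] = -Q[i].sum()            # rows of a generator sum to zero

def transition_probs(Q, t, tol=1e-12):
    """P(t) = exp(Qt) via uniformization:
    P(t) = sum_k Poisson(k; q*t) * B^k  with  B = I + Q/q,  q >= max_i |Q_ii|.
    Each term costs a dense matrix product, so large state spaces are infeasible.
    """
    q = max(-np.diag(Q)) or 1.0
    B = np.eye(Q.shape[0]) + Q / q   # row-stochastic "uniformized" kernel
    term = np.exp(-q * t)            # Poisson weight for k = 0
    Bk = np.eye(Q.shape[0])
    P = term * Bk
    k = 0
    while term > tol or k < q * t:   # run past the Poisson mode, then to tol
        k += 1
        term *= q * t / k
        Bk = Bk @ B
        P += term * Bk
    return P

P = transition_probs(Q, t=1.0)
```

Each row of the returned matrix sums to one (up to the truncation tolerance), as a transition matrix should.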
JUMP-Means: Small-Variance Asymptotics for Markov Jump Processes
Abstract
*These authors contributed equally and are listed alphabetically. Markov jump processes (MJPs) are used to model a wide range of phenomena from disease progression to RNA path folding. However, maximum likelihood estimation of parametric models leads to degenerate trajectories, and inferential performance is poor in nonparametric models. We take a small-variance asymptotics (SVA) approach to overcome these limitations. We derive the small-variance asymptotics for parametric and nonparametric MJPs for both directly observed and hidden state models. In the parametric case we obtain a novel objective function which leads to non-degenerate trajectories. To derive the nonparametric version we introduce the gamma-gamma process, a novel extension to the gamma-exponential process. We propose algorithms for each of these formulations, which we call JUMP-means. Our experiments demonstrate that JUMP-means is competitive with or outperforms widely used MJP inference approaches in terms of both speed and reconstruction accuracy.