Results

### A New Approach to Probabilistic Programming Inference

Abstract

We introduce and demonstrate a new approach to inference in expressive probabilistic programming languages based on particle Markov chain Monte Carlo. Our approach is simple to implement and easy to parallelize. It applies to Turing-complete probabilistic programming languages and supports accurate inference in models that make use of complex control flow, including stochastic recursion. It also includes primitives from Bayesian nonparametric statistics. Our experiments show that this approach can be more efficient than previously introduced single-site Metropolis-Hastings methods.
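The particle Markov chain Monte Carlo machinery this abstract refers to is built on sequential Monte Carlo. A minimal sketch of the underlying bootstrap particle filter is given below; the `transition`, `loglik`, and `init` callables are hypothetical stand-ins for a model's components, not the paper's actual interface:

```python
import math
import random

def bootstrap_particle_filter(observations, n_particles, transition, loglik, init):
    """Minimal bootstrap particle filter: propagate particles through the
    model's transition, weight them by the observation log-likelihood,
    and resample. Also accumulates a log marginal-likelihood estimate."""
    particles = [init() for _ in range(n_particles)]
    log_evidence = 0.0
    for y in observations:
        # Propagate each particle through the (stochastic) transition model.
        particles = [transition(x) for x in particles]
        # Weight by the likelihood of the observation, in log space.
        logw = [loglik(y, x) for x in particles]
        m = max(logw)
        w = [math.exp(lw - m) for lw in logw]
        total = sum(w)
        # Mean unnormalized weight is the incremental evidence estimate.
        log_evidence += m + math.log(total / n_particles)
        # Multinomial resampling: draw offspring proportional to weight.
        particles = random.choices(particles, weights=w, k=n_particles)
    return particles, log_evidence
```

Methods such as particle Gibbs wrap a filter like this inside an outer MCMC loop; this sketch shows only the inner SMC sweep.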

### Expectation-Propagation for Summary-Less, Likelihood-Free Inference

Abstract


Many models of interest in the natural and social sciences have no closed-form likelihood function, which means that they cannot be treated using the usual techniques of statistical inference. In the case where such models can be efficiently simulated, Bayesian inference is still possible thanks to the Approximate Bayesian Computation (ABC) algorithm. Although many refinements have since been suggested, the technique suffers from three major shortcomings. First, it requires introducing a vector of “summary statistics”, the choice of which is arbitrary and may lead to strong biases. Second, ABC may be excruciatingly slow due to very low acceptance rates. Third, it cannot produce a reliable estimate of the marginal likelihood of the model. We introduce a technique that solves the first and the third issues, and considerably alleviates the second. We adapt to the likelihood-free context a variational approximation algorithm, Expectation Propagation (Minka, 2001). The resulting algorithm is shown to be faster by a few orders of magnitude than alternative algorithms, while producing an overall approximation error which is typically negligible. Comparisons are performed in three real-world applications which are typical of likelihood-free inference, including one application in neuroscience which is novel, and possibly too challenging for standard ABC techniques.
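For context on the shortcomings the abstract lists, the baseline ABC rejection scheme it improves on can be sketched in a few lines. All names here are illustrative, not from the paper:

```python
import random

def abc_rejection(prior_sample, simulate, summary, observed_summary, eps, n_accept):
    """Plain ABC rejection sampling: draw a parameter from the prior,
    simulate data from the model, and keep the draw only if its summary
    statistic falls within eps of the observed one. The arbitrary choice
    of `summary` and the low acceptance rate at small eps are exactly the
    first two shortcomings the abstract describes."""
    accepted = []
    while len(accepted) < n_accept:
        theta = prior_sample()
        if abs(summary(simulate(theta)) - observed_summary) <= eps:
            accepted.append(theta)
    return accepted
```

The accepted draws approximate the posterior only as `eps` shrinks, which is what drives the acceptance rate down; the paper's EP-based approach sidesteps this by replacing rejection with a variational approximation.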

### An Information-Theoretic Analysis of Resampling in Sequential Monte Carlo

, 2014
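The resampling step this entry analyses can be implemented in several ways; one common low-variance scheme is systematic resampling, sketched here as general background rather than as the paper's specific analysis:

```python
import random

def systematic_resampling(weights):
    """Systematic resampling: place n evenly spaced pointers, sharing a
    single uniform offset, on the cumulative weight distribution and
    return the parent index selected by each pointer."""
    n = len(weights)
    total = sum(weights)
    # One uniform draw shared by all n pointers keeps the variance low
    # compared with n independent draws (multinomial resampling).
    u = random.random() / n
    cumulative = []
    c = 0.0
    for w in weights:
        c += w / total
        cumulative.append(c)
    indices = []
    i = 0
    for k in range(n):
        p = u + k / n
        while i < n - 1 and cumulative[i] < p:
            i += 1
        indices.append(i)
    return indices
```

Each particle with normalized weight w is selected either floor(n*w) or ceil(n*w) times, which is what makes the scheme low-variance.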