Results 1–10 of 125
On Sequential Monte Carlo Sampling Methods for Bayesian Filtering
 Statistics and Computing
, 2000
Abstract

Cited by 1051 (76 self)
In this article, we present an overview of methods for sequential simulation from posterior distributions. These methods are of particular interest in Bayesian filtering for discrete time dynamic models that are typically nonlinear and non-Gaussian. A general importance sampling framework is developed that unifies many of the methods which have been proposed over the last few decades in several different scientific disciplines. Novel extensions to the existing methods are also proposed. We show in particular how to incorporate local linearisation methods similar to those which have previously been employed in the deterministic filtering literature; these lead to very effective importance distributions. Furthermore we describe a method which uses Rao-Blackwellisation in order to take advantage of the analytic structure present in some important classes of state-space models. In a final section we develop algorithms for prediction, smoothing and evaluation of the likelihood in dynamic models.
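The sequential importance sampling scheme the abstract describes can be sketched on a toy scalar model; the model, parameters, and function name below are illustrative choices, not taken from the paper. The transition prior serves as the importance distribution (the "bootstrap" choice), so the weights reduce to observation likelihoods, and resampling counters weight degeneracy.

```python
import numpy as np

def bootstrap_particle_filter(ys, n_particles=500, seed=0):
    """Bootstrap particle filter on a toy scalar state-space model:
        x_t = 0.5 x_{t-1} + 25 x_{t-1} / (1 + x_{t-1}^2) + v_t,  v_t ~ N(0, 10)
        y_t = x_t^2 / 20 + w_t,                                  w_t ~ N(0, 1)
    Returns the sequence of posterior-mean estimates of x_t."""
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, np.sqrt(5.0), n_particles)  # initial particle cloud
    means = []
    for y in ys:
        # propagate through the transition prior (the importance distribution)
        x = 0.5 * x + 25.0 * x / (1.0 + x**2) + rng.normal(0.0, np.sqrt(10.0), n_particles)
        # importance weights = observation likelihoods under N(0, 1) noise
        logw = -0.5 * (y - x**2 / 20.0) ** 2
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means.append(float(np.sum(w * x)))
        # multinomial resampling to fight weight degeneracy
        x = x[rng.choice(n_particles, n_particles, p=w)]
    return means
```

Swapping the transition prior for a better-informed proposal (e.g. the locally linearised proposals the paper advocates) changes only the weight computation.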
Dynamic Bayesian Networks: Representation, Inference and Learning
, 2002
Abstract

Cited by 770 (3 self)
Modelling sequential data is important in many areas of science and engineering. Hidden Markov models (HMMs) and Kalman filter models (KFMs) are popular for this because they are simple and flexible. For example, HMMs have been used for speech recognition and biosequence analysis, and KFMs have been used for problems ranging from tracking planes and missiles to predicting the economy. However, HMMs and KFMs are limited in their “expressive power”. Dynamic Bayesian Networks (DBNs) generalize HMMs by allowing the state space to be represented in factored form, instead of as a single discrete random variable. DBNs generalize KFMs by allowing arbitrary probability distributions, not just (unimodal) linear-Gaussian. In this thesis, I will discuss how to represent many different kinds of models as DBNs, how to perform exact and approximate inference in DBNs, and how to learn DBN models from sequential data.
In particular, the main novel technical contributions of this thesis are as follows: a way of representing Hierarchical HMMs as DBNs, which enables inference to be done in O(T) time instead of O(T^3), where T is the length of the sequence; an exact smoothing algorithm that takes O(log T) space instead of O(T); a simple way of using the junction tree algorithm for online inference in DBNs; new complexity bounds on exact online inference in DBNs; a new deterministic approximate inference algorithm called factored frontier; an analysis of the relationship between the BK algorithm and loopy belief propagation; a way of applying Rao-Blackwellised particle filtering to DBNs in general, and the SLAM (simultaneous localization and mapping) problem in particular; a way of extending the structural EM algorithm to DBNs; and a variety of different applications of DBNs. However, perhaps the main value of the thesis is its catholic presentation of the field of sequential data modelling.
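As a minimal concrete instance of the models the thesis unifies, the forward pass for a plain HMM (a DBN whose hidden state is a single discrete variable) can be sketched as follows; this is a standard textbook recursion, not code from the thesis.

```python
import numpy as np

def hmm_forward_loglik(A, B, pi, obs):
    """Scaled forward recursion: returns log p(y_{1:T}) for an HMM with
    transition matrix A[i, j] = p(z_t = j | z_{t-1} = i), emission matrix
    B[i, k] = p(y_t = k | z_t = i), and initial distribution pi."""
    alpha = pi * B[:, obs[0]]
    loglik = np.log(alpha.sum())
    alpha = alpha / alpha.sum()        # normalize for numerical stability
    for y in obs[1:]:
        alpha = (alpha @ A) * B[:, y]  # predict, then condition on y_t
        loglik += np.log(alpha.sum())
        alpha = alpha / alpha.sum()
    return loglik
```

The factored DBN representations discussed in the thesis generalize exactly this recursion, replacing the single vector `alpha` with a structured (factored) belief state.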
Sequential Monte Carlo Methods for Dynamic Systems
 Journal of the American Statistical Association
, 1998
Abstract

Cited by 664 (13 self)
A general framework for using Monte Carlo methods in dynamic systems is provided and its wide applications indicated. Under this framework, several currently available techniques are studied and generalized to accommodate more complex features. All of these methods are partial combinations of three ingredients: importance sampling and resampling, rejection sampling, and Markov chain iterations. We deliver a guideline on how they should be used and under what circumstances each method is most suitable. Through the analysis of differences and connections, we consolidate these methods into a generic algorithm by combining desirable features. In addition, we propose a general use of Rao-Blackwellization to improve performance. Examples from econometrics and engineering are presented to demonstrate the importance of Rao-Blackwellization and to compare different Monte Carlo procedures. Keywords: Blind deconvolution; Bootstrap filter; Gibbs sampling; Hidden Markov model; Kalman filter; Markov...
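Of the three ingredients listed, resampling is the simplest to sketch in isolation. The function below implements systematic resampling, a common low-variance alternative to i.i.d. multinomial draws; it is a generic illustration, not the specific scheme of the paper.

```python
import numpy as np

def systematic_resample(weights, rng):
    """Systematic resampling: a single uniform draw stratifies the whole
    weight CDF, giving lower variance than independent multinomial draws.
    Returns indices of the particles to keep (with repetition)."""
    n = len(weights)
    positions = (rng.random() + np.arange(n)) / n  # one stratified grid in [0, 1)
    cdf = np.cumsum(weights)
    cdf[-1] = 1.0                                  # guard against rounding error
    return np.searchsorted(cdf, positions)
```

Particles with weight above 1/n are duplicated deterministically; only the fractional parts are randomized, which is why the variance is lower than multinomial sampling.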
Sequential Monte Carlo Samplers
, 2002
Abstract

Cited by 303 (44 self)
In this paper, we propose a general algorithm to sample sequentially from a sequence of probability distributions known up to a normalizing constant and defined on a common space. A sequence of increasingly large artificial joint distributions is built; each of these distributions admits a marginal which is a distribution of interest. To sample from these distributions, we use sequential Monte Carlo methods. We show that these methods can be interpreted as interacting particle approximations of a nonlinear Feynman-Kac flow in distribution space. One interpretation of the Feynman-Kac flow corresponds to a nonlinear Markov kernel admitting a specified invariant distribution and is a natural nonlinear extension of the standard Metropolis-Hastings algorithm. Many theoretical results have already been established for such flows and their particle approximations. We demonstrate the use of these algorithms through simulation.
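A minimal sketch of the idea, assuming a tempered (geometric) bridge between a N(0, 1) reference and an unnormalized target, with resampling and a random-walk Metropolis-Hastings rejuvenation move; all tuning choices (number of temperatures, step size) are illustrative, not from the paper.

```python
import numpy as np

def smc_sampler(log_target, n=1000, n_temps=20, seed=0):
    """Move particles from a N(0, 1) reference to an unnormalized target
    through tempered bridges pi_k ∝ reference^(1 - b_k) * target^(b_k),
    with b_k rising from 0 to 1, reweighting and resampling at each step
    and rejuvenating with one MH move that leaves pi_k invariant."""
    rng = np.random.default_rng(seed)
    x = rng.normal(size=n)
    log_ref = lambda z: -0.5 * z**2
    betas = np.linspace(0.0, 1.0, n_temps)
    for b_prev, b in zip(betas[:-1], betas[1:]):
        # incremental importance weights for the bridge pi_{b_prev} -> pi_b
        logw = (b - b_prev) * (log_target(x) - log_ref(x))
        w = np.exp(logw - logw.max())
        w /= w.sum()
        x = x[rng.choice(n, n, p=w)]  # resample
        # random-walk MH move targeting the current tempered distribution
        logpi = lambda z, b=b: (1.0 - b) * log_ref(z) + b * log_target(z)
        prop = x + 0.5 * rng.normal(size=n)
        accept = np.log(rng.random(n)) < logpi(prop) - logpi(x)
        x = np.where(accept, prop, x)
    return x
```

The same incremental weights also yield an estimate of the ratio of normalizing constants, a standard by-product of SMC samplers.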
Convergence of Sequential Monte Carlo Methods
 Sequential Monte Carlo Methods in Practice
, 2000
Abstract

Cited by 243 (13 self)
Bayesian estimation problems where the posterior distribution evolves over time through the accumulation of data arise in many applications in statistics and related fields. Recently, a large number of algorithms and applications based on sequential Monte Carlo methods (also known as particle filtering methods) have appeared in the literature to solve this class of problems; see (Doucet, de Freitas & Gordon, 2001) for a survey. However, few of these methods have been proved to converge rigorously. The purpose of this paper is to address this issue. We present a general sequential Monte Carlo (SMC) method which includes most of the important features present in current SMC methods. This method generalizes and encompasses many recent algorithms. Under mild regularity conditions, we obtain rigorous convergence results for this general SMC method and therefore give theoretical backing for the validity of all the algorithms that can be obtained as particular cases of it.
Mixture Kalman filters
, 2000
Abstract

Cited by 224 (8 self)
In treating dynamic systems, sequential Monte Carlo methods use discrete samples to represent a complicated probability distribution and use rejection sampling, importance sampling and weighted resampling to complete the online 'filtering' task. We propose a special sequential Monte Carlo method, the mixture Kalman filter, which uses a random mixture of Gaussian distributions to approximate a target distribution. It is designed for online estimation and prediction of conditional and partial conditional dynamic linear models, which are themselves a class of widely used nonlinear systems and also serve to approximate many others. Compared with a few available filtering methods including Monte Carlo methods, the gain in efficiency that is provided by the mixture Kalman filter can be very substantial. Another contribution of the paper is the formulation of many nonlinear systems into conditional or partial conditional linear form, to which the mixture Kalman filter can be applied. Examples in target tracking and digital communications are given to demonstrate the procedures proposed.
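The conditionally linear structure can be sketched on a toy model with a two-regime observation noise; each particle samples only the discrete indicator, while a per-particle Kalman recursion handles the Gaussian part exactly, so the filter maintains a random mixture of Gaussians. Model and parameters are illustrative, not from the paper.

```python
import numpy as np

def mixture_kalman_filter(ys, n_particles=200, seed=0):
    """Mixture Kalman filter sketch for a toy conditionally linear model:
        x_t = 0.9 x_{t-1} + v_t,                 v_t ~ N(0, 1)
        y_t = x_t + w_t,  w_t ~ N(0, R[s_t]),    s_t i.i.d. uniform on {0, 1}
    Particles carry (mean m, variance P); only s_t is sampled."""
    rng = np.random.default_rng(seed)
    R = np.array([0.1, 4.0])        # two observation-noise regimes
    m = np.zeros(n_particles)       # per-particle Kalman means
    P = np.ones(n_particles)        # per-particle Kalman variances
    est = []
    for y in ys:
        s = rng.integers(0, 2, n_particles)  # sample indicators from the prior
        m_pred, P_pred = 0.9 * m, 0.81 * P + 1.0
        S = P_pred + R[s]                    # innovation variance
        # weight = exact marginal likelihood p(y | s, past), a Gaussian density
        logw = -0.5 * (np.log(S) + (y - m_pred) ** 2 / S)
        w = np.exp(logw - logw.max())
        w /= w.sum()
        K = P_pred / S                       # Kalman gain
        m = m_pred + K * (y - m_pred)
        P = (1.0 - K) * P_pred
        est.append(float(np.sum(w * m)))     # posterior mean of the mixture
        idx = rng.choice(n_particles, n_particles, p=w)  # resample
        m, P = m[idx], P[idx]
    return est
```

Because the Gaussian part is integrated out analytically, the Monte Carlo variance is confined to the low-dimensional indicator process, which is the source of the efficiency gain the abstract describes.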
The Unscented Particle Filter
, 2000
Abstract

Cited by 211 (8 self)
In this paper, we propose a new particle filter based on sequential importance sampling. The algorithm uses a bank of unscented filters to obtain the importance proposal distribution. This proposal has two very "nice" properties. Firstly, it makes efficient use of the latest available information and, secondly, it can have heavy tails. As a result, we find that the algorithm outperforms standard particle filtering and other nonlinear filtering methods very substantially. This experimental finding is in agreement with the theoretical convergence proof for the algorithm. The algorithm also includes resampling and (possibly) Markov chain Monte Carlo (MCMC) steps.
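The building block behind the proposal, the unscented transform, can be sketched in the scalar case; the paper's filter embeds this in a full unscented Kalman filter per particle, which is omitted here. Sigma-point placement and weights follow the standard κ-parameterization.

```python
import numpy as np

def unscented_transform(mean, var, f, kappa=2.0):
    """Scalar unscented transform: three deterministic sigma points match
    the mean and variance of x ~ N(mean, var); pushing them through a
    nonlinearity f yields estimates of E[f(x)] and Var[f(x)] that are
    typically more accurate than first-order linearization."""
    s = np.sqrt((1.0 + kappa) * var)             # sigma-point spread (n = 1)
    sigma = np.array([mean, mean + s, mean - s])
    wm = np.array([kappa, 0.5, 0.5]) / (1.0 + kappa)  # weights sum to 1
    y = f(sigma)
    my = np.dot(wm, y)                            # transformed mean
    vy = np.dot(wm, (y - my) ** 2)                # transformed variance
    return my, vy
```

For a linear f the transform is exact, and it also recovers the exact mean of any quadratic, which is why UKF-based proposals track the latest observation so well.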
Particle Filters for State Estimation of Jump Markov Linear Systems
, 2001
Abstract

Cited by 177 (15 self)
Jump Markov linear systems (JMLS) are linear systems whose parameters evolve with time according to a finite state Markov chain. In this paper, our aim is to recursively compute optimal state estimates for this class of systems. We present efficient simulation-based algorithms called particle filters to solve the optimal filtering problem as well as the optimal fixed-lag smoothing problem. Our algorithms combine sequential importance sampling, a selection scheme, and Markov chain Monte Carlo methods. They use several variance reduction methods to make the most of the statistical structure of JMLS. Computer...