Results 1–10 of 53
On Sequential Monte Carlo Sampling Methods for Bayesian Filtering
Statistics and Computing, 2000
Cited by 1051 (76 self)
Abstract
In this article, we present an overview of methods for sequential simulation from posterior distributions. These methods are of particular interest in Bayesian filtering for discrete-time dynamic models that are typically nonlinear and non-Gaussian. A general importance sampling framework is developed that unifies many of the methods which have been proposed over the last few decades in several different scientific disciplines. Novel extensions to the existing methods are also proposed. We show in particular how to incorporate local linearisation methods similar to those which have previously been employed in the deterministic filtering literature; these lead to very effective importance distributions. Furthermore we describe a method which uses Rao-Blackwellisation in order to take advantage of the analytic structure present in some important classes of state-space models. In a final section we develop algorithms for prediction, smoothing and evaluation of the likelihood in dynamic models.
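The sequential importance sampling framework surveyed above can be illustrated with a minimal bootstrap particle filter: propagate particles through the state dynamics, weight them by the observation likelihood, and resample. The sketch below is not the paper's algorithm; the scalar state-space model, its parameters, and the function name are illustrative assumptions.

```python
import numpy as np

def bootstrap_particle_filter(ys, n_particles=500, sigma_v=1.0, sigma_w=1.0, seed=0):
    """Minimal SIS particle filter with multinomial resampling for the
    illustrative scalar model x_t = 0.5*x_{t-1} + v_t, y_t = x_t + w_t,
    with Gaussian noises v_t, w_t (all choices here are assumptions)."""
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, 1.0, n_particles)                   # initial particle cloud
    means = []
    for y in ys:
        x = 0.5 * x + rng.normal(0.0, sigma_v, n_particles)  # propagate via the prior (bootstrap proposal)
        logw = -0.5 * ((y - x) / sigma_w) ** 2               # importance weights ∝ likelihood
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means.append(float(np.dot(w, x)))                    # weighted posterior-mean estimate
        idx = rng.choice(n_particles, n_particles, p=w)      # multinomial resampling
        x = x[idx]
    return means
```

With the bootstrap proposal the weights reduce to the likelihood alone; the local-linearisation proposals discussed in the abstract replace this prior proposal with one that also conditions on the current observation.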
Dynamic Bayesian Networks: Representation, Inference and Learning
2002
Cited by 770 (3 self)
Abstract
Modelling sequential data is important in many areas of science and engineering. Hidden Markov models (HMMs) and Kalman filter models (KFMs) are popular for this because they are simple and flexible. For example, HMMs have been used for speech recognition and biosequence analysis, and KFMs have been used for problems ranging from tracking planes and missiles to predicting the economy. However, HMMs and KFMs are limited in their “expressive power”. Dynamic Bayesian Networks (DBNs) generalize HMMs by allowing the state space to be represented in factored form, instead of as a single discrete random variable. DBNs generalize KFMs by allowing arbitrary probability distributions, not just (unimodal) linear-Gaussian. In this thesis, I will discuss how to represent many different kinds of models as DBNs, how to perform exact and approximate inference in DBNs, and how to learn DBN models from sequential data.

In particular, the main novel technical contributions of this thesis are as follows: a way of representing hierarchical HMMs as DBNs, which enables inference to be done in O(T) time instead of O(T^3), where T is the length of the sequence; an exact smoothing algorithm that takes O(log T) space instead of O(T); a simple way of using the junction tree algorithm for online inference in DBNs; new complexity bounds on exact online inference in DBNs; a new deterministic approximate inference algorithm called factored frontier; an analysis of the relationship between the BK algorithm and loopy belief propagation; a way of applying Rao-Blackwellised particle filtering to DBNs in general, and the SLAM (simultaneous localization and mapping) problem in particular; a way of extending the structural EM algorithm to DBNs; and a variety of different applications of DBNs. However, perhaps the main value of the thesis is its catholic presentation of the field of sequential data modelling.
On Sequential Simulation-Based Methods for Bayesian Filtering
1998
Cited by 251 (12 self)
Abstract
In this report, we present an overview of sequential simulation-based methods for Bayesian filtering of nonlinear and non-Gaussian dynamic models. It places within a general framework numerous methods proposed independently in various areas of science, and proposes some original developments.
Mixture Kalman filters
2000
Cited by 224 (8 self)
Abstract
In treating dynamic systems, sequential Monte Carlo methods use discrete samples to represent a complicated probability distribution and use rejection sampling, importance sampling and weighted resampling to complete the online 'filtering' task. We propose a special sequential Monte Carlo method, the mixture Kalman filter, which uses a random mixture of Gaussian distributions to approximate a target distribution. It is designed for online estimation and prediction of conditional and partial conditional dynamic linear models, which are themselves a class of widely used nonlinear systems and also serve to approximate many others. Compared with a few available filtering methods, including Monte Carlo methods, the gain in efficiency provided by the mixture Kalman filter can be very substantial. Another contribution of the paper is the formulation of many nonlinear systems into conditional or partial conditional linear form, to which the mixture Kalman filter can be applied. Examples in target tracking and digital communications are given to demonstrate the procedures proposed.
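The random-mixture-of-Gaussians idea can be sketched as follows: each particle carries a sampled discrete indicator history together with the mean and variance of a Kalman filter run conditional on that history, so the continuous state is never sampled. This is a minimal illustration under an assumed scalar switching-observation-noise model, not the paper's procedure; the model, parameters, and function name are assumptions.

```python
import numpy as np

def mixture_kalman_filter(ys, a=0.9, q=0.1, r=(0.1, 4.0), n_particles=200, seed=0):
    """Sketch of a mixture Kalman filter for the assumed scalar model
    x_t = a*x_{t-1} + v_t, y_t = x_t + w_t, where Var(w_t) = r[lam_t]
    and lam_t is a latent 0/1 'outlier' indicator with a uniform prior."""
    rng = np.random.default_rng(seed)
    m = np.zeros(n_particles)            # per-particle Kalman means
    P = np.ones(n_particles)             # per-particle Kalman variances
    estimates = []
    for y in ys:
        lam = rng.integers(0, 2, n_particles)        # sample indicators from their prior
        m_pred = a * m
        P_pred = a * a * P + q
        S = P_pred + np.asarray(r)[lam]              # innovation variance per particle
        w = np.exp(-0.5 * (y - m_pred) ** 2 / S) / np.sqrt(S)  # predictive likelihood weight
        w /= w.sum()
        K = P_pred / S                               # Kalman gain: x_t handled analytically
        m = m_pred + K * (y - m_pred)
        P = (1.0 - K) * P_pred
        estimates.append(float(np.dot(w, m)))        # mixture-of-Gaussians mean estimate
        idx = rng.choice(n_particles, n_particles, p=w)  # resample indicator histories
        m, P = m[idx], P[idx]
    return estimates
```

Because only the low-dimensional indicator is sampled while the linear-Gaussian part is integrated out exactly, the Monte Carlo variance is much lower than sampling the full state, which is the source of the efficiency gain described above.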
The Unscented Particle Filter
2000
Cited by 211 (8 self)
Abstract
In this paper, we propose a new particle filter based on sequential importance sampling. The algorithm uses a bank of unscented filters to obtain the importance proposal distribution. This proposal has two very "nice" properties. Firstly, it makes efficient use of the latest available information and, secondly, it can have heavy tails. As a result, we find that the algorithm outperforms standard particle filtering and other nonlinear filtering methods very substantially. This experimental finding is in agreement with the theoretical convergence proof for the algorithm. The algorithm also includes resampling and (possibly) Markov chain Monte Carlo (MCMC) steps.
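The unscented filters in the proposal bank rest on the unscented transform, which pushes a small set of sigma points through a nonlinearity to approximate the transformed mean and covariance. Below is a minimal scalar sketch (the function name and the `kappa` default are illustrative assumptions); for a linear map it recovers the exact moments.

```python
import numpy as np

def unscented_transform(mu, var, f, kappa=1.0):
    """Propagate the scalar Gaussian N(mu, var) through a nonlinearity f
    using 2n+1 sigma points (here n = 1); kappa is a spread parameter.
    Returns the approximate mean and variance of f(x)."""
    n = 1
    spread = np.sqrt((n + kappa) * var)
    pts = np.array([mu, mu + spread, mu - spread])   # sigma points
    wts = np.array([kappa, 0.5, 0.5]) / (n + kappa)  # matching weights
    fpts = f(pts)
    mean = float(np.dot(wts, fpts))
    cov = float(np.dot(wts, (fpts - mean) ** 2))
    return mean, cov
```

An unscented Kalman filter applies this transform to the prediction and update steps; the unscented particle filter then uses one such filter per particle to build an observation-aware importance proposal.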
Particle Filters for State Estimation of Jump Markov Linear Systems
2001
Cited by 177 (15 self)
Abstract
Jump Markov linear systems (JMLS) are linear systems whose parameters evolve with time according to a finite-state Markov chain. In this paper, our aim is to recursively compute optimal state estimates for this class of systems. We present efficient simulation-based algorithms called particle filters to solve the optimal filtering problem as well as the optimal fixed-lag smoothing problem. Our algorithms combine sequential importance sampling, a selection scheme, and Markov chain Monte Carlo methods. They use several variance reduction methods to make the most of the statistical structure of JMLS. Computer ...
Rao-Blackwellised Particle Filtering for Fault Diagnosis
IEEE Aerospace, 2001
Cited by 62 (1 self)
Abstract
We tackle the fault diagnosis problem using conditionally Gaussian state space models and an efficient Monte Carlo method known as Rao-Blackwellised particle filtering. In this setting, there is a different linear-Gaussian state space model for each possible discrete state of operation. The task of diagnosis is to identify the discrete state of operation using the continuous measurements corrupted by Gaussian noise.
Particle filters for mixture models with an unknown number of components
Statistics and Computing, 2003
Cited by 59 (3 self)
Abstract
We consider the analysis of data under mixture models where the number of components in the mixture is unknown. We concentrate on mixture Dirichlet process models, and in particular we consider such models under conjugate priors. This conjugacy enables us to integrate out many of the parameters in the model, and to discretize the posterior distribution. Particle filters are particularly well suited to such discrete problems, and we propose the use of the particle filter of Fearnhead and Clifford for this problem. The performance of this particle filter, when analyzing both simulated and real data from a Gaussian mixture model, is uniformly better than that of the particle filter algorithm of Chen and Liu. In many situations it outperforms a Gibbs sampler. We also show how models without the required amount of conjugacy can be efficiently analyzed by the same particle filter algorithm.
Rao-Blackwellized Particle Filter for Multiple Target Tracking
Information Fusion Journal, 2005
Cited by 52 (4 self)
Abstract
In this article we propose a new Rao-Blackwellized particle filtering based algorithm for tracking an unknown number of targets. The algorithm is based on formulating probabilistic stochastic process models for target states, data associations, and birth and death processes. The tracking of these stochastic processes is implemented using sequential Monte Carlo sampling or particle filtering, and the efficiency of the Monte Carlo sampling is improved by using Rao-Blackwellization.
A Survey of Maneuvering Target Tracking, Part V: Multiple-Model Methods
2003
Cited by 52 (2 self)
Abstract
... without addressing the so-called measurement-origin uncertainty. Part I and Part II deal with target motion models. Part III covers measurement models and associated techniques. Part IV is concerned with tracking techniques that are based on decisions regarding target maneuvers. This part surveys the multiple-model methods, that is, the use of multiple models (and filters) simultaneously, which has been the prevailing approach to maneuvering target tracking in recent years. The survey is presented in a structured way, centered around three generations of algorithms: autonomous, cooperating, and variable structure. It emphasizes the underpinnings of each algorithm and covers various issues in algorithm design, application, and performance.