Results 1–10 of 15
On Sequential Monte Carlo Sampling Methods for Bayesian Filtering
 STATISTICS AND COMPUTING
2000
Abstract

Cited by 1051 (76 self)
In this article, we present an overview of methods for sequential simulation from posterior distributions. These methods are of particular interest in Bayesian filtering for discrete-time dynamic models that are typically nonlinear and non-Gaussian. A general importance sampling framework is developed that unifies many of the methods which have been proposed over the last few decades in several different scientific disciplines. Novel extensions to the existing methods are also proposed. We show in particular how to incorporate local linearisation methods similar to those which have previously been employed in the deterministic filtering literature; these lead to very effective importance distributions. Furthermore, we describe a method which uses Rao-Blackwellisation in order to take advantage of the analytic structure present in some important classes of state-space models. In a final section we develop algorithms for prediction, smoothing and evaluation of the likelihood in dynamic models.
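The sequential importance sampling framework the abstract describes can be sketched for a toy problem. The scalar model below (AR(1) state, Gaussian noise) and all parameter names are illustrative assumptions, not the paper's:

```python
import numpy as np

def bootstrap_particle_filter(ys, n_particles=500, q=1.0, r=1.0, seed=0):
    """Minimal SIS-with-resampling (bootstrap) filter for the toy model
    x_t = 0.9 x_{t-1} + v_t,  y_t = x_t + w_t,  v ~ N(0, q), w ~ N(0, r)."""
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, 1.0, n_particles)   # initial particle cloud
    means = []
    for y in ys:
        # propagate particles through the transition prior (the proposal)
        x = 0.9 * x + rng.normal(0.0, np.sqrt(q), n_particles)
        # importance weights: likelihood of the observation under each particle
        w = np.exp(-0.5 * (y - x) ** 2 / r)
        w /= w.sum()
        means.append(float(np.dot(w, x)))    # weighted posterior-mean estimate
        # multinomial resampling to counter weight degeneracy
        x = x[rng.choice(n_particles, n_particles, p=w)]
    return means
```

Using the transition prior as the proposal is the simplest choice; the local-linearisation proposals discussed in the paper replace that propagation step with a better-informed importance distribution.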
On Sequential Simulation-Based Methods for Bayesian Filtering
1998
Abstract

Cited by 251 (12 self)
In this report, we present an overview of sequential simulation-based methods for Bayesian filtering of nonlinear and non-Gaussian dynamic models. It places within a general framework numerous methods proposed independently in various areas of science and proposes some original developments.
The Unscented Particle Filter
2000
Abstract

Cited by 211 (8 self)
In this paper, we propose a new particle filter based on sequential importance sampling. The algorithm uses a bank of unscented filters to obtain the importance proposal distribution. This proposal has two very "nice" properties. Firstly, it makes efficient use of the latest available information and, secondly, it can have heavy tails. As a result, we find that the algorithm outperforms standard particle filtering and other nonlinear filtering methods very substantially. This experimental finding is in agreement with the theoretical convergence proof for the algorithm. The algorithm also includes resampling and (possibly) Markov chain Monte Carlo (MCMC) steps.
Particle Filters for State Estimation of Jump Markov Linear Systems
2001
Abstract

Cited by 177 (15 self)
Jump Markov linear systems (JMLS) are linear systems whose parameters evolve with time according to a finite-state Markov chain. In this paper, our aim is to recursively compute optimal state estimates for this class of systems. We present efficient simulation-based algorithms called particle filters to solve the optimal filtering problem as well as the optimal fixed-lag smoothing problem. Our algorithms combine sequential importance sampling, a selection scheme, and Markov chain Monte Carlo methods. They use several variance reduction methods to make the most of the statistical structure of JMLS.
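The variance-reduction idea the abstract alludes to can be sketched for a scalar JMLS: sample only the discrete mode sequence with particles, and filter the conditionally linear-Gaussian state exactly with a per-particle Kalman recursion (a Rao-Blackwellised scheme). The model parameters below are illustrative assumptions:

```python
import numpy as np

def rbpf_jmls(ys, a=(0.5, 0.95), P=((0.9, 0.1), (0.1, 0.9)),
              q=0.1, r=0.5, n=500, seed=0):
    """Rao-Blackwellised particle filter for the scalar JMLS
    x_t = a[s_t] x_{t-1} + v_t,  y_t = x_t + w_t,  s_t a Markov chain."""
    rng = np.random.default_rng(seed)
    P = np.asarray(P)
    s = rng.integers(0, 2, n)       # sampled discrete modes, one per particle
    m = np.zeros(n)                 # per-particle Kalman means
    V = np.ones(n)                  # per-particle Kalman variances
    xhat = []
    for y in ys:
        # sample the next mode of each particle from the Markov chain
        s = np.array([rng.choice(2, p=P[si]) for si in s])
        A = np.array(a)[s]
        # Kalman prediction, conditional on the sampled mode sequence
        m_pred = A * m
        V_pred = A ** 2 * V + q
        # weight = Gaussian predictive likelihood of y (variance V_pred + r)
        S = V_pred + r
        w = np.exp(-0.5 * (y - m_pred) ** 2 / S) / np.sqrt(S)
        w /= w.sum()
        # Kalman update of the conditionally linear state
        K = V_pred / S
        m = m_pred + K * (y - m_pred)
        V = (1.0 - K) * V_pred
        xhat.append(float(np.dot(w, m)))
        # resample particles (modes together with their Kalman statistics)
        idx = rng.choice(n, n, p=w)
        s, m, V = s[idx], m[idx], V[idx]
    return xhat
```

Because the continuous state is integrated out analytically, the particles only have to explore the low-dimensional mode sequence, which is where the variance reduction comes from.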
Sequential Monte Carlo Methods to Train Neural Network Models
2000
Abstract

Cited by 45 (7 self)
We discuss a novel strategy for training neural networks using sequential Monte Carlo algorithms and propose a new hybrid gradient descent / sampling importance resampling algorithm (HySIR). In terms of computational time and accuracy, the hybrid SIR is a clear improvement over conventional sequential Monte Carlo techniques. The new algorithm may be viewed as a global optimization strategy that allows us to learn the probability distributions of the network weights and outputs in a sequential framework. It is well suited to applications involving on-line, nonlinear, and non-Gaussian signal processing. We show how the new algorithm outperforms extended Kalman filter training on several problems. In particular, we address the problem of pricing option contracts traded in financial markets. In this context, we are able to estimate the one-step-ahead probability density functions of the option prices.
Optimal Estimation and Cramér-Rao Bounds for Partial Non-Gaussian State-Space Models
2001
Abstract

Cited by 14 (4 self)
Partial non-Gaussian state-space models include many models of interest while keeping a convenient analytical structure. In this paper, two problems related to partial non-Gaussian models are addressed. First, we present an efficient sequential Monte Carlo method to perform Bayesian inference. Second, we derive simple recursions to compute posterior Cramér-Rao bounds (PCRB). An application to jump Markov linear systems (JMLS) is given.
Sequential Monte Carlo Methods For Optimisation Of Neural Network Models
1998
Abstract

Cited by 13 (0 self)
We discuss a novel strategy for training neural networks using sequential Monte Carlo algorithms and propose a new hybrid gradient descent / sampling importance resampling algorithm (HySIR). In terms of both computational time and accuracy, the hybrid SIR is a clear improvement over conventional sequential Monte Carlo techniques. The new algorithm may be viewed as a global optimisation strategy, which allows us to learn the probability distributions of the network weights and outputs in a sequential framework. It is well suited to applications involving on-line, nonlinear and non-Gaussian signal processing. We show how the new algorithm outperforms extended Kalman filter training on several problems. In particular, we address the problem of pricing option contracts traded in financial markets. In this context, we are able to estimate the one-step-ahead probability density functions of the option prices.
Bayesian Methods for Neural Networks
1999
Abstract

Cited by 12 (0 self)
The application of the Bayesian learning paradigm to neural networks results in a flexible and powerful nonlinear modelling framework that can be used for regression, density estimation, prediction and classification. Within this framework, all sources of uncertainty are expressed and measured by probabilities. This formulation allows for a probabilistic treatment of our a priori knowledge, domain-specific knowledge, model selection schemes, parameter estimation methods and noise estimation techniques. Many researchers have contributed towards the development of the Bayesian learning approach for neural networks. This thesis advances this research by proposing several novel extensions in the areas of sequential learning, model selection, optimisation and convergence assessment. The first contribution is a regularisation strategy for sequential learning based on extended Kalman filtering and noise estimation via evidence maximisation. Using the expectation maximisation (EM) algorithm, a similar algorithm is derived for batch learning. Much of the thesis is, however, devoted to Monte Carlo simulation methods. A robust Bayesian method is proposed to estimate,
Runtime Verification with Particle Filtering
Abstract

Cited by 1 (0 self)
Abstract. We introduce Runtime Verification with Particle Filtering (RVPF), a powerful and versatile method for controlling the tradeoff between uncertainty and overhead in runtime verification. Overhead and accuracy are controlled by adjusting the frequency and duration of observation gaps, during which program events are not monitored, and by adjusting the number of particles used in the RVPF algorithm. We succinctly represent the program model, the program monitor, their interaction, and their observations as a dynamic Bayesian network (DBN). Our formulation of RVPF in terms of DBNs is essential for a proper formalization of peek events: lowcost observations of parts of the program state, which are performed probabilistically at the end of observation gaps. Peek events provide information that our algorithm uses to reduce the uncertainty in the monitor state after gaps. We estimate the internal state of the DBN using particle filtering (PF) with sequential importance resampling (SIR). PF uses a collection of conceptual particles (random samples) to estimate the probability distribution for the system’s current state: the probability of a state is given by the sum of the importance weights of the particles in that state. After an observed event, each particle chooses a state transition to execute by sampling the DBN’s joint transition probability distribution; particles are then redistributed among the states that best predicted the current observation. SIR exploits the DBN structure and the current observation to reduce the variance of the PF and increase its performance. We experimentally compare the overhead and accuracy of our RVPF algorithm with two previous approaches to runtime verification with state estimation: an exact algorithm based on the forward algorithm for HMMs, and an approximate version of that algorithm, which uses precomputation to reduce runtime overhead. 
Our results confirm RVPF's versatility, showing how it can be used to control the tradeoff between execution time and memory usage while, at the same time, being the most accurate of the three algorithms.
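The SIR loop the abstract outlines — propagate each particle through the model's transition distribution, weight by the observation likelihood, then resample toward the states that best predicted the observation — can be sketched for a hypothetical two-state monitor model. The transition and observation matrices below are illustrative assumptions, not taken from RVPF:

```python
import numpy as np

# Hypothetical 2-state model: T[i][j] = P(next state j | state i),
# O[i][k] = P(observation k | state i).
T = np.array([[0.9, 0.1],
              [0.2, 0.8]])
O = np.array([[0.8, 0.2],
              [0.3, 0.7]])

def sir_state_estimate(obs_seq, n_particles=1000, seed=0):
    """Estimate P(state | observations) by particle filtering with SIR."""
    rng = np.random.default_rng(seed)
    particles = rng.integers(0, 2, n_particles)  # uniform initial belief
    for obs in obs_seq:
        # each particle samples a state transition from the model
        particles = np.array([rng.choice(2, p=T[s]) for s in particles])
        # importance weights: how well each particle predicts the observation
        w = O[particles, obs]
        w = w / w.sum()
        # resample: redistribute particles toward well-predicting states
        particles = particles[rng.choice(n_particles, n_particles, p=w)]
    # estimated state distribution = particle frequencies
    return np.bincount(particles, minlength=2) / n_particles
```

After several observations favouring one state, the particle frequencies concentrate on it, which is the "probability of a state is the sum of the importance weights of its particles" reading described above.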
PROBABILITY HYPOTHESIS DENSITY FILTERING FOR REAL-TIME TRAFFIC STATE ESTIMATION AND PREDICTION
Abstract
The probability hypothesis density (PHD) methodology is widely used by the research community for the purposes of multiple-object tracking. This problem consists of the recursive state estimation of several targets using the information coming from an observation process. The purpose of this paper is to investigate the potential of PHD filters for real-time traffic state estimation. This investigation is based on a Cell Transmission Model (CTM) coupled with the PHD filter. It brings a novel tool to the state estimation problem and allows one to estimate the densities in traffic networks in the presence of measurement origin uncertainty, detection uncertainty and noise. In this work, we compare the PHD filter's performance with a particle filter (PF), both taking into account the measurement origin uncertainty, and show that they can provide high accuracy in a traffic setting at real-time computational cost. The PHD filtering framework opens new research avenues and has the ability to solve challenging problems of vehicular networks.