Results 1–10 of 157
A tutorial on particle filters for online nonlinear/non-Gaussian Bayesian tracking
 IEEE TRANSACTIONS ON SIGNAL PROCESSING
, 2002
"... Increasingly, for many application areas, it is becoming important to include elements of nonlinearity and nonGaussianity in order to model accurately the underlying dynamics of a physical system. Moreover, it is typically crucial to process data online as it arrives, both from the point of view o ..."
Abstract

Cited by 2006 (2 self)
Increasingly, for many application areas, it is becoming important to include elements of nonlinearity and non-Gaussianity in order to model accurately the underlying dynamics of a physical system. Moreover, it is typically crucial to process data online as it arrives, both from the point of view of storage costs as well as for rapid adaptation to changing signal characteristics. In this paper, we review both optimal and suboptimal Bayesian algorithms for nonlinear/non-Gaussian tracking problems, with a focus on particle filters. Particle filters are sequential Monte Carlo methods based on point mass (or “particle”) representations of probability densities, which can be applied to any state-space model and which generalize the traditional Kalman filtering methods. Several variants of the particle filter such as SIR, ASIR, and RPF are introduced within a generic framework of the sequential importance sampling (SIS) algorithm. These are discussed and compared with the standard EKF through an illustrative example.
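As a concrete sketch of the sampling importance resampling (SIR) filter this abstract describes, the following Python example runs a bootstrap particle filter on a scalar nonlinear/non-Gaussian benchmark model. The model, noise variances, and particle count are illustrative assumptions, not specifics taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x, t):
    # Nonlinear state transition (a benchmark model common in the particle filter literature)
    return 0.5 * x + 25 * x / (1 + x**2) + 8 * np.cos(1.2 * t)

def h(x):
    # Nonlinear observation function
    return x**2 / 20.0

T, N = 50, 500          # time steps, number of particles
q, r = 10.0, 1.0        # process and observation noise variances (assumed)

# Simulate one state trajectory and its observations
x_true = np.zeros(T)
y = np.zeros(T)
for t in range(1, T):
    x_true[t] = f(x_true[t-1], t) + rng.normal(0, np.sqrt(q))
    y[t] = h(x_true[t]) + rng.normal(0, np.sqrt(r))

# SIR particle filter: propagate, weight by the likelihood, resample
particles = rng.normal(0, np.sqrt(q), N)
est = np.zeros(T)
for t in range(1, T):
    particles = f(particles, t) + rng.normal(0, np.sqrt(q), N)
    logw = -0.5 * (y[t] - h(particles))**2 / r      # Gaussian log-likelihood
    w = np.exp(logw - logw.max())
    w /= w.sum()
    est[t] = np.sum(w * particles)                  # weighted posterior-mean estimate
    idx = rng.choice(N, size=N, p=w)                # multinomial resampling
    particles = particles[idx]
```

Because the observation is quadratic in the state, the filtering posterior here is bimodal in sign, which is exactly the kind of structure an EKF cannot represent.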
On Sequential Monte Carlo Sampling Methods for Bayesian Filtering
 STATISTICS AND COMPUTING
, 2000
"... In this article, we present an overview of methods for sequential simulation from posterior distributions. These methods are of particular interest in Bayesian filtering for discrete time dynamic models that are typically nonlinear and nonGaussian. A general importance sampling framework is develop ..."
Abstract

Cited by 1051 (76 self)
In this article, we present an overview of methods for sequential simulation from posterior distributions. These methods are of particular interest in Bayesian filtering for discrete time dynamic models that are typically nonlinear and non-Gaussian. A general importance sampling framework is developed that unifies many of the methods which have been proposed over the last few decades in several different scientific disciplines. Novel extensions to the existing methods are also proposed. We show in particular how to incorporate local linearisation methods similar to those which have previously been employed in the deterministic filtering literature; these lead to very effective importance distributions. Furthermore we describe a method which uses Rao-Blackwellisation in order to take advantage of the analytic structure present in some important classes of state-space models. In a final section we develop algorithms for prediction, smoothing and evaluation of the likelihood in dynamic models.
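A standard ingredient of the sequential importance sampling framework surveyed here is monitoring weight degeneracy via the effective sample size and resampling only when it drops too low. A minimal sketch follows; the N/2 threshold and the hypothetical log-weights are illustrative assumptions, not values from the paper.

```python
import numpy as np

def ess(logw):
    """Effective sample size computed from unnormalised log importance weights."""
    w = np.exp(logw - logw.max())   # subtract max for numerical stability
    w /= w.sum()
    return 1.0 / np.sum(w**2)

rng = np.random.default_rng(1)
logw = rng.normal(size=1000)        # hypothetical log-weights after one SIS step
n_eff = ess(logw)

# Resample only when the weights have degenerated (adaptive resampling)
if n_eff < 0.5 * len(logw):
    w = np.exp(logw - logw.max())
    w /= w.sum()
    idx = rng.choice(len(logw), size=len(logw), p=w)
    logw = np.zeros(len(logw))      # weights reset to uniform after resampling
```

The ESS equals the particle count when the weights are uniform and approaches 1 when a single particle carries all the mass, which makes it a convenient degeneracy diagnostic.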
Implementing approximate Bayesian inference for latent Gaussian models using integrated nested Laplace approximations: A manual for the inla program
, 2008
"... Structured additive regression models are perhaps the most commonly used class of models in statistical applications. It includes, among others, (generalised) linear models, (generalised) additive models, smoothingspline models, statespace models, semiparametric regression, spatial and spatiotemp ..."
Abstract

Cited by 294 (20 self)
Structured additive regression models are perhaps the most commonly used class of models in statistical applications. It includes, among others, (generalised) linear models, (generalised) additive models, smoothing-spline models, state-space models, semiparametric regression, spatial and spatiotemporal models, log-Gaussian Cox processes, geostatistical and geoadditive models. In this paper we consider approximate Bayesian inference in a popular subset of structured additive regression models, latent Gaussian models, where the latent field is Gaussian, controlled by a few hyperparameters and with non-Gaussian response variables. The posterior marginals are not available in closed form due to the non-Gaussian response variables. For such models, Markov chain Monte Carlo methods can be implemented, but they are not without problems, both in terms of convergence and computational time. In some practical applications, the extent of these problems is such that Markov chain Monte Carlo is simply not an appropriate tool for routine analysis. We show that, by using an integrated nested Laplace approximation and its simplified version, we can directly compute very accurate approximations to the posterior marginals. The main benefit of these approximations ...
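The core building block of the integrated nested Laplace approximation is a Gaussian approximation around a posterior mode. As a toy one-dimensional stand-in for the latent field, the sketch below Laplace-approximates the posterior of a log-rate under a Poisson observation and a Gaussian prior; the model and all numbers are illustrative assumptions, not the paper's.

```python
import numpy as np

# Laplace approximation to p(theta | y) for y ~ Poisson(exp(theta)),
# with the Gaussian prior theta ~ N(mu0, s0^2).
y, mu0, s0 = 7, 0.0, 2.0

def grad(theta):
    # d/dtheta of the log-posterior: y*theta - exp(theta) - (theta - mu0)^2 / (2 s0^2)
    return y - np.exp(theta) - (theta - mu0) / s0**2

def hess(theta):
    # Second derivative; strictly negative, so the log-posterior is concave
    return -np.exp(theta) - 1.0 / s0**2

theta = 0.0
for _ in range(50):                 # Newton iterations to the posterior mode
    theta -= grad(theta) / hess(theta)

mode = theta
sd = np.sqrt(-1.0 / hess(mode))     # Laplace approximation: N(mode, sd^2)
```

The resulting Gaussian N(mode, sd^2) plays the role that the Gaussian approximation to the full latent field plays inside INLA, where it is then combined with numerical integration over the hyperparameters.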
Statistical algorithms for models in state space using SsfPack 2.2
 ECONOMETRICS JOURNAL (1999), VOLUME 2, PP. 113–166.
, 1999
"... This paper discusses and documents the algorithms of SsfPack 2.2. SsfPack is a suite of C routines for carrying out computations involving the statistical analysis of univariate and multivariate models in state space form. The emphasis is on documenting the link we have made to the Ox computing en ..."
Abstract

Cited by 201 (34 self)
This paper discusses and documents the algorithms of SsfPack 2.2. SsfPack is a suite of C routines for carrying out computations involving the statistical analysis of univariate and multivariate models in state space form. The emphasis is on documenting the link we have made to the Ox computing environment. SsfPack allows for a full range of different state space forms: from a simple time-invariant model to a complicated time-varying model. Functions can be used which put standard models such as ARMA and cubic spline models in state space form. Basic functions are available for filtering, moment smoothing and simulation smoothing. Ready-to-use functions are provided for standard tasks such as likelihood evaluation, forecasting and signal extraction. We show that SsfPack can be easily used for implementing, fitting and analysing Gaussian models relevant to many areas of econometrics and statistics. Some Gaussian illustrations are given.
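The filtering recursion that packages such as SsfPack implement can be illustrated on the simplest time-invariant state space form, the local level model. The following self-contained Python sketch (not SsfPack/Ox code; all parameter values are assumed) runs the Kalman filter and accumulates the log-likelihood via the prediction error decomposition.

```python
import numpy as np

# Local level model: x_t = x_{t-1} + eta_t, y_t = x_t + eps_t
rng = np.random.default_rng(2)
T, q, r = 100, 0.5, 1.0                 # length, state and observation variances
x = np.cumsum(rng.normal(0, np.sqrt(q), T))   # simulated random-walk level
y = x + rng.normal(0, np.sqrt(r), T)          # noisy observations

a, P = 0.0, 1e6                         # nearly diffuse initial state and variance
filt = np.zeros(T)
loglik = 0.0
for t in range(T):
    P = P + q                           # prediction step
    v = y[t] - a                        # one-step-ahead innovation
    F = P + r                           # innovation variance
    K = P / F                           # Kalman gain
    a = a + K * v                       # measurement update of the state
    P = (1 - K) * P                     # measurement update of its variance
    filt[t] = a
    loglik += -0.5 * (np.log(2 * np.pi * F) + v**2 / F)
```

The same prediction/update pair, written in matrix form, covers every model SsfPack handles; maximising `loglik` over (q, r) would give the maximum likelihood estimates.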
A point process framework for relating neural spiking activity to spiking history, neural ensemble, and extrinsic covariate effects
 J Neurophysiol.,
, 2005
"... ..."
(Show Context)
Particle Filters for State Estimation of Jump Markov Linear Systems
, 2001
"... Jump Markov linear systems (JMLS) are linear systems whose parameters evolve with time according to a finite state Markov chain. In this paper, our aim is to recursively compute optimal state estimates for this class of systems. We present efficient simulationbased algorithms called particle filter ..."
Abstract

Cited by 177 (15 self)
Jump Markov linear systems (JMLS) are linear systems whose parameters evolve with time according to a finite state Markov chain. In this paper, our aim is to recursively compute optimal state estimates for this class of systems. We present efficient simulation-based algorithms called particle filters to solve the optimal filtering problem as well as the optimal fixed-lag smoothing problem. Our algorithms combine sequential importance sampling, a selection scheme, and Markov chain Monte Carlo methods. They use several variance reduction methods to make the most of the statistical structure of JMLS. Computer ...
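One of the variance reduction ideas relevant to JMLS is marginalising the linear state analytically and letting particles carry only the discrete mode (Rao-Blackwellisation). The sketch below does this for a hypothetical scalar two-mode system; the coefficients, transition matrix, and noise variances are all illustrative assumptions, not from the paper.

```python
import numpy as np

# Scalar JMLS: mode z_t follows a 2-state Markov chain,
# x_t = a[z_t] * x_{t-1} + v_t, y_t = x_t + w_t.
rng = np.random.default_rng(3)
a = np.array([0.9, -0.5])                       # per-mode AR coefficients (assumed)
Pz = np.array([[0.95, 0.05], [0.10, 0.90]])     # mode transition matrix (assumed)
q, r, T, N = 0.1, 0.5, 60, 300

# Simulate data from the model
z, x = 0, 0.0
y = np.zeros(T)
for t in range(T):
    z = rng.choice(2, p=Pz[z])
    x = a[z] * x + rng.normal(0, np.sqrt(q))
    y[t] = x + rng.normal(0, np.sqrt(r))

# Particles store a mode plus Kalman mean/variance for the marginalised state
modes = np.zeros(N, dtype=int)
m, P = np.zeros(N), np.ones(N)
est = np.zeros(T)
for t in range(T):
    modes = np.array([rng.choice(2, p=Pz[zi]) for zi in modes])  # propagate modes
    mp = a[modes] * m                       # Kalman prediction given sampled mode
    Pp = a[modes]**2 * P + q
    v = y[t] - mp                           # innovation per particle
    S = Pp + r                              # innovation variance -> particle weight
    logw = -0.5 * (np.log(2 * np.pi * S) + v**2 / S)
    w = np.exp(logw - logw.max())
    w /= w.sum()
    K = Pp / S                              # exact Kalman update per particle
    m = mp + K * v
    P = (1 - K) * Pp
    est[t] = np.sum(w * m)                  # marginalised state estimate
    idx = rng.choice(N, size=N, p=w)        # resample modes and Kalman statistics
    modes, m, P = modes[idx], m[idx], P[idx]
```

Because the continuous state is integrated out exactly, the Monte Carlo error comes only from the discrete mode sequence, which is the structure the paper's variance reduction methods exploit.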
Monte Carlo smoothing for nonlinear time series
 JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION
, 2004
"... We develop methods for performing smoothing computations in general statespace models. The methods rely on a particle representation of the filtering distributions, and their evolution through time using sequential importance sampling and resampling ideas. In particular, novel techniques are pr ..."
Abstract

Cited by 153 (16 self)
We develop methods for performing smoothing computations in general state-space models. The methods rely on a particle representation of the filtering distributions, and their evolution through time using sequential importance sampling and resampling ideas. In particular, novel techniques are presented for generation of sample realizations of historical state sequences. This is carried out in a forward-filtering, backward-smoothing procedure which can be viewed as the nonlinear, non-Gaussian counterpart of standard Kalman filter-based simulation smoothers in the linear Gaussian case. Convergence in the mean-squared error sense of the smoothed trajectories is proved, showing the validity of our proposed method. The methods are tested in a substantial application for the processing of speech signals represented by a time-varying autoregression and parameterised in terms of time-varying partial correlation coefficients, comparing the results of our algorithm with those from a simple smoother based upon the filtered trajectories.
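The forward-filtering, backward-simulation idea can be sketched compactly on a linear AR(1) model, where the transition density needed in the backward pass is Gaussian and available in closed form. Everything below (model, parameters, particle count) is an illustrative assumption, not the paper's speech application.

```python
import numpy as np

# Model: x_t = phi * x_{t-1} + v_t, y_t = x_t + w_t
rng = np.random.default_rng(4)
phi, q, r, T, N = 0.8, 0.3, 0.5, 40, 200

x = np.zeros(T)
y = np.zeros(T)
for t in range(1, T):
    x[t] = phi * x[t-1] + rng.normal(0, np.sqrt(q))
    y[t] = x[t] + rng.normal(0, np.sqrt(r))

# Forward pass: bootstrap filter, storing particle clouds and weights at each step
parts = np.zeros((T, N))
wts = np.full((T, N), 1.0 / N)
p = rng.normal(0, 1, N)
for t in range(T):
    if t > 0:
        p = phi * p + rng.normal(0, np.sqrt(q), N)
    logw = -0.5 * (y[t] - p)**2 / r
    w = np.exp(logw - logw.max())
    w /= w.sum()
    parts[t], wts[t] = p, w
    p = p[rng.choice(N, size=N, p=w)]       # resample before propagating

# Backward pass: sample one smoothed trajectory from the stored clouds,
# reweighting each cloud by the transition density to the sampled successor
traj = np.zeros(T)
j = rng.choice(N, p=wts[T-1])
traj[T-1] = parts[T-1, j]
for t in range(T - 2, -1, -1):
    logb = np.log(np.maximum(wts[t], 1e-300)) \
         - 0.5 * (traj[t+1] - phi * parts[t])**2 / q
    b = np.exp(logb - logb.max())
    b /= b.sum()
    j = rng.choice(N, p=b)
    traj[t] = parts[t, j]
```

Repeating the backward pass yields independent draws of the whole historical state sequence, which is exactly the kind of sample realization the abstract refers to.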
Akaike’s information criterion and recent developments in information complexity
 Journal of Mathematical Psychology
"... criterion (AIC). Then, we present some recent developments on a new entropic or information complexity (ICOMP) criterion of Bozdogan (1988a, 1988b, 1990, 1994d, 1996, 1998a, 1998b) for model selection. A rationale for ICOMP as a model selection criterion is that it combines a badnessoffit term (su ..."
Abstract

Cited by 113 (9 self)
criterion (AIC). Then, we present some recent developments on a new entropic or information complexity (ICOMP) criterion of Bozdogan (1988a, 1988b, 1990, 1994d, 1996, 1998a, 1998b) for model selection. A rationale for ICOMP as a model selection criterion is that it combines a badness-of-fit term (such as minus twice the maximum log likelihood) with a measure of complexity of a model differently than AIC, or its variants, by taking into account the interdependencies of the parameter estimates as well as the dependencies of the model residuals. We operationalize the general form of ICOMP based on the quantification of the concept of overall model complexity in terms of the estimated inverse Fisher information matrix. This approach results in an approximation to the sum of two Kullback-Leibler distances. Using the correlational form of the complexity, we further provide yet another form of ICOMP to take into account the interdependencies (i.e., correlations) among the parameter estimates of the model. Later, we illustrate the practical utility and the importance of this new model selection criterion by providing several ...
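The badness-of-fit-plus-penalty structure shared by AIC and ICOMP is easy to make concrete for AIC itself (ICOMP's complexity term additionally requires the estimated inverse Fisher information). A toy Python comparison follows; the data and the two candidate models are illustrative assumptions.

```python
import numpy as np

def aic(loglik, k):
    """Akaike's information criterion: minus twice the maximum log likelihood
    plus a penalty of two per estimated parameter."""
    return 2 * k - 2 * loglik

rng = np.random.default_rng(5)
data = rng.normal(1.0, 2.0, 200)        # true variance is 4

# Model 1: mean estimated, variance fixed at 1 (k = 1, misspecified)
ll1 = np.sum(-0.5 * (np.log(2 * np.pi) + (data - data.mean())**2))

# Model 2: mean and variance both estimated (k = 2)
s2 = data.var()                         # ML estimate of the variance
ll2 = np.sum(-0.5 * (np.log(2 * np.pi * s2) + (data - data.mean())**2 / s2))

aic1, aic2 = aic(ll1, 1), aic(ll2, 2)   # the smaller AIC wins
```

Here the extra parameter of Model 2 buys a large enough gain in log likelihood to overcome its penalty, so AIC prefers it, matching the data-generating process.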
Estimating a State-Space Model from Point Process Observations
, 2003
"... A widely used signal processing paradigm is the statespace model. The statespace model is defined by two equations: an observation equation that describes how the hidden state or latent process is observed and a state equation that defines the evolution of the process through time. Inspired by neu ..."
Abstract

Cited by 71 (9 self)
A widely used signal processing paradigm is the state-space model. The state-space model is defined by two equations: an observation equation that describes how the hidden state or latent process is observed and a state equation that defines the evolution of the process through time. Inspired by neurophysiology experiments in which neural spiking activity is induced by an implicit (latent) stimulus, we develop an algorithm to estimate a state-space model observed through point process measurements. We represent the latent process modulating the neural spiking activity as a Gaussian autoregressive model driven by an external stimulus. Given the latent process, neural spiking activity is characterized as a general point process defined by its conditional intensity function. We develop an approximate expectation-maximization (EM) algorithm to estimate the unobservable state-space process, its parameters, and the parameters of the point process. The EM algorithm combines a point process recursive nonlinear filter algorithm, the fixed interval smoothing algorithm, and the state-space covariance algorithm to compute the complete data log likelihood efficiently. We use a Kolmogorov-Smirnov test based on the time-rescaling theorem to evaluate agreement between the model and point process data. We illustrate the model with two simulated data examples: an ensemble of Poisson neurons driven by a common stimulus and a single neuron whose conditional intensity function is approximated as a local Bernoulli process.
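The time-rescaling goodness-of-fit check mentioned in the abstract can be sketched in a few lines: under a correctly specified conditional intensity, the rescaled interspike intervals are i.i.d. Exp(1), so their exponential CDF transform should be uniform. The constant intensity and simulated spikes below are illustrative assumptions, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(6)
lam = 5.0                                   # hypothetical constant intensity (spikes/s)
isi = rng.exponential(1.0 / lam, 500)       # simulated Poisson interspike intervals

# Rescale: tau_k = integral of lambda(t) over the k-th interval (= lam * isi here)
tau = lam * isi
u = np.sort(1.0 - np.exp(-tau))             # should be uniform on [0, 1] if the model fits

# Kolmogorov-Smirnov statistic against the uniform quantiles
n = len(u)
grid = (np.arange(1, n + 1) - 0.5) / n
ks = np.max(np.abs(u - grid))
# For a correct model, ks should fall within the 95% band, roughly 1.36 / sqrt(n)
```

Plotting `u` against `grid` gives the familiar KS plot used in the point process literature; systematic departures from the diagonal flag misspecification of the conditional intensity.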
Nonparametric estimation of large covariance matrices of longitudinal data.
 Biometrika,
, 2003
"... Summary Estimation of an unstructured covariance matrix is difficult because of its positivedefiniteness constraint. This obstacle is removed by regressing each variable on its predecessors, so that estimation of a covariance matrix is shown to be equivalent to that of estimating a sequence of vary ..."
Abstract

Cited by 70 (4 self)
Summary Estimation of an unstructured covariance matrix is difficult because of its positive-definiteness constraint. This obstacle is removed by regressing each variable on its predecessors, so that estimation of a covariance matrix is shown to be equivalent to that of estimating a sequence of varying-coefficient and varying-order regression models. Our framework is similar to the use of increasing-order autoregressive models in approximating the covariance matrix or the spectrum of a stationary time series. As an illustration, we adopt Fan & Zhang's (2000) two-step estimation of functional linear models and propose nonparametric estimators of covariance matrices which are guaranteed to be positive definite. For parsimony a suitable order for the sequence of (auto)regression models is found using penalised likelihood criteria like AIC and BIC. Some asymptotic results for the local polynomial estimators of components of a covariance matrix are established. Two longitudinal datasets are analysed to illustrate the methodology. A simulation study reveals the advantage of the nonparametric covariance estimator over the sample covariance matrix for large covariance matrices.
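The regress-each-variable-on-its-predecessors device can be sketched with ordinary least squares: the regression coefficients fill a unit lower-triangular matrix T and the residual variances a diagonal D, giving the estimate Sigma_hat = T^{-1} D T^{-T}, which is positive definite by construction. The dimensions and simulated data below are illustrative assumptions (the paper's estimators are nonparametric, not plain OLS).

```python
import numpy as np

rng = np.random.default_rng(7)
n, p = 300, 5
A = rng.normal(size=(p, p))
Sigma = A @ A.T + p * np.eye(p)            # true covariance for the simulation
X = rng.multivariate_normal(np.zeros(p), Sigma, size=n)

Tm = np.eye(p)                             # unit lower-triangular coefficient matrix
D = np.zeros(p)                            # residual (innovation) variances
D[0] = X[:, 0].var()
for j in range(1, p):
    # Least-squares regression of variable j on variables 0..j-1
    coef, *_ = np.linalg.lstsq(X[:, :j], X[:, j], rcond=None)
    resid = X[:, j] - X[:, :j] @ coef
    Tm[j, :j] = -coef
    D[j] = resid.var()

# Since Tm @ X' has uncorrelated rows with variances D, Sigma = Tm^{-1} D Tm^{-T}
Tinv = np.linalg.inv(Tm)
Sigma_hat = Tinv @ np.diag(D) @ Tinv.T     # positive definite by construction
```

The key point is that D has strictly positive entries whatever smoother or shrinkage is applied to the regression coefficients, so positive definiteness never has to be imposed as a constraint.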