Results 1-10 of 147
CONDENSATION - conditional density propagation for visual tracking
 International Journal of Computer Vision
, 1998
Abstract

Cited by 1499 (12 self)
The problem of tracking curves in dense visual clutter is challenging. Kalman filtering is inadequate because it is based on Gaussian densities which, being unimodal, cannot represent simultaneous alternative hypotheses. The Condensation algorithm uses "factored sampling", previously applied to the interpretation of static images, in which the probability distribution of possible interpretations is represented by a randomly generated set. Condensation uses learned dynamical models, together with visual observations, to propagate the random set over time. The result is highly robust tracking of agile motion. Notwithstanding the use of stochastic methods, the algorithm runs in near real-time.
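The factored-sampling loop described in this abstract can be sketched as a minimal bootstrap-style particle filter. The 1-D random-walk dynamics, the bimodal likelihood with a clutter mode at the origin, and all parameter values below are illustrative assumptions, not the paper's curve-tracking model:

```python
import random, math

def condensation_step(particles, dynamics, likelihood, z):
    """One factored-sampling iteration over a weighted particle set
    approximating p(x_{t-1} | z_{1:t-1})."""
    states = [s for s, w in particles]
    weights = [w for s, w in particles]
    # 1. Select: resample with replacement in proportion to the old weights.
    ancestors = random.choices(states, weights=weights, k=len(particles))
    # 2. Predict: push each sample through the stochastic dynamical model.
    predicted = [dynamics(s) for s in ancestors]
    # 3. Measure: reweight by the observation likelihood and normalise.
    w = [likelihood(z, s) for s in predicted]
    total = sum(w) or 1.0
    return [(s, wi / total) for s, wi in zip(predicted, w)]

# Toy model: 1-D random-walk dynamics and a bimodal likelihood with a
# clutter mode at the origin -- the multimodality a Kalman filter cannot keep.
random.seed(0)
dyn = lambda x: x + random.gauss(0.0, 0.5)
lik = lambda z, x: math.exp(-0.5 * (z - x) ** 2) + 0.3 * math.exp(-0.5 * x ** 2)
cloud = [(random.gauss(0.0, 2.0), 1.0 / 200) for _ in range(200)]
for z in (1.0, 1.5, 2.0, 2.2):
    cloud = condensation_step(cloud, dyn, lik, z)
mean = sum(s * w for s, w in cloud)
```

Because the density is carried by samples rather than a single Gaussian, both the true-target mode and the clutter mode survive each iteration until the data resolves them.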
Wavelet-based statistical signal processing using hidden Markov models
 IEEE TRANSACTIONS ON SIGNAL PROCESSING
, 1998
Abstract

Cited by 417 (55 self)
Wavelet-based statistical signal processing techniques such as denoising and detection typically model the wavelet coefficients as independent or jointly Gaussian. These models are unrealistic for many real-world signals. In this paper, we develop a new framework for statistical signal processing based on wavelet-domain hidden Markov models (HMMs) that concisely models the statistical dependencies and non-Gaussian statistics encountered in real-world signals. Wavelet-domain HMMs are designed with the intrinsic properties of the wavelet transform in mind and provide powerful, yet tractable, probabilistic signal models. Efficient expectation-maximization algorithms are developed for fitting the HMMs to observational signal data. The new framework is suitable for a wide range of applications, including signal estimation, detection, classification, prediction, and even synthesis. To demonstrate the utility of wavelet-domain HMMs, we develop novel algorithms for signal denoising, classification, and detection.
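For context, the classical independent-coefficient approach the abstract says is unrealistic can be sketched as Haar-wavelet hard thresholding: each detail coefficient is kept or zeroed on its own, with no cross-scale dependence. This is a baseline sketch, not the wavelet-domain HMM itself; the transform and the threshold rule are standard, the signal and threshold are illustrative:

```python
import math

def haar_dwt(x):
    """Full Haar decomposition of a length-2^J signal.
    Returns the final approximation (length 1) and a list of detail
    levels, finest first."""
    details, a = [], list(x)
    s2 = math.sqrt(2.0)
    while len(a) > 1:
        details.append([(a[i] - a[i + 1]) / s2 for i in range(0, len(a), 2)])
        a = [(a[i] + a[i + 1]) / s2 for i in range(0, len(a), 2)]
    return a, details

def haar_idwt(a, details):
    """Invert haar_dwt: rebuild from coarsest to finest level."""
    a, s2 = list(a), math.sqrt(2.0)
    for detail in reversed(details):
        nxt = []
        for ai, di in zip(a, detail):
            nxt += [(ai + di) / s2, (ai - di) / s2]
        a = nxt
    return a

def denoise_hard(x, thresh):
    """Independent per-coefficient hard thresholding: the i.i.d.
    coefficient model that wavelet-domain HMMs replace."""
    a, details = haar_dwt(x)
    kept = [[d if abs(d) > thresh else 0.0 for d in lvl] for lvl in details]
    return haar_idwt(a, kept)

x = [4.0, 2.0, 5.0, 7.0, 1.0, 3.0, 8.0, 6.0]
rec = haar_idwt(*haar_dwt(x))
```

The HMM framework instead ties the keep/kill decision for a coefficient to hidden states of its neighbors across scale, which this per-coefficient rule cannot express.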
The Gaussian mixture probability hypothesis density filter
 IEEE Trans. SP
, 2006
Abstract

Cited by 140 (14 self)
A new recursive algorithm is proposed for jointly estimating the time-varying number of targets and their states from a sequence of observation sets in the presence of data-association uncertainty, detection uncertainty, noise and false alarms. The approach involves modelling the respective collections of targets and measurements as random finite sets and applying the probability hypothesis density (PHD) recursion to propagate the posterior intensity, which is a first-order statistic of the random finite set of targets, in time. At present, there is no closed-form solution to the PHD recursion. This work shows that under linear, Gaussian assumptions on the target dynamics and birth process, the posterior intensity at any time step is a Gaussian mixture. More importantly, closed-form recursions for propagating the means, covariances and weights of the constituent Gaussian components of the posterior intensity are derived. The proposed algorithm combines these recursions with a strategy for managing the number of Gaussian components to increase efficiency. This algorithm is extended to accommodate mildly nonlinear target dynamics using approximation strategies from the extended and unscented Kalman filters.
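A scalar sketch of the closed-form recursion the abstract describes, under linear-Gaussian assumptions: every Gaussian component of the intensity is predicted, copied once for a missed detection, and Kalman-updated once per measurement. All model parameters (survival and detection probabilities, clutter level, birth component) are illustrative defaults, and the component pruning/merging strategy is omitted:

```python
import math

def gmphd_step(mix, measurements, f=1.0, q=0.1, h=1.0, r=0.2,
               p_s=0.99, p_d=0.9, kappa=0.1, birth=((0.1, 0.0, 1.0),)):
    """One closed-form PHD recursion on a 1-D Gaussian-mixture intensity.
    `mix` and the return value are lists of (weight, mean, variance)."""
    # Prediction: thin surviving components through x' = f x + noise, add births.
    pred = [(p_s * w, f * m, f * f * P + q) for (w, m, P) in mix]
    pred += list(birth)
    # Update part 1: missed-detection copies of every predicted component.
    out = [((1.0 - p_d) * w, m, P) for (w, m, P) in pred]
    # Update part 2: one Kalman-updated component per (measurement, component).
    for z in measurements:
        comps = []
        for (w, m, P) in pred:
            S = h * h * P + r                 # innovation variance
            K = P * h / S                     # Kalman gain
            q_z = math.exp(-0.5 * (z - h * m) ** 2 / S) / math.sqrt(2 * math.pi * S)
            comps.append((p_d * w * q_z, m + K * (z - h * m), (1.0 - K * h) * P))
        norm = kappa + sum(c[0] for c in comps)   # kappa: clutter intensity at z
        out += [(w / norm, m, P) for (w, m, P) in comps]
    return out

# The expected target count is the total weight of the returned intensity.
intensity = gmphd_step([(1.0, 0.0, 0.5)], [0.05])
expected_targets = sum(w for (w, _, _) in intensity)
```

Without pruning, the component count grows multiplicatively per scan, which is exactly why the paper pairs the recursion with a component-management strategy.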
Bayesian compressive sensing via belief propagation
 IEEE Trans. Signal Processing
, 2010
Abstract

Cited by 129 (19 self)
Compressive sensing (CS) is an emerging field based on the revelation that a small collection of linear projections of a sparse signal contains enough information for stable, sub-Nyquist signal acquisition. When a statistical characterization of the signal is available, Bayesian inference can complement conventional CS methods based on linear programming or greedy algorithms. We perform approximate Bayesian inference using belief propagation (BP) decoding, which represents the CS encoding matrix as a graphical model. Fast encoding and decoding are provided using sparse encoding matrices, which also improve BP convergence by reducing the presence of loops in the graph. To decode a length-N signal containing K large coefficients, our CS-BP decoding algorithm uses O(K log(N)) measurements and O(N log^2(N)) computation. Finally, sparse encoding matrices and the CS-BP decoding algorithm can be modified to support a variety of signal models and measurement noise.
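The encoding side can be sketched as follows: a sparse matrix with few nonzero entries per row (here drawn from {-1, +1}) plays the role of the graphical model's edge set, and the measurement count scales as O(K log N). The row weight, the constant 3 in the measurement count, and the toy signal are illustrative assumptions; the BP decoder itself is not shown:

```python
import random, math

def sparse_cs_matrix(m, n, row_weight=8, seed=1):
    """LDPC-like measurement matrix: each row has `row_weight` nonzero
    entries from {-1, +1}. Sparsity keeps the factor graph loosely
    connected, which helps belief-propagation decoding converge."""
    rng = random.Random(seed)
    rows = []
    for _ in range(m):
        cols = rng.sample(range(n), row_weight)
        rows.append({c: rng.choice((-1, 1)) for c in cols})
    return rows

def measure(rows, x):
    """y = Phi x, exploiting the sparse row representation."""
    return [sum(s * x[c] for c, s in r.items()) for r in rows]

n, k = 256, 4                              # signal length, number of spikes
m = 3 * k * math.ceil(math.log(n))         # O(K log N) measurements (illustrative constant)
phi = sparse_cs_matrix(m, n)
x = [0.0] * n
for idx, val in [(3, 1.0), (40, -2.0), (100, 0.5), (200, 1.5)]:
    x[idx] = val
y = measure(phi, x)
```

Each measurement touches only `row_weight` signal entries, so encoding costs O(m · row_weight) rather than O(m · n) for a dense Gaussian matrix.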
Bayesian Forecasting
, 1996
Abstract

Cited by 95 (2 self)
... extrapolation techniques, especially exponential smoothing and exponentially weighted moving average methods ([20, 71]). Developments of smoothing and discounting techniques in stock control and production planning areas led to formalisms in terms of linear, state-space models for time series with time-varying trends and seasonal patterns, and eventually to the associated Bayesian formalism of methods of inference and prediction. From the early 1960s, practical Bayesian forecasting systems in this context involved the combination of formal time series models and historical data analysis together with methods for subjective intervention and forecast monitoring, so that complete forecasting systems, rather than just routine and automatic data analysis and extrapolation, were in use at that time ([19, 22]). Methods developed in those early days are still in use now in some companies in sales forecasting and stock control areas. There have been major developments in models and methods since t ...
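The exponentially weighted moving average method named at the start of this excerpt can be sketched in a few lines; the smoothing constant below is an illustrative choice, not one from the source:

```python
def ewma_forecast(series, alpha=0.3):
    """Exponentially weighted moving average (simple exponential smoothing):
    level <- level + alpha * (observation - level).
    Returns the one-step-ahead forecasts and the forecast of the next,
    as-yet-unseen observation."""
    level = series[0]
    preds = []
    for y in series[1:]:
        preds.append(level)            # forecast of y, made before observing it
        level += alpha * (y - level)
    return preds, level

preds, nxt = ewma_forecast([5.0, 5.0, 5.0])
```

The recursion is the constant-level special case of the linear state-space models the excerpt goes on to describe, with `alpha` playing the role of a fixed discount on old data.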
Distributional assumptions of growth mixture models: Implications for overextraction of latent trajectory classes
 Psychological Methods
, 2003
Abstract

Cited by 87 (8 self)
Growth mixture models are often used to determine if subgroups exist within the population that follow qualitatively distinct developmental trajectories. However, statistical theory developed for finite normal mixture models suggests that latent trajectory classes can be estimated even in the absence of population heterogeneity if the distribution of the repeated measures is non-normal. By drawing on this theory, this article demonstrates that multiple trajectory classes can be estimated and appear optimal for non-normal data even when only 1 group exists in the population. Further, the within-class parameter estimates obtained from these models are largely uninterpretable. Significant predictive relationships may be obscured or spurious relationships identified. The implications of these results for applied research are highlighted, and future directions for quantitative developments are suggested. Over the last decade, random coefficient growth modeling has become a centerpiece of longitudinal data analysis. These models have been adopted enthusiastically by applied psychological researchers in part because they provide a more dynamic analysis of repeated measures data than do many traditional techniques. However, these methods are not ideally suited for testing theories that posit the existence of qualitatively different developmental pathways, that is, theories in which distinct developmental pathways are thought to hold within subpopulations. One widely cited theory of this type is Moffitt’s (1993) distinction between “life-course persistent” and “adolescent-limited” antisocial behavior trajectories. Moffitt’s theory is prototypical of other developmental taxonomies that have been proposed in such diverse areas as developmental psychopathology (Schulenberg, ...
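The overextraction phenomenon can be illustrated in one dimension: data from a single skewed population are fit with one- versus two-component normal mixtures, and the two-class model fits better purely because the data are non-normal. The EM routine, lognormal data-generating choice, and sample size below are illustrative, not the article's growth-model simulation:

```python
import random, math

def normal_pdf(x, m, v):
    return math.exp(-0.5 * (x - m) ** 2 / v) / math.sqrt(2.0 * math.pi * v)

def em_gmm_1d(data, k, iters=60, seed=0):
    """Fit a k-component univariate normal mixture by EM; return the
    final log-likelihood and the mixing weights."""
    rng = random.Random(seed)
    n = len(data)
    grand_mean = sum(data) / n
    mu = rng.sample(data, k)                        # means start at data points
    var = [sum((x - grand_mean) ** 2 for x in data) / n] * k
    w = [1.0 / k] * k
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point.
        resp = []
        for x in data:
            p = [w[j] * normal_pdf(x, mu[j], var[j]) for j in range(k)]
            s = sum(p) or 1e-300
            resp.append([pj / s for pj in p])
        # M-step: weighted means, variances, and mixing proportions.
        for j in range(k):
            nj = sum(r[j] for r in resp) or 1e-12
            mu[j] = sum(r[j] * x for r, x in zip(resp, data)) / nj
            var[j] = max(sum(r[j] * (x - mu[j]) ** 2
                             for r, x in zip(resp, data)) / nj, 1e-6)
            w[j] = nj / n
    ll = sum(math.log(sum(w[j] * normal_pdf(x, mu[j], var[j])
                          for j in range(k)) or 1e-300) for x in data)
    return ll, w

# A single skewed (lognormal) population: no true subgroups exist.
rng = random.Random(42)
data = [math.exp(rng.gauss(0.0, 0.8)) for _ in range(500)]
ll1, _ = em_gmm_1d(data, 1)
ll2, _ = em_gmm_1d(data, 2)
# The 2-class model fits the skew better, so information criteria such as
# BIC = -2 ll + p log n can favor a spurious second "trajectory class".
```

The extra class here is an artifact of skewness, not of genuine population heterogeneity, which is the article's central warning.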
Gaussian sum particle filtering
 Signal Processing 51
, 2003
Abstract

Cited by 70 (3 self)
In this paper, we use the Gaussian particle filter introduced in a companion paper to build several types of Gaussian sum particle filters. These filters approximate the filtering and predictive distributions by weighted Gaussian mixtures and are basically banks of Gaussian particle filters. Then, we extend the use of Gaussian particle filters and Gaussian sum particle filters to dynamic state-space (DSS) models with non-Gaussian noise. With non-Gaussian noise approximated by Gaussian mixtures, the non-Gaussian noise models are approximated by banks of Gaussian noise models, and Gaussian mixture filters are developed using algorithms developed for Gaussian noise DSS models. As a result, problems involving heavy-tailed densities can be conveniently addressed. Simulations are presented to exhibit the application of the framework developed herein, and the performance of the algorithms is examined.
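The bank-of-filters idea behind this abstract can be sketched with a plain Gaussian sum (Kalman-bank) update rather than the particle variant: heavy-tailed measurement noise is approximated by a two-component Gaussian mixture, and one Kalman update is run per (state component, noise component) pair. The scalar model and the mixture weights are illustrative assumptions:

```python
import math

def gs_predict(mix, f=1.0, q=0.1):
    """Time update of each Gaussian component through x' = f x + noise."""
    return [(w, f * m, f * f * P + q) for (w, m, P) in mix]

def gs_update(mix, z, h=1.0, noise_mix=((0.9, 0.05), (0.1, 1.0))):
    """Measurement update when the measurement noise is itself a Gaussian
    mixture (approximating a heavy-tailed density): run one Kalman update
    per (state component, noise component) pair and reweight by evidence.
    `mix`: (weight, mean, variance) triples; `noise_mix`: (weight, variance)."""
    post = []
    for (w, m, P) in mix:
        for (wn, r) in noise_mix:
            S = h * h * P + r                 # innovation variance
            K = P * h / S                     # Kalman gain
            ev = math.exp(-0.5 * (z - h * m) ** 2 / S) / math.sqrt(2 * math.pi * S)
            post.append((w * wn * ev, m + K * (z - h * m), (1.0 - K * h) * P))
    total = sum(c[0] for c in post) or 1.0
    return [(w / total, m, P) for (w, m, P) in post]

post = gs_update(gs_predict([(1.0, 0.0, 0.9)]), 0.2)
post_mean = sum(w * m for (w, m, P) in post)
```

The broad-variance noise component discounts outlying measurements, which is how the mixture approximation handles heavy tails; in the paper each Gaussian component is itself carried by a particle filter.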
Compressed Sensing Reconstruction via Belief Propagation
, 2006
Abstract

Cited by 60 (9 self)
Compressed sensing is an emerging field that enables the reconstruction of sparse or compressible signals from a small number of linear projections. We describe a specific measurement scheme using an LDPC-like measurement matrix, which is a real-valued analogue to LDPC techniques over a finite alphabet. We then describe the reconstruction details for mixture-Gaussian signals. The technique can be extended to additional compressible signal models.
A hybrid bootstrap filter for target tracking in clutter
 IEEE Trans on Aerospace and Electronic Systems
, 1997
Sequential Monte Carlo Filters and Integrated Navigation
, 2002
Abstract

Cited by 46 (3 self)
In this thesis we consider recursive Bayesian estimation in general, and sequential Monte Carlo filters in particular, applied to integrated navigation. Based on a large number of simulations of the model, the sequential Monte Carlo filter, also referred to as the particle filter, provides an empirical estimate of the full posterior probability density of the system. The particle filter provides a solution to the general nonlinear, non-Gaussian filtering problem. The more nonlinear the system, or the more non-Gaussian the noise, the greater the potential of particle filters. Although very promising even for high-dimensional systems, sequential Monte Carlo methods suffer from being more or less computationally intensive. However, many systems can be divided into two parts, where the first part is nonlinear and the second is (almost) linear conditionally upon the first. By applying the particle filter only to the severely nonlinear part of lower dimension, the computational load can be significantly reduced. For the remaining conditionally (almost) linear part we apply (linearized) linear filters, such as the (extended) Kalman filter. From a ...
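The partitioning described here (particles for the nonlinear substate, a Kalman filter per particle for the conditionally linear substate) is a marginalized, or Rao-Blackwellized, particle filter. A minimal sketch on an invented toy model, not the thesis's navigation model, with all noise levels illustrative:

```python
import random, math

def rbpf_step(particles, z, q_theta=0.1, q_b=0.01, r=0.05, rng=random):
    """One marginalized (Rao-Blackwellized) particle-filter step for the
    illustrative model
        theta_t = theta_{t-1} + v,   b_t = b_{t-1} + w,
        z_t     = sin(theta_t) + b_t + e.
    Particles carry the nonlinear state theta; each also carries a Kalman
    filter (mean m, variance P) for the conditionally linear bias b."""
    moved, weights = [], []
    for (theta, m, P, w) in particles:
        theta = theta + rng.gauss(0.0, math.sqrt(q_theta))   # particle prediction
        P = P + q_b                                          # Kalman prediction
        S = P + r                                            # innovation variance
        innov = z - math.sin(theta) - m
        ev = math.exp(-0.5 * innov ** 2 / S) / math.sqrt(2 * math.pi * S)
        K = P / S
        moved.append((theta, m + K * innov, (1.0 - K) * P))
        weights.append(w * ev)
    total = sum(weights) or 1.0
    weights = [wi / total for wi in weights]
    # Multinomial resampling to counteract weight degeneracy.
    idx = rng.choices(range(len(moved)), weights=weights, k=len(moved))
    n = len(moved)
    return [(moved[i][0], moved[i][1], moved[i][2], 1.0 / n) for i in idx]

# Track a constant observation z = sin(0.5) + 0.2 (about 0.68).
rng = random.Random(7)
particles = [(rng.gauss(0.0, 1.0), 0.0, 1.0, 1.0 / 300) for _ in range(300)]
for _ in range(5):
    particles = rbpf_step(particles, 0.68, rng=rng)
est = sum(math.sin(th) + m for (th, m, P, w) in particles) / len(particles)
```

Only the scalar theta is sampled; the bias is handled analytically per particle, which is exactly the dimension reduction the thesis exploits for the (almost) linear part.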