Results 1 - 10 of 11
Variational inference for visual tracking
- in Conf. Computer Vision and Pattern Recognition (CVPR'03)
, 2003
Cited by 27 (1 self)
The likelihood models used in probabilistic visual tracking applications are often complex non-linear and/or non-Gaussian functions, leading to analytically intractable inference. Solutions then require numerical approximation techniques, of which the particle filter is a popular choice. Particle filters, however, degrade in performance as the dimensionality of the state space increases and the support of the likelihood decreases. As an alternative to particle filters this paper introduces a variational approximation to the tracking recursion. The variational inference is intractable in itself, and is combined with an efficient importance sampling procedure to obtain the required estimates. The algorithm is shown to compare favourably with particle filtering techniques on a synthetic example and two real tracking problems. The first involves the tracking of a designated object in a video sequence based on its colour properties, whereas the second involves contour extraction in a single image.
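The bootstrap particle filter that the abstract above takes as its baseline can be sketched on a toy model. This is a minimal illustration, not the paper's tracker: the 1-D linear dynamics, the Cauchy-shaped likelihood, and all constants are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter(observations, n_particles=500):
    """Bootstrap particle filter for a toy 1-D model:
    x_t = 0.9 * x_{t-1} + process noise, with a heavy-tailed
    (Cauchy-shaped) observation likelihood.
    Returns the posterior-mean estimate of x_t at each step."""
    particles = rng.normal(0.0, 1.0, n_particles)
    weights = np.full(n_particles, 1.0 / n_particles)
    estimates = []
    for y in observations:
        # propagate particles through the dynamics (prior proposal)
        particles = 0.9 * particles + rng.normal(0.0, 0.5, n_particles)
        # reweight by the non-Gaussian likelihood of the observation
        weights *= 1.0 / (1.0 + (y - particles) ** 2)
        weights /= weights.sum()
        estimates.append(float(np.sum(weights * particles)))
        # resample when the effective sample size collapses
        ess = 1.0 / np.sum(weights ** 2)
        if ess < n_particles / 2:
            idx = rng.choice(n_particles, n_particles, p=weights)
            particles = particles[idx]
            weights.fill(1.0 / n_particles)
    return estimates
```

The degradation the abstract mentions shows up here as the effective sample size collapsing when the likelihood is narrow relative to the proposal.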
Cutset sampling for Bayesian networks
- Journal of Artificial Intelligence Research
Cited by 23 (7 self)
The paper presents a new sampling methodology for Bayesian networks that samples only a subset of variables and applies exact inference to the rest. Cutset sampling is a network structure-exploiting application of the Rao-Blackwellisation principle to sampling in Bayesian networks. It improves convergence by exploiting memory-based inference algorithms. It can also be viewed as an anytime approximation of the exact cutset-conditioning algorithm developed by Pearl. Cutset sampling can be implemented efficiently when the sampled variables constitute a loop-cutset of the Bayesian network and, more generally, when the induced width of the network’s graph conditioned on the observed sampled variables is bounded by a constant w. We demonstrate empirically the benefit of this scheme on a range of benchmarks.
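The idea in the abstract above — sample only a cutset and apply exact inference to the rest — can be sketched on a toy three-variable network. To keep the sketch short it uses Rao-Blackwellised forward sampling rather than the paper's Gibbs scheme; the network, its CPTs, and the target query are all invented for illustration.

```python
import random

random.seed(0)

# Toy network: A -> B, A -> C, B -> C (the arcs through A form a loop).
# Conditioning on the cutset {A} leaves a singly connected remainder.
P_A = 0.3                            # P(A=1)
P_B = {0: 0.2, 1: 0.7}               # P(B=1 | A)
P_C = {(0, 0): 0.1, (0, 1): 0.5,     # P(C=1 | A, B)
       (1, 0): 0.4, (1, 1): 0.9}

def exact_p_c_given_a(a):
    """Exact inference: marginalise the non-cutset variable B."""
    pb1 = P_B[a]
    return (1 - pb1) * P_C[(a, 0)] + pb1 * P_C[(a, 1)]

def cutset_sample_p_c(n_samples=20000):
    """Estimate P(C=1): sample only the cutset variable A, and
    average the exactly computed P(C=1 | A) over the samples.
    Rao-Blackwellisation lowers the variance versus sampling C too."""
    total = 0.0
    for _ in range(n_samples):
        a = 1 if random.random() < P_A else 0
        total += exact_p_c_given_a(a)
    return total / n_samples
```

The exact answer for this toy network is 0.7 * 0.18 + 0.3 * 0.75 = 0.351, which the estimator approaches as the sample count grows.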
A Machine Learning Perspective on Predictive Coding with PAQ8 and New Applications
, 2011
Cited by 2 (0 self)
The goal of this thesis is to describe a state-of-the-art compression method called PAQ8 from the perspective of machine learning. We show both how PAQ8 makes use of several simple, well known machine learning models and algorithms, and how it can be improved by exchanging these components for more sophisticated models and algorithms. We also present a broad range of new applications of PAQ8 to machine learning tasks including language modeling and adaptive text prediction, adaptive game playing, classification, and lossy compression using features from the field of deep learning.
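The sequential, adaptive prediction at the heart of compressors such as PAQ8 can be illustrated with a single context model. PAQ8 itself mixes the outputs of many models; this sketch uses one order-1 count-based model with Laplace smoothing, all invented for the example.

```python
import math

def adaptive_code_length(text):
    """Estimate the code length in bits that an order-1 adaptive
    model would assign to `text` via sequential probability
    assignment (predict, score, then update the counts)."""
    counts = {}                       # context char -> {next char: count}
    alphabet_size = len(set(text))
    bits = 0.0
    prev = ''                         # empty context for the first char
    for ch in text:
        ctx = counts.setdefault(prev, {})
        # Laplace-smoothed predictive probability of the next char
        total = sum(ctx.values()) + alphabet_size
        p = (ctx.get(ch, 0) + 1) / total
        bits += -math.log2(p)         # ideal arithmetic-coding cost
        ctx[ch] = ctx.get(ch, 0) + 1  # adapt after seeing the char
        prev = ch
    return bits
```

As expected for an adaptive model, highly repetitive input costs far fewer bits than input with a large, slowly repeating alphabet.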
An entropy-based measurement of certainty in Rao-Blackwellized particle filter mapping
- in IROS
, 2006
Cited by 1 (0 self)
In Bayesian approaches to mobile robot simultaneous localization and mapping (SLAM), Rao-Blackwellized particle filters (RBPF) enable the efficient estimation of the posterior belief over robot poses and the map. These particle filters have recently been adopted by many exploration approaches, for which a central issue is measuring the certainty inherent in a given estimate in order to select robot actions that increase it. In this paper we propose a new certainty measurement grounded in information theory that unifies the two kinds of uncertainty intrinsic to SLAM: in the robot pose and in the map content. Most previous works have considered only one of them, or a weighted average. Our method combines them more appropriately by first building an expected map (EM), which condenses all the current map hypotheses, and then computing its mean information (MI), an entropy-derived measurement that quantifies the inconsistencies in the EM. Experimental results comparing our method (EMMI) with others verify its correctness and its better behavior in detecting the decrease in certainty when the robot enters unexplored areas and its increase after closing a loop.
Index terms – Mobile robots, SLAM, particle filters, information theory, probabilistic mapping.
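A simplified reading of the expected-map/mean-information construction described above can be sketched for a 1-D occupancy grid. The grid representation, the cell-wise fusion, and the entropy measure below are assumptions for illustration, not the paper's implementation.

```python
import math

def expected_map(maps, weights):
    """Fuse per-particle occupancy grids into one expected map:
    each cell is the particle-weight-averaged occupancy probability."""
    n_cells = len(maps[0])
    return [sum(w * m[i] for m, w in zip(maps, weights))
            for i in range(n_cells)]

def mean_information(em):
    """Mean per-cell binary entropy of the expected map: high when
    the particle set disagrees (cells pulled toward 0.5), low when
    all hypotheses are consistent (cells near 0 or 1)."""
    def h(p):
        if p <= 0.0 or p >= 1.0:
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)
    return sum(h(p) for p in em) / len(em)
```

Two particles that agree cell-for-cell yield a mean information of 0, while two that conflict everywhere push every expected cell to 0.5 and the measure to its maximum of 1 bit.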
Bayesian Modelling of Shared Gene Function
- Bioinformatics, Advance Access published May 31, 2007
Motivation: Biological assays are often carried out on tissues that contain many cell lineages and active pathways. Microarray data produced using such material therefore reflect superimpositions of biological processes. Analysing such data for shared gene function by means of well-matched assays may help to provide a better focus on specific cell types and processes. The identification of genes that behave similarly in different biological systems also has the potential to reveal new insights into preserved biological mechanisms.
Results: In this paper we propose a hierarchical Bayesian model allowing integrated analysis of several microarray data sets for shared gene function. Each transcript is associated with an indicator variable that selects whether binary class labels are predicted from expression values or by a classifier which is common to all transcripts. Each indicator selects the component models for all involved data sets simultaneously. A quantitative measure of shared gene function is obtained by inferring a probability measure over these indicators. Through experiments on synthetic data we illustrate potential advantages of this Bayesian approach over a standard method. A shared analysis of matched microarray experiments covering (a) a cycle of mouse mammary gland development and (b) the process of endothelial cell apoptosis is proposed as a biological gold standard. Several useful sanity checks are introduced during data analysis and we confirm the prior biological belief that shared apoptosis events occur in both systems. We conclude that a Bayesian analysis for shared gene function has the potential to reveal new biological insights, unobtainable by other means.
Availability: An online supplement and MatLab code are available at
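The indicator-variable idea in the abstract above — deciding whether data sets share one mechanism by comparing model evidences — can be sketched with a toy Beta-binomial model. The likelihoods, priors, and binary data below are invented for illustration and are far simpler than the paper's hierarchical model.

```python
from math import lgamma, exp

def beta_binomial_ml(k, n, a=1.0, b=1.0):
    """Marginal likelihood (evidence) of k successes in n Bernoulli
    trials under a Beta(a, b) prior on the success probability."""
    def lbeta(x, y):
        return lgamma(x) + lgamma(y) - lgamma(x + y)
    return exp(lbeta(a + k, b + n - k) - lbeta(a, b))

def p_shared(data_a, data_b, prior_shared=0.5):
    """Posterior probability of the indicator 'shared mechanism':
    compare the evidence of one pooled model against the product of
    two independent per-data-set models."""
    ka, na = sum(data_a), len(data_a)
    kb, nb = sum(data_b), len(data_b)
    ev_shared = beta_binomial_ml(ka + kb, na + nb)
    ev_indep = beta_binomial_ml(ka, na) * beta_binomial_ml(kb, nb)
    num = prior_shared * ev_shared
    return num / (num + (1 - prior_shared) * ev_indep)
```

Data sets that behave identically drive the indicator posterior toward "shared", while opposite behaviour drives it toward "independent".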
Cutset sampling for Bayesian networks
The paper presents a new sampling methodology for Bayesian networks that samples only a subset of variables and applies exact inference to the rest. Cutset sampling is a network structure-exploiting application of the Rao-Blackwellisation principle to sampling in Bayesian networks. It improves convergence by exploiting memory-based inference algorithms. It can also be viewed as an anytime approximation of the exact cutset-conditioning algorithm (Pearl, 1988). Cutset sampling can be implemented efficiently when the sampled variables constitute a loop-cutset of the Bayesian network and, more generally, when the induced width of the network’s graph conditioned on the observed sampled variables is bounded by a constant w. We demonstrate empirically the benefit of this scheme on a range of benchmarks.
Cutset Sampling with Likelihood Weighting
The paper extends the principle of cutset sampling over Bayesian networks, presented previously for Gibbs sampling, to likelihood weighting (LW). Cutset sampling is motivated by the Rao-Blackwell theorem, which implies that sampling over a subset of variables requires fewer samples for convergence due to the reduction in sampling variance. The scheme exploits the network structure in selecting cutsets that allow efficient computation of the sampling distributions. In particular, as we show empirically, likelihood weighting over a loop-cutset (abbreviated LWLC) is time-wise cost-effective. We also provide an effective way of caching the probabilities of the generated samples, which improves the performance of the overall scheme. We compare LWLC against regular likelihood weighting and against Gibbs-based cutset sampling.
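Likelihood weighting over a cutset, as described above, can be sketched on a toy network: sample only the cutset variable from its prior, and weight each sample by the exact evidence likelihood obtained by marginalising the remaining variable. The network, CPTs, and query below are invented for the example and are not the paper's LWLC implementation.

```python
import random

random.seed(1)

# Toy network: A -> B, A -> C, B -> C; cutset {A}, evidence on C.
P_A = 0.3                            # P(A=1)
P_B = {0: 0.2, 1: 0.7}               # P(B=1 | A)
P_C = {(0, 0): 0.1, (0, 1): 0.5,     # P(C=1 | A, B)
       (1, 0): 0.4, (1, 1): 0.9}

def lik_c_given_a(a, c):
    """Exact evidence likelihood P(C=c | A=a), marginalising the
    non-cutset variable B."""
    pb1 = P_B[a]
    pc1 = (1 - pb1) * P_C[(a, 0)] + pb1 * P_C[(a, 1)]
    return pc1 if c == 1 else 1.0 - pc1

def lw_cutset_posterior_a(c_obs=1, n_samples=20000):
    """Estimate P(A=1 | C=c_obs) by likelihood weighting over the
    cutset {A}: sample A from its prior, weight by the exact
    likelihood of the evidence."""
    num = den = 0.0
    for _ in range(n_samples):
        a = 1 if random.random() < P_A else 0
        w = lik_c_given_a(a, c_obs)
        num += w * a
        den += w
    return num / den
```

For this toy network the exact posterior is 0.3 * 0.75 / 0.351 ≈ 0.641, which the weighted estimate converges to; weighting by an exactly computed likelihood, rather than sampling B as well, is what lowers the variance.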
Bayesian modelling of shared gene function
- Bioinformatics (Gene expression), doi:10.1093/bioinformatics/btm280
Motivation: Biological assays are often carried out on tissues that contain many cell lineages and active pathways. Microarray data produced using such material therefore reflect superimpositions of biological processes. Analysing such data for shared gene function by means of well-matched assays may help to provide a better focus on specific cell types and processes. The identification of genes that behave similarly in different biological systems also has the potential to reveal new insights into preserved biological mechanisms.
Results: In this article, we propose a hierarchical Bayesian model allowing integrated analysis of several microarray data sets for shared gene function. Each gene is associated with an indicator variable that selects whether binary class labels are predicted from expression values or by a classifier which is common to all genes. Each indicator selects the component models for all involved data sets simultaneously.