Results 1 – 10 of 4,549
Decisiveness in Loopy Propagation
Abstract
When Pearl’s algorithm for reasoning with singly connected Bayesian networks is applied to a network with loops, the algorithm is no longer guaranteed to result in exact probabilities. We identify the two types of error that can arise in the probabilities yielded by the algorithm: the cycling error and the convergence error. We then focus on the cycling error and analyse its effect on the decisiveness of the approximations that are computed for the inner nodes of simple loops. More specifically, we detail the factors that induce the cycling error to push the exact probabilities towards over- or under-confident approximations.
On the Convergence Error in Loopy Propagation
Abstract
Reasoning with a Bayesian network amounts to computing probability distributions for the network’s variables. Pearl’s propagation algorithm provides for efficient reasoning with singly connected networks. When applied to a multiply connected network, the algorithm no longer yields exact probabilities ...
Loopy Propagation: the Convergence Error in Markov Networks
Abstract
Loopy propagation provides for approximate reasoning with Bayesian networks. In previous research, we distinguished between two different types of error in the probabilities yielded by the algorithm: the cycling error and the convergence error. Other researchers analysed an equivalent algorithm for ...
Enhancing Demosaicking Algorithms using Loopy Propagation
Abstract
Cited by 1 (0 self)
Consumer-level digital cameras observe a single value at each pixel. The remaining two channels of a three-channel image are reconstructed through a process called demosaicking. This paper describes a methodology for enhancing current demosaicking methods. Using an iterative relaxation approach from the probabilistic AI literature, our empirical results show that we can improve the results of the standard algorithms using monitored successive application of those algorithms. We apply the new technique to several algorithms: hue-based interpolation, gradient-based interpolation, and adaptive colour plane interpolation; and we show a significant improvement in mean-squared error over both RGB and CIE colour spaces using each of these algorithms.
Loopy Belief Propagation for Approximate Inference: An Empirical Study
In Proceedings of Uncertainty in AI, 1999
Abstract
Cited by 680 (18 self)
Recently, researchers have demonstrated that "loopy belief propagation", the use of Pearl’s polytree algorithm in a Bayesian network with loops, can perform well in the context of error-correcting codes. The most dramatic instance of this is the near Shannon-limit performance ...
Generation of Random Bayesian Networks with Constraints on Induced Width, with Application to the Average Analysis of d-Connectivity, Quasi-random Sampling, and Loopy Propagation
In Proceedings of the 16th European Conference on Artificial Intelligence, 2004
Abstract
Cited by 29 (1 self)
We present algorithms for the generation of uniformly distributed Bayesian networks with constraints on induced width. The algorithms use ergodic Markov chains to generate samples, building upon previous algorithms by the authors. The introduction of constraints on induced width leads to more realistic results but requires new techniques. We discuss three applications of randomly generated networks: we study the average number of nodes d-connected to a query, the effectiveness of quasi-random samples in approximate inference, and the convergence of loopy propagation for non ...
Belief Propagation
2010
Abstract
Cited by 479 (11 self)
When a pair of nuclear-powered Russian submarines was reported patrolling off the eastern seaboard of the U.S. last summer, Pentagon officials expressed wariness over the Kremlin’s motivations. At the same time, these officials emphasized their confidence in the U.S. Navy’s tracking capabilities: “We’ve known where they were,” a senior Defense Department official told the New York Times, “and we’re not concerned about our ability to track the subs.” While the official did not divulge the methods used by the Navy to track submarines, the Times added that such ...
Loopy Belief Propagation and Gibbs Measures
In Uncertainty in Artificial Intelligence, 2002
Abstract
Cited by 103 (6 self)
We address the question of convergence in the loopy belief propagation (LBP) algorithm.
Dynamic Bayesian Networks: Representation, Inference and Learning
2002
Abstract
Cited by 758 (3 self)
Modelling sequential data is important in many areas of science and engineering. Hidden Markov models (HMMs) and Kalman filter models (KFMs) are popular for this because they are simple and flexible. For example, HMMs have been used for speech recognition and biosequence analysis, and KFMs have been ...
... ) space instead of O(T); a simple way of using the junction tree algorithm for online inference in DBNs; new complexity bounds on exact online inference in DBNs; a new deterministic approximate inference algorithm called factored frontier; an analysis of the relationship between the BK algorithm and loopy ...
Graphical models, exponential families, and variational inference
2008
Abstract
Cited by 800 (26 self)
The formalism of probabilistic graphical models provides a unifying framework for capturing complex dependencies among random variables, and building large-scale multivariate statistical models. Graphical models have become a focus of research in many statistical, computational and mathematical fields ...
... likelihoods, marginal probabilities and most probable configurations. We describe how a wide variety of algorithms (among them sum-product, cluster variational methods, expectation-propagation, mean field methods, max-product and linear programming relaxation, as well as conic programming relaxations) can ...