Results 11–20 of 132
Loopy belief propagation for approximate inference: An empirical study. In: Proceedings of Uncertainty in AI, 1999
"... Recently, researchers have demonstrated that "loopy belief propagation" (the use of Pearl's polytree algorithm in a Bayesian network with loops) can perform well in the context of error-correcting codes. The most dramatic instance of this is the near-Shannon-limit performanc ..."
Cited by 676 (15 self)
the convergence the more exact the approximation. • If the hidden nodes are binary, then thresholding the loopy beliefs is guaranteed to give the most probable assignment, even though the numerical value of the beliefs may be incorrect. This result only holds for nodes in the loop. In the max-product (or "
On the fixed points of the max-product algorithm, 2000
"... Graphical models, such as Bayesian networks and Markov random fields, represent statistical dependencies of variables by a graph. The max-product "belief propagation" algorithm is a local message-passing algorithm on this graph that is known to converge to a unique fixed point when the g ..."
Cited by 12 (1 self)
A comparative study of energy minimization methods for Markov random fields. In ECCV, 2006
"... One of the most exciting advances in early vision has been the development of efficient energy minimization algorithms. Many early vision tasks require labeling each pixel with some quantity such as depth or texture. While many such problems can be elegantly expressed in the language of Markov Ran ..."
Cited by 415 (36 self)
the solution quality and running time of several common energy minimization algorithms. We investigate three promising recent methods (graph cuts, LBP, and tree-reweighted message passing) as well as the well-known older iterated conditional modes (ICM) algorithm. Our benchmark problems are drawn from published
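The ICM baseline compared in this entry is simple enough to sketch: each node is greedily re-labeled to minimize its local energy until no label changes. The toy chain-MRF setup below (the `unary`/`pairwise` energy arrays and the `icm` function name are illustrative, not from the paper) shows the idea; ICM converges fast but only to a local minimum.

```python
import numpy as np

def icm(unary, pairwise, n_iters=20):
    """Iterated conditional modes on a toy chain MRF (illustrative sketch).

    unary:    (n, k) array of unary energies E_i(x_i)
    pairwise: (k, k) array, shared pairwise energy E(x_i, x_{i+1})
    Greedily re-labels one node at a time; stops at a local minimum.
    """
    n, k = unary.shape
    labels = unary.argmin(axis=1)  # initialize at per-node unary minima
    for _ in range(n_iters):
        changed = False
        for i in range(n):
            # energy of each candidate label at node i, neighbors held fixed
            e = unary[i].copy()
            if i > 0:
                e += pairwise[labels[i - 1], :]
            if i < n - 1:
                e += pairwise[:, labels[i + 1]]
            best = e.argmin()
            if best != labels[i]:
                labels[i] = best
                changed = True
        if not changed:
            break  # no node wants to change: local minimum
    return labels

# Potts smoothing on a 3-node chain: the middle node's weak unary
# preference for label 1 is overridden by smoothness with its neighbors.
labels = icm(np.array([[0., 10.], [5., 4.], [0., 10.]]),
             np.array([[0., 1.], [1., 0.]]))
```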
Subproblem-Tree Calibration: A Unified Approach to Max-Product Message Passing
"... Max-product (max-sum) message passing algorithms are widely used for MAP inference in MRFs. There are many variants sharing a common flavor of passing “messages” over some graph object. Recent advances revealed that the convergent versions (such as MPLP, MSD, TRW-S) can be viewed as performing block co ..."
Cited by 3 (1 self)
Robust max-product belief propagation
In: Conference Record of the Forty-Fifth Asilomar Conference on Signals, Systems and Computers (ASILOMAR). IEEE, 2011
"... We study the problem of optimizing a graph-structured objective function under adversarial uncertainty. This problem can be modeled as a two-person zero-sum game between an Engineer and Nature. The Engineer controls a subset of the variables (nodes in the graph), and tries to assign their ..."
Cited by 3 (0 self)
(distributions over variable assignments) can be chosen in such a way as to satisfy the Markov property with respect to the graph. This significantly reduces the problem dimensionality. Finally, we introduce a message-passing algorithm to solve this minimax problem. The algorithm generalizes max-product belief
Tree-reweighted belief propagation algorithms and approximate ML estimation by pseudo-moment matching. In AISTATS, 2003
"... In previous work [10], we presented a class of upper bounds on the log partition function of an arbitrary undirected graphical model based on solving a convex variational problem. Here we develop a class of local message-passing algorithms, which we call tree-reweighted belief propagation, for ..."
Cited by 76 (4 self)
MAP estimation via agreement on (hyper)trees: Message-passing and linear programming approaches. IEEE Transactions on Information Theory, 2002
"... We develop an approach for computing provably exact maximum a posteriori (MAP) configurations for a subclass of problems on graphs with cycles. By decomposing the original problem into a convex combination of tree-structured problems, we obtain an upper bound on the optimal value of the original ..."
Cited by 147 (10 self)
configuration must also be a MAP configuration for the original problem. Next we present and analyze two methods for attempting to obtain tight upper bounds: (a) a tree-reweighted message-passing algorithm that is related to but distinct from the max-product (min-sum) algorithm; and (b) a tree-relaxed linear
Maximum weight matching via max-product belief propagation. In International Symposium on Information Theory, 2005
"... The max-product "belief propagation" algorithm is an iterative, local, message-passing algorithm for finding the maximum a posteriori (MAP) assignment of a discrete probability distribution specified by a graphical model. Despite the spectacular success of the algorithm in many applicati ..."
Cited by 63 (12 self)
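On a chain (or any tree), the max-product algorithm described in this entry computes the exact MAP assignment by passing forward max-messages and backtracking, essentially the Viterbi recursion. A minimal sketch, assuming a small chain MRF with non-negative node and edge potentials (the `unary`/`pairwise` arrays and function name are illustrative, not from the paper):

```python
import numpy as np

def max_product_chain(unary, pairwise):
    """Max-product message passing on a chain MRF (illustrative sketch).

    unary:    (n, k) non-negative node potentials phi_i(x_i)
    pairwise: (k, k) non-negative edge potentials psi(x_i, x_{i+1})
    Exact on chains/trees; with loops the same updates become the
    approximate "loopy" variant discussed in the entries above.
    """
    n, k = unary.shape
    msg = np.ones((n, k))               # forward max-messages into node i
    back = np.zeros((n, k), dtype=int)  # argmax pointers for decoding
    for i in range(1, n):
        # scores[a, b]: best value ending with x_{i-1}=a, x_i=b
        scores = (msg[i - 1] * unary[i - 1])[:, None] * pairwise
        msg[i] = scores.max(axis=0)
        back[i] = scores.argmax(axis=0)
    # decode: best final state, then follow the argmax pointers backward
    x = np.zeros(n, dtype=int)
    x[-1] = (msg[-1] * unary[-1]).argmax()
    for i in range(n - 1, 0, -1):
        x[i - 1] = back[i, x[i]]
    return x

# Two nodes, attractive edge: the MAP follows each node's unary preference.
x = max_product_chain(np.array([[0.9, 0.1], [0.1, 0.9]]),
                      np.array([[0.8, 0.2], [0.2, 0.8]]))
```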
Message-passing and linear programming approaches, 2003
"... We develop and analyze methods for computing provably optimal maximum a posteriori (MAP) configurations for a subclass of Markov random fields defined on graphs with cycles. By decomposing the original distribution into a convex combination of tree-structured distributions, we obtain an upper bound ..."
is that any such shared configuration must also be a MAP configuration for the original distribution. Next we develop two approaches to attempting to obtain tight upper bounds: (a) a tree-relaxed linear program (LP), which is derived from the Lagrangian dual of the upper bounds; and (b) a tree-reweighted