Results 1–10 of 20,953
Convexifying the Bethe free energy
 in Conference on Uncertainty in Artificial Intelligence (UAI), 2009
"... The introduction of loopy belief propagation (LBP) revitalized the application of graphical models in many domains. Many recent works present improvements on the basic LBP algorithm in an attempt to overcome convergence and local optima problems. Notable among these are convexified free energy appro ..."
Abstract

Cited by 17 (2 self)
the Bethe free energy. We show that the proposed approximations compare favorably with state-of-the-art convex free energy approximations.
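For orientation, the Bethe free energy that these entries analyze can be written (in the standard factor-graph form of Yedidia, Freeman, and Weiss, with factor beliefs b_a, variable beliefs b_i, factor potentials f_a, and variable degrees d_i):

```latex
F_{\mathrm{Bethe}}\bigl(\{b_a\},\{b_i\}\bigr)
\;=\; \sum_{a}\sum_{x_a} b_a(x_a)\,\ln\frac{b_a(x_a)}{f_a(x_a)}
\;-\; \sum_{i}\,(d_i - 1)\sum_{x_i} b_i(x_i)\,\ln b_i(x_i)
```

Minimizing this quantity over locally consistent beliefs is exact on trees and an approximation on loopy graphs, which is the sense in which the papers listed here study its stationary points, bounds, and convexified variants.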
Bounds on the Bethe Free Energy for Gaussian Networks
"... We address the problem of computing approximate marginals in Gaussian probabilistic models by using mean field and fractional Bethe approximations. As an extension of Welling and Teh (2001), we define the Gaussian fractional Bethe free energy in terms of the moment parameters of the approximate marg ..."
Abstract

Cited by 2 (1 self)
Stable fixed points of loopy belief propagation are minima of the Bethe free energy
, 2002
"... We extend recent work on the connection between loopy belief propagation and the Bethe free energy. Constrained minimization of the Bethe free energy can be turned into an unconstrained saddlepoint problem. Both converging doubleloop algorithms and standard loopy belief propagation can be interpre ..."
Abstract

Cited by 71 (8 self)
Graph Zeta Function in the Bethe Free Energy and Loopy Belief Propagation
"... We propose a new approach to the analysis of Loopy Belief Propagation (LBP) by establishing a formula that connects the Hessian of the Bethe free energy with the edge zeta function. The formula has a number of theoretical implications on LBP. It is applied to give a sufficient condition that the Hes ..."
Abstract

Cited by 16 (3 self)
Properties of Bethe free energies and message passing in Gaussian models
 in Journal of Artificial Intelligence Research
"... We address the problem of computing approximate marginals in Gaussian probabilistic models by using mean field and fractional Bethe approximations. We define the Gaussian fractional Bethe free energy in terms of the moment parameters of the approximate marginals, derive a lower and an upper bound on ..."
Abstract

Cited by 5 (0 self)
Bethe free energy, Kikuchi approximations and belief propagation algorithms
, 2000
"... Belief propagation (BP) was only supposed to work for tree-like networks but works surprisingly well in many applications involving networks with loops, including turbo codes. However, there has been little understanding of the algorithm or the nature of the solutions it finds for general graphs. ..."
Abstract

Cited by 95 (2 self)
We show that BP can only converge to a stationary point of an approximate free energy, known as the Bethe free energy in statistical physics. This result characterizes BP fixed points and makes connections with variational approaches to approximate inference. More importantly, our analysis lets us
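As a concrete illustration of the algorithm these entries study, here is a minimal sketch (not code from any cited paper) of sum-product loopy belief propagation on a three-variable binary cycle; the random pairwise potentials and the fixed 100-iteration schedule are arbitrary choices for illustration:

```python
import numpy as np

# Minimal sketch: sum-product loopy BP on a 3-variable binary cycle
# with random positive pairwise potentials psi[(i, j)][x_i, x_j].
rng = np.random.default_rng(0)
edges = [(0, 1), (1, 2), (2, 0)]
psi = {e: rng.uniform(0.5, 2.0, size=(2, 2)) for e in edges}
neighbors = {0: [1, 2], 1: [0, 2], 2: [0, 1]}

def potential(i, j):
    # psi is stored once per undirected edge; transpose for the reverse direction
    return psi[(i, j)] if (i, j) in psi else psi[(j, i)].T

# m[(i, j)] is the message from variable i to variable j, initialized uniform
m = {(i, j): np.ones(2) / 2 for i in neighbors for j in neighbors[i]}

for _ in range(100):
    new = {}
    for (i, j) in m:
        # product of messages into i from all neighbors except j
        incoming = np.ones(2)
        for k in neighbors[i]:
            if k != j:
                incoming *= m[(k, i)]
        # m_{i->j}(x_j) = sum_{x_i} psi(x_i, x_j) * incoming(x_i)
        msg = potential(i, j).T @ incoming
        new[(i, j)] = msg / msg.sum()
    m = new

# approximate (Bethe) marginal of variable 0 from the final messages
belief0 = m[(1, 0)] * m[(2, 0)]
belief0 /= belief0.sum()
```

At a fixed point of this iteration, the beliefs are exactly the stationary points of the Bethe free energy described in the entry above; on a single loop, as here, the iteration is well behaved, but in general convergence is not guaranteed.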
Bethe Free Energy and Contrastive Divergence Approximations for Undirected Graphical Models
, 2003
"... As the machine learning community tackles more complex and harder problems, the graphical models needed to solve such problems become larger and more complicated. As a result performing inference and learning exactly for such graphical models become ever more expensive, and approximate inference a ..."
Abstract

Cited by 1 (0 self)
and learning techniques become ever more prominent. There are a variety of techniques for approximate inference and learning in the literature. This thesis contributes some new ideas in the products of experts (PoEs) class of models (Hinton, 2002), and the Bethe free energy approximations (Yedidia et al
The Bethe Free Energy Allows to Compute the Conditional Entropy of Graphical Code Instances. A Proof from the Polymer Expansion
, 2013
"... The main objective of this paper is to show that the Bethe free energy associated to a Low-Density Parity-Check code used over a Binary Symmetric Channel in a large noise regime is, with high probability, asymptotically exact as the block length grows. Using the loop-sum as a starting point, we dev ..."
Abstract