Lagrangian relaxation for MAP estimation in graphical models
 IN: 45TH ANNUAL ALLERTON CONFERENCE ON COMMUNICATION, CONTROL AND COMPUTING, 2007
Abstract

Cited by 41 (3 self)
We develop a general framework for MAP estimation in discrete and Gaussian graphical models using Lagrangian relaxation techniques. The key idea is to reformulate an intractable estimation problem as one defined on a more tractable graph, but subject to additional constraints. Relaxing these constraints gives a tractable dual problem, one defined by a thin graph, which is then optimized by an iterative procedure. When this iterative optimization leads to a consistent estimate, one which also satisfies the constraints, then it corresponds to an optimal MAP estimate of the original model. Otherwise there is a “duality gap”, and we obtain a bound on the optimal solution. Thus, our approach combines convex optimization with dynamic programming techniques applicable for thin graphs. The popular tree-reweighted max-product (TRMP) method may be seen as solving a particular class of such relaxations, where the intractable graph is relaxed to a set of spanning trees. We also consider relaxations to a set of small induced subgraphs, thin subgraphs (e.g. loops), and a connected tree obtained by “unwinding” cycles. In addition, we propose a new class of multiscale relaxations that introduce “summary” variables. The potential benefits of such generalizations include: reducing or eliminating the “duality gap” in hard problems, reducing the number of Lagrange multipliers in the dual problem, and accelerating convergence of the iterative optimization procedure.
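The relax-then-dualize scheme this abstract describes can be sketched on a toy problem: a 3-cycle split into a chain plus one extra edge, with Lagrange multipliers enforcing agreement between node copies. Everything below (graph, random potentials, step sizes) is an illustrative assumption, not the paper's construction.

```python
import itertools

import numpy as np

rng = np.random.default_rng(0)
# Toy 3-cycle with binary variables and random pairwise potentials.
theta = {e: rng.normal(size=(2, 2)) for e in [(0, 1), (1, 2), (0, 2)]}
ALL = list(theta)

def score(x, edges):
    return sum(theta[i, j][x[i], x[j]] for (i, j) in edges)

# Exact MAP value by brute force (8 states).
exact = max(score(x, ALL) for x in itertools.product((0, 1), repeat=3))

# Relax the cycle into a chain A and a single edge B; the copies of the
# shared nodes 0 and 2 must agree, enforced by Lagrange multipliers lam.
A, B = [(0, 1), (1, 2)], [(0, 2)]
lam = np.zeros(2)

def argmax_sub(edges, sign):
    # Each subproblem is tractable (here: brute force stands in for
    # dynamic programming on a thin graph).
    return max(itertools.product((0, 1), repeat=3),
               key=lambda x: score(x, edges) + sign * (lam[0] * x[0] + lam[1] * x[2]))

dual = np.inf
for t in range(1, 200):
    xa, xb = argmax_sub(A, +1.0), argmax_sub(B, -1.0)
    dual = (score(xa, A) + lam[0] * xa[0] + lam[1] * xa[2]
            + score(xb, B) - lam[0] * xb[0] - lam[1] * xb[2])
    if (xa[0], xa[2]) == (xb[0], xb[2]):
        break  # consistent copies: xa is an optimal MAP assignment
    lam -= (1.0 / t) * np.array([xa[0] - xb[0], xa[2] - xb[2]])  # subgradient step

assert dual >= exact - 1e-9  # weak duality: the dual always upper-bounds MAP
```

If the loop exits without agreement, the final `dual` is exactly the bound on the optimum that the abstract mentions under a duality gap.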
High-Dimensional Gaussian Graphical Model Selection: Walk-Summability and Local Separation Criterion
 JOURNAL OF MACHINE LEARNING RESEARCH, 2012
Abstract

Cited by 14 (4 self)
We consider the problem of high-dimensional Gaussian graphical model selection. We identify a set of graphs for which an efficient estimation algorithm exists, and this algorithm is based on thresholding of empirical conditional covariances. Under a set of transparent conditions, we establish structural consistency (or sparsistency) for the proposed algorithm when the number of samples n = Ω(J_min^{-2} log p), where p is the number of variables and J_min is the minimum (absolute) edge potential of the graphical model. The sufficient conditions for sparsistency are based on the notion of walk-summability of the model and the presence of sparse local vertex separators in the underlying graph. We also derive novel non-asymptotic necessary conditions on the number of samples required for sparsistency.
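The conditional-covariance thresholding idea can be sketched on a small chain graph, where size-1 local separators suffice. The graph, sample size, separator-search bound, and threshold below are all assumed for illustration; the paper's actual conditions and constants are not reproduced here.

```python
import itertools

import numpy as np

rng = np.random.default_rng(1)
p = 4
# Ground-truth chain 0-1-2-3: tridiagonal precision (a diagonally dominant,
# hence walk-summable, toy choice).
J = np.eye(p)
for i in range(p - 1):
    J[i, i + 1] = J[i + 1, i] = 0.3
X = rng.multivariate_normal(np.zeros(p), np.linalg.inv(J), size=20000)
S_emp = np.cov(X, rowvar=False)

def cond_cov(i, j, S):
    # Empirical Cov(X_i, X_j | X_S) via a Schur complement of the covariance.
    if not S:
        return S_emp[i, j]
    S = list(S)
    return S_emp[i, j] - S_emp[i, S] @ np.linalg.solve(S_emp[np.ix_(S, S)],
                                                       S_emp[S, j])

eta = 1  # max separator size searched; small local separators suffice on a chain
edges = set()
for i, j in itertools.combinations(range(p), 2):
    rest = [k for k in range(p) if k not in (i, j)]
    stat = min(abs(cond_cov(i, j, S))
               for r in range(eta + 1) for S in itertools.combinations(rest, r))
    if stat > 0.05:  # threshold: an assumed tuning constant
        edges.add((i, j))

assert edges == {(0, 1), (1, 2), (2, 3)}
```

For non-edges some small conditioning set drives the conditional covariance to (nearly) zero, while true edges keep it bounded away from zero, so thresholding recovers the structure.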
Feedback message passing for inference in Gaussian graphical models
 in Proc. IEEE Int. Symp. Inf. Theory (ISIT), 2010
Abstract

Cited by 14 (5 self)
Abstract—While loopy belief propagation (LBP) performs reasonably well for inference in some Gaussian graphical models with cycles, its performance is unsatisfactory for many others. In particular, for some models LBP does not converge, and in general when it does converge, the computed variances are incorrect (except for cycle-free graphs, for which belief propagation (BP) is non-iterative and exact). In this paper we propose feedback message passing (FMP), a message-passing algorithm that makes use of a special set of vertices (called a feedback vertex set or FVS) whose removal results in a cycle-free graph. In FMP, standard BP is employed several times on the cycle-free subgraph excluding the FVS, while a special message-passing scheme is used for the nodes in the FVS. The computational complexity of exact inference is O(k²n), where k is the number of feedback nodes and n is the total number of nodes. When the size of the FVS is very large, FMP is computationally costly. Hence we propose approximate …
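The FVS trick can be illustrated with block linear algebra: condition on the feedback nodes, solve the remaining tree-structured system once per feedback node, and recombine. The 4-cycle, its parameters, and the dense solves standing in for tree BP are all illustrative assumptions.

```python
import numpy as np

# A 4-cycle 0-1-2-3-0: removing node 0 (the FVS, so k = 1) leaves a chain.
n = 4
J = 2.0 * np.eye(n)
for i, j in [(0, 1), (1, 2), (2, 3), (3, 0)]:
    J[i, j] = J[j, i] = -0.5
F, T = [0], [1, 2, 3]

JTT = J[np.ix_(T, T)]  # tree-structured block: each solve here is O(n) via BP
JTF = J[np.ix_(T, F)]
JFF = J[np.ix_(F, F)]

# k "tree solves" (one per feedback node) plus a small k-by-k inversion;
# np.linalg.solve stands in for BP on the cycle-free part.
W = np.linalg.solve(JTT, JTF)
Sigma_FF = np.linalg.inv(JFF - JTF.T @ W)  # exact FVS marginal covariance

# Block-inverse identity: Sigma_TT = JTT^{-1} + W Sigma_FF W^T,
# so the tree variances get a rank-k correction from the feedback nodes.
var = np.empty(n)
var[F] = np.diag(Sigma_FF)
var[T] = np.diag(np.linalg.inv(JTT)) + np.einsum('ik,kl,il->i', W, Sigma_FF, W)

assert np.allclose(var, np.diag(np.linalg.inv(J)))
```

With tree solves at O(n) each, this recombination gives the O(k²n) exact-inference cost quoted above.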
Low-rank variance approximation in GMRF models: Single and multiscale approaches
 IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2008
Abstract

Cited by 13 (1 self)
We present a versatile framework for tractable computation of approximate variances in large-scale Gaussian Markov random field estimation problems. In addition to its efficiency and simplicity, it also provides accuracy guarantees. Our approach relies on the construction of a certain low-rank aliasing matrix with respect to the Markov graph of the model. We first construct this matrix for single-scale models with short-range correlations and then introduce spliced wavelets and propose a construction for the long-range correlation case, and also for multiscale models. We describe the accuracy guarantees that the approach provides and apply the method to a large interpolation problem from oceanography with sparse, irregular, and noisy measurements, and to a gravity inversion problem.
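The single-scale, short-range case can be sketched with the simplest kind of low-rank aliasing matrix: indicator columns over a graph coloring, so that each variable's variance estimate only aliases in covariances with far-away same-color nodes. The chain model, spacing, and tolerance are assumed for illustration; the paper's wavelet construction for long-range correlations is not shown.

```python
import numpy as np

n, d = 60, 5          # chain length and color spacing (illustrative choices)
J = 2.0 * np.eye(n)   # tridiagonal precision of a 1-D GMRF with fast decay
for i in range(n - 1):
    J[i, i + 1] = J[i + 1, i] = -0.5

# Low-rank aliasing matrix B: one indicator column per color class.
# Same-color nodes are d apart, so only exponentially small
# cross-covariances alias into each diagonal estimate.
B = np.zeros((n, d))
for i in range(n):
    B[i, i % d] = 1.0

S = np.linalg.solve(J, B)  # d sparse solves instead of a full inverse
approx = np.array([S[i, i % d] for i in range(n)])  # aliased variance estimate

exact = np.diag(np.linalg.inv(J))
assert np.max(np.abs(approx - exact) / exact) < 0.02
```

The cost is d solves with the sparse precision matrix rather than a dense inversion, and the aliasing error shrinks geometrically as the spacing d grows.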
Covariance estimation in decomposable Gaussian graphical models
 IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2010
Abstract

Cited by 11 (6 self)
Graphical models are a framework for representing and exploiting prior conditional independence structures within distributions using graphs. In the Gaussian case, these models are directly related to the sparsity of the inverse covariance (concentration) matrix and allow for improved covariance estimation with lower computational complexity. We consider concentration estimation with the mean-squared error (MSE) as the objective, in a special type of model known as decomposable. This model includes, for example, the well-known banded structure and other cases encountered in practice. Our first contribution is the derivation and analysis of the minimum variance unbiased estimator (MVUE) in decomposable graphical models. We provide a simple closed-form solution to the MVUE and compare it with the classical maximum likelihood estimator (MLE) in terms of performance and complexity. Next, we extend the celebrated Stein’s unbiased risk estimate (SURE) to graphical models. Using SURE, we prove that the MSE of the MVUE is always smaller or equal to that of the biased MLE, and that the MVUE itself is dominated by other approaches. In addition, we propose the use of SURE as a constructive mechanism for deriving new covariance estimators. Similarly to the classical MLE, all of our proposed estimators have simple closed-form solutions but result in a significant reduction in MSE.
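The "simple closed form" available in decomposable models can be illustrated with the classical MLE (not the paper's MVUE): for a chain graph the concentration estimate is a sum of padded clique-marginal inverses minus padded separator inverses. The data and graph below are toy assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
p, n = 5, 2000
X = rng.standard_normal((n, p)) @ rng.standard_normal((p, p))  # arbitrary data
S = X.T @ X / n  # zero-mean sample covariance

# Decomposable model: chain 0-1-2-3-4, i.e. a banded (tridiagonal)
# concentration. Cliques are {i, i+1}; separators are the interior nodes.
K = np.zeros((p, p))
for i in range(p - 1):
    C = [i, i + 1]
    K[np.ix_(C, C)] += np.linalg.inv(S[np.ix_(C, C)])  # padded clique term
for i in range(1, p - 1):
    K[i, i] -= 1.0 / S[i, i]                           # padded separator term

Sigma_hat = np.linalg.inv(K)
# The decomposable MLE matches the sample covariance on every clique...
for i in range(p - 1):
    C = [i, i + 1]
    assert np.allclose(Sigma_hat[np.ix_(C, C)], S[np.ix_(C, C)])
# ...and its concentration is exactly zero off the band.
assert abs(K[0, 2]) < 1e-10 and abs(K[0, 4]) < 1e-10
```

Only small clique-sized inversions are needed, which is the complexity advantage the abstract alludes to; the MVUE and SURE-based estimators refine this closed form.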
Transelliptical graphical models
 In Advances in Neural Information Processing Systems, 2012
Abstract

Cited by 11 (7 self)
We advocate the use of a new distribution family—the transelliptical—for robust inference of high-dimensional graphical models. The transelliptical family is an extension of the nonparanormal family proposed by Liu et al. (2009). Just as the nonparanormal extends the normal by transforming the variables using univariate functions, the transelliptical extends the elliptical family in the same way. We propose a nonparametric rank-based regularization estimator which achieves the parametric rates of convergence for both graph recovery and parameter estimation. Such a result suggests that the extra robustness and flexibility obtained by the semiparametric transelliptical modeling incurs almost no efficiency loss. We also discuss the relationship between this work and the transelliptical component analysis proposed by Han and Liu (2012).
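The core rank-based ingredient is that Kendall's tau is invariant to the unknown monotone marginal transforms, and sin(πτ/2) recovers the latent correlation. The marginal transforms, sample size, and tolerance below are illustrative assumptions; the full estimator plugs this correlation matrix into a regularized graph-estimation procedure, which is not shown.

```python
import numpy as np

rng = np.random.default_rng(4)
n, rho = 1500, 0.6
# Latent bivariate normal pushed through monotone marginal transforms
# (exp and cube are arbitrary choices of transform).
Z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
X = np.column_stack([np.exp(Z[:, 0]), Z[:, 1] ** 3])

def kendall_tau(a, b):
    # O(n^2) concordance count; fine for a sketch of this size.
    sa = np.sign(a[:, None] - a[None, :])
    sb = np.sign(b[:, None] - b[None, :])
    m = len(a)
    return (sa * sb).sum() / (m * (m - 1))

tau = kendall_tau(X[:, 0], X[:, 1])
rho_hat = np.sin(np.pi * tau / 2.0)  # rank-based estimate of the latent rho
assert abs(rho_hat - rho) < 0.08
```

Because only ranks enter, the same estimate comes out regardless of which monotone transforms distort the margins, which is where the robustness at little efficiency cost comes from.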
Multiscale Gaussian graphical models and algorithms for large-scale inference
 2007
Abstract

Cited by 9 (5 self)
We propose a class of multiscale graphical models and algorithms to estimate means and approximate error variances of large-scale Gaussian processes efficiently. Based on emerging techniques for inference on Gaussian graphical models with cycles, we extend traditional multiscale tree models to pyramidal graphs, which incorporate both inter- and intra-scale interactions. In the spirit of multipole algorithms, we develop efficient inference methods in which variables far apart communicate through coarser resolutions and nearby variables interact at finer resolutions. In addition, we propose methods to update the estimates rapidly when measurements are added or new knowledge of a local region is provided. Index Terms—graphical models, Gauss–Markov random fields, multiresolution, multiscale, large-scale estimation problems
Distributed Covariance Estimation in Gaussian Graphical Models
Abstract

Cited by 7 (3 self)
Abstract—We consider distributed estimation of the inverse covariance matrix in Gaussian graphical models. These models factorize the multivariate distribution and allow for efficient distributed signal processing methods such as belief propagation (BP). The classical maximum likelihood approach to this covariance estimation problem, or potential function estimation in BP terminology, requires centralized computing and is computationally intensive. This motivates suboptimal distributed alternatives that trade off accuracy for communication cost. A natural solution is for each node to estimate its local covariance with respect to its neighbors. The local maximum likelihood estimator is asymptotically consistent but suboptimal, i.e., it does not minimize the mean-squared error (MSE). We propose to improve the MSE performance by introducing additional symmetry constraints using averaging and pseudo-likelihood estimation approaches. We compute the proposed estimates using message-passing protocols, which can be efficiently implemented in large-scale graphical models with many nodes. We illustrate the advantages of our proposed methods using numerical experiments with synthetic data as well as real-world data from a wireless sensor network. Index Terms—Covariance estimation, distributed signal processing, graphical models.
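A minimal sketch of local estimation with an averaging symmetry constraint: each node regresses its variable on its neighbors (a pseudo-likelihood-style local step, which by the Markov property recovers its own row of the precision matrix), and the two neighbor estimates of each edge weight are then averaged. The chain model, sample size, and tolerance are illustrative assumptions, and centralized numpy stands in for message passing.

```python
import numpy as np

rng = np.random.default_rng(5)
p, n = 5, 50000
J = 2.0 * np.eye(p)  # chain-structured ground-truth precision
for i in range(p - 1):
    J[i, i + 1] = J[i + 1, i] = -0.6
X = rng.multivariate_normal(np.zeros(p), np.linalg.inv(J), size=n)

nbrs = {i: [j for j in range(p) if abs(i - j) == 1] for i in range(p)}

# Local step at node i: regress x_i on its neighbors. The coefficients are
# -J[i, N]/J[i, i] and the residual variance is 1/J[i, i], so each node
# recovers its own precision row from purely local data.
R = np.zeros((p, p))
for i in range(p):
    N = nbrs[i]
    beta = np.linalg.lstsq(X[:, N], X[:, i], rcond=None)[0]
    resid_var = np.mean((X[:, i] - X[:, N] @ beta) ** 2)
    R[i, i] = 1.0 / resid_var
    R[i, N] = -beta / resid_var

# Symmetry constraint: average the two neighbor estimates of each edge.
J_hat = (R + R.T) / 2.0

assert np.max(np.abs(J_hat - J)) < 0.08
```

Each node only touches its own samples and its neighbors', so the averaging step is the only exchange an actual message-passing implementation would need per edge.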
Gaussian Multiresolution Models: Exploiting Sparse Markov and Covariance Structure
 IEEE Trans. on Signal Proc., 2010
Abstract

Cited by 7 (1 self)
Abstract—In this paper, we consider the problem of learning Gaussian multiresolution (MR) models in which data are only available at the finest scale, and the coarser, hidden variables serve to capture long-distance dependencies. Tree-structured MR models have limited modeling capabilities, as variables at one scale are forced to be uncorrelated with each other conditioned on other scales. We propose a new class of Gaussian MR models in which variables at each scale have sparse conditional covariance structure conditioned on other scales. Our goal is to learn a tree-structured graphical model connecting variables across scales (which translates into sparsity in inverse covariance), while at the same time learning sparse structure for the conditional covariance (not its inverse) within each scale conditioned on other scales. This model leads to an efficient, new inference algorithm that is similar to multipole methods in computational physics. We demonstrate the modeling and inference advantages of our approach over methods that use MR tree models and single-scale approximation methods that do not use hidden variables. Index Terms—Gauss–Markov random fields, graphical models, hidden variables, multipole methods, multiresolution (MR) models.