Results 1–10 of 17
Structured Learning of Gaussian Graphical Models
"... We consider estimation of multiple high-dimensional Gaussian graphical models corresponding to a single set of nodes under several distinct conditions. We assume that most aspects of the networks are shared, but that there are some structured differences between them. Specifically, the network diffe ..."
Abstract

Cited by 8 (1 self)
to the aberrant activity of a few specific genes. We propose to solve this problem using the perturbed-node joint graphical lasso, a convex optimization problem that is based upon the use of a row-column overlap norm penalty. We then solve the convex problem using an alternating direction method of multipliers
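The snippet above solves a graphical-lasso-type problem with an alternating direction method of multipliers (ADMM). As a point of reference, here is a minimal NumPy sketch of ADMM for the plain graphical lasso (elementwise L1 penalty), not the paper's row-column overlap norm variant; the function name, synthetic data, and all parameter values are illustrative assumptions.

```python
import numpy as np

def graphical_lasso_admm(S, lam, rho=1.0, n_iter=200):
    """Sparse inverse covariance estimation via ADMM.

    Plain graphical lasso sketch: minimize
    -logdet(Theta) + <S, Theta> + lam * ||Theta||_1.
    """
    p = S.shape[0]
    Z = np.eye(p)
    U = np.zeros((p, p))
    for _ in range(n_iter):
        # Theta-update: closed form via eigendecomposition of rho*(Z - U) - S,
        # solving rho*Theta - Theta^{-1} = rho*(Z - U) - S.
        w, Q = np.linalg.eigh(rho * (Z - U) - S)
        theta_eigs = (w + np.sqrt(w ** 2 + 4 * rho)) / (2 * rho)
        Theta = Q @ np.diag(theta_eigs) @ Q.T
        # Z-update: elementwise soft-thresholding from the L1 penalty.
        A = Theta + U
        Z = np.sign(A) * np.maximum(np.abs(A) - lam / rho, 0.0)
        # Scaled dual update.
        U = U + Theta - Z
    return Z

# Small demo on synthetic data with one true conditional dependence (0, 1).
rng = np.random.default_rng(0)
true_prec = np.array([[2.0, 0.6, 0.0],
                      [0.6, 2.0, 0.0],
                      [0.0, 0.0, 2.0]])
X = rng.multivariate_normal(np.zeros(3), np.linalg.inv(true_prec), size=2000)
S = np.cov(X, rowvar=False)
Theta_hat = graphical_lasso_admm(S, lam=0.1)
```

The Theta-update has a closed form via eigendecomposition, while the soft-thresholding Z-update is what produces exact zeros in the estimate.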
Acknowledgement The work presented in this talk is joint with
, 2013
"... graphical lasso algorithms," to appear IEEE Trans on SP in ..."
Joint Structural Estimation of Multiple Graphical Models
, 2016
"... Abstract Gaussian graphical models capture dependence relationships between random variables through the pattern of nonzero elements in the corresponding inverse covariance matrices. To date, there has been a large body of literature on both computational methods and analytical results on the estim ..."
Abstract
settings different relationships between subsets of the node sets exist between different graphical models. We develop methodology that jointly estimates multiple Gaussian graphical models, assuming that there exists prior information on how they are structurally related. For many applications
Blossom Tree Graphical Models
"... We combine the ideas behind trees and Gaussian graphical models to form a new nonparametric family of graphical models. Our approach is to attach nonparanormal “blossoms”, with arbitrary graphs, to a collection of nonparametric trees. The tree edges are chosen to connect variables that most violate ..."
Abstract
joint Gaussianity. The non-tree edges are partitioned into disjoint groups, and assigned to tree nodes using a nonparametric partial correlation statistic. A nonparanormal blossom is then “grown” for each group using established methods based on the graphical lasso. The result is a factorization
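The blossom-tree entry relies on a partial correlation statistic. Under Gaussianity, partial correlations come directly from the precision matrix; the sketch below shows this classical formula, not the paper's nonparametric analogue, with an illustrative AR(1)-style covariance.

```python
import numpy as np

def partial_correlations(Sigma):
    """Gaussian partial correlations from a covariance matrix:
    rho_{ij|rest} = -Theta_ij / sqrt(Theta_ii * Theta_jj),
    where Theta = Sigma^{-1}. (Classical Gaussian version; the paper
    uses a nonparametric partial correlation statistic instead.)"""
    Theta = np.linalg.inv(Sigma)
    d = np.sqrt(np.diag(Theta))
    P = -Theta / np.outer(d, d)
    np.fill_diagonal(P, 1.0)
    return P

# AR(1) covariance with rho = 0.5: its inverse is tridiagonal, so
# variables 0 and 2 are conditionally independent given variable 1.
Sigma = np.array([[1.0, 0.5, 0.25],
                  [0.5, 1.0, 0.5],
                  [0.25, 0.5, 1.0]])
P = partial_correlations(Sigma)
```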
Supplement to “Regularized rank-based estimation of high-dimensional nonparanormal graphical models.”
, 2012
"... A sparse precision matrix can be directly translated into a sparse Gaussian graphical model under the assumption that the data follow a joint normal distribution. This neat property makes high-dimensional precision matrix estimation very appealing in many applications. However, in practice we often ..."
Abstract

Cited by 32 (5 self)
A sparse precision matrix can be directly translated into a sparse Gaussian graphical model under the assumption that the data follow a joint normal distribution. This neat property makes high-dimensional precision matrix estimation very appealing in many applications. However, in practice we often
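The translation this abstract describes is direct: zeros in the precision matrix are missing edges (conditional independences) in the Gaussian graphical model. A minimal illustration with a hypothetical 4-node precision matrix:

```python
import numpy as np

# Hypothetical precision matrix for a 4-node chain graph: zero entries
# encode conditional independence, i.e. missing edges in the graph.
precision = np.array([
    [2.0, 0.8, 0.0, 0.0],
    [0.8, 2.0, 0.5, 0.0],
    [0.0, 0.5, 2.0, 0.3],
    [0.0, 0.0, 0.3, 2.0],
])

# The edge set is read straight off the nonzero off-diagonal pattern.
p = precision.shape[0]
edges = [(i, j) for i in range(p) for j in range(i + 1, p)
         if precision[i, j] != 0.0]
print(edges)  # chain graph: [(0, 1), (1, 2), (2, 3)]

# Conditional independence is not marginal independence: the covariance
# still couples nodes 0 and 3 through the chain.
Sigma = np.linalg.inv(precision)
```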
Published In Time-Varying Gaussian Graphical Models of Molecular Dynamics Data
, 2010
"... We introduce an algorithm for learning sparse, time-varying undirected probabilistic graphical models of Molecular Dynamics (MD) data. Our method computes a maximum a posteriori (MAP) estimate of the topology and parameters of the model (i.e., structure learning) using L1-regularization of the negat ..."
Abstract
of the negative log-likelihood (aka ‘Graphical Lasso’) to ensure sparsity, and a kernel to ensure smoothly varying topology and parameters over time. The learning problem is posed as a convex optimization problem and then solved optimally using block coordinate descent. The resulting model encodes the time
Node-Structured Integrative Gaussian Graphical Model Guided by Pathway Information
"... To date, many biological pathways related to cancer have been extensively applied thanks to outputs of burgeoning biomedical research. This leads to a new technical challenge of exploring and validating biological pathways that can characterize transcriptomic mechanisms across different disease ..."
Abstract
subtypes. In pursuit of accommodating multiple studies, the joint Gaussian graphical model was previously proposed to incorporate nonzero edge effects. However, this model is inevitably dependent on post hoc analysis in order to confirm biological significance. To circumvent this drawback, we attempt
Time-Varying Gaussian Graphical Models of Molecular Dynamics Data
, 2010
"... We introduce an algorithm for learning sparse, time-varying undirected probabilistic graphical models of Molecular Dynamics (MD) data. Our method computes a maximum a posteriori (MAP) estimate of the topology and parameters of the model (i.e., structure learning) using L1-regularization of the negati ..."
Abstract

Cited by 1 (1 self)
of the negative log-likelihood (aka ‘Graphical Lasso’) to ensure sparsity, and a kernel to ensure smoothly varying topology and parameters over time. The learning problem is posed as a convex optimization problem and then solved optimally using block coordinate descent. The resulting model encodes the time
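The kernel idea in this entry can be sketched independently of the full method: a kernel downweights observations far from the time of interest, and the resulting smoothed covariance can be fed to any graphical-lasso solver to obtain smoothly varying estimates. The function below is an illustrative sketch, not the authors' algorithm; all names and values are assumptions.

```python
import numpy as np

def kernel_weighted_cov(X, t_obs, t0, bandwidth):
    """Kernel-smoothed sample covariance at time t0.

    X: (n, p) observations, t_obs: (n,) observation times.
    A Gaussian kernel downweights samples far from t0; plugging S(t0)
    into a graphical-lasso solver then yields precision estimates that
    vary smoothly with t0.
    """
    w = np.exp(-0.5 * ((t_obs - t0) / bandwidth) ** 2)
    w = w / w.sum()
    mu = w @ X                      # kernel-weighted mean
    Xc = X - mu
    return (Xc * w[:, None]).T @ Xc  # sum_i w_i (x_i - mu)(x_i - mu)^T

# Demo on synthetic data: covariance estimate localized at t0 = 0.5.
rng = np.random.default_rng(1)
t_obs = np.linspace(0.0, 1.0, 500)
X = rng.normal(size=(500, 3))
S_mid = kernel_weighted_cov(X, t_obs, t0=0.5, bandwidth=0.1)
```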
Efficient inference in matrix-variate Gaussian models with iid observation noise
"... Inference in matrix-variate Gaussian models has major applications for multi-output prediction and joint learning of row and column covariances from matrix-variate data. Here, we discuss an approach for efficient inference in such models that explicitly accounts for iid observation noise. Computational ..."
Abstract

Cited by 15 (2 self)
Computational tractability can be retained by exploiting the Kronecker product between row and column covariance matrices. Using this framework, we show how to generalize the Graphical Lasso in order to learn a sparse inverse covariance between features while accounting for a low-rank confounding covariance
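The tractability claimed above rests on a standard identity: multiplying by a Kronecker product never requires materializing it, since (C ⊗ R) vec(X) = vec(R X Cᵀ) with column-major vec. A small NumPy check (matrix sizes and values are placeholders):

```python
import numpy as np

rng = np.random.default_rng(0)
R = rng.normal(size=(4, 4))   # row-covariance factor (placeholder values)
C = rng.normal(size=(3, 3))   # column-covariance factor (placeholder values)
X = rng.normal(size=(4, 3))

# Naive: materialize the (12 x 12) Kronecker product and multiply.
naive = np.kron(C, R) @ X.flatten(order="F")

# Kronecker trick: (C kron R) vec(X) = vec(R X C^T), column-major vec.
# Cost drops from O(p^2 q^2) to O(pq(p + q)), which is what makes
# matrix-variate Gaussian inference tractable at scale.
fast = (R @ X @ C.T).flatten(order="F")

assert np.allclose(naive, fast)
```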
Network inference in matrix-variate Gaussian models with non-independent noise
, 2013
"... Inferring a graphical model or network from observational data from a large number of variables is a well-studied problem in machine learning and computational statistics. In this paper we consider a version of this problem that is relevant to the analysis of multiple phenotypes collected in geneti ..."
Abstract
in genetic studies. In such datasets we expect correlations between phenotypes and between individuals. We model observations as a sum of two matrix normal variates such that the joint covariance function is a sum of Kronecker products. This model, which generalizes the Graphical Lasso, assumes observations
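The covariance structure this last entry describes, a sum of Kronecker products, can be built and sanity-checked directly; a sum of Kronecker products of positive-definite factors is itself a valid (positive-definite) joint covariance. The factor sizes and helper below are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_pd(k):
    """Illustrative positive-definite factor."""
    A = rng.normal(size=(k, k))
    return A @ A.T + k * np.eye(k)

# Placeholder column (phenotype) and row (individual) covariance factors.
C_pheno, R_indiv = random_pd(3), random_pd(5)
C_noise, R_noise = random_pd(3), np.eye(5)

# Joint covariance of vec(Y) as a sum of Kronecker products, as in the
# matrix-normal-plus-structured-noise model sketched in the abstract.
Sigma = np.kron(C_pheno, R_indiv) + np.kron(C_noise, R_noise)

# Positive-definite factors give a positive-definite Sigma.
eigs = np.linalg.eigvalsh(Sigma)
```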