
Results 1 - 10 of 17

Structured Learning of Gaussian Graphical Models

by Karthik Mohan, Michael Jae-yoon Chung, Seungyeop Han, Daniela Witten, Su-in Lee, Maryam Fazel
"... We consider estimation of multiple high-dimensional Gaussian graphical models corresponding to a single set of nodes under several distinct conditions. We assume that most aspects of the networks are shared, but that there are some structured differences between them. Specifically, the network diffe ..."
Abstract - Cited by 8 (1 self)
to the aberrant activity of a few specific genes. We propose to solve this problem using the perturbed-node joint graphical lasso, a convex optimization problem that is based upon the use of a row-column overlap norm penalty. We then solve the convex problem using an alternating direction method of multipliers
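The alternating direction method of multipliers named above is a generic splitting scheme for penalized convex problems of this kind. As an illustrative sketch only (applied to the plain lasso, not the paper's perturbed-node joint problem; variable names are generic), scaled-form ADMM alternates a quadratic solve, a soft-thresholding step, and a dual update:

```python
import numpy as np

def soft_threshold(v, k):
    """Elementwise soft-thresholding: the proximal operator of the L1 norm."""
    return np.sign(v) * np.maximum(np.abs(v) - k, 0.0)

def lasso_admm(A, b, lam, rho=1.0, n_iter=200):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by scaled-form ADMM.

    Illustrative stand-in for the ADMM updates used on graphical-lasso-type
    problems; rho and n_iter are hypothetical defaults, not from the paper.
    """
    n = A.shape[1]
    x = np.zeros(n); z = np.zeros(n); u = np.zeros(n)
    M = A.T @ A + rho * np.eye(n)   # cached matrix for the x-update
    Atb = A.T @ b
    for _ in range(n_iter):
        x = np.linalg.solve(M, Atb + rho * (z - u))  # quadratic subproblem
        z = soft_threshold(x + u, lam / rho)         # L1 proximal step
        u = u + x - z                                # dual (running residual) update
    return z

# With an orthonormal design the lasso has a closed-form answer,
# a soft-threshold of A^T b, which makes the sketch easy to sanity-check.
rng = np.random.default_rng(0)
A = np.eye(4)
b = rng.normal(size=4)
x_hat = lasso_admm(A, b, lam=0.3)
print(np.allclose(x_hat, soft_threshold(b, 0.3), atol=1e-5))
```

The same alternation (smooth subproblem, proximal step, dual update) carries over when the smooth term is a Gaussian log-likelihood over precision matrices.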

Acknowledgement The work presented in this talk is joint with

by Alfred O. Hero, Theodoros Tsiligkaridis , 2013
"... graphical lasso algorithms, ” to appear IEEE Trans on SP in ..."
Abstract

Joint Structural Estimation of Multiple Graphical Models

by Jing Ma, George Michailidis , 2016
"... Abstract Gaussian graphical models capture dependence relationships between random variables through the pattern of nonzero elements in the corresponding inverse covariance matrices. To date, there has been a large body of literature on both computational methods and analytical results on the estim ..."
Abstract
settings different relationships between subsets of the node sets exist between different graphical models. We develop methodology that jointly estimates multiple Gaussian graphical models, assuming that there exists prior information on how they are structurally related. For many applications

Blossom Tree Graphical Models

by Zhe Liu, John Lafferty
"... We combine the ideas behind trees and Gaussian graphical models to form a new nonparametric family of graphical models. Our approach is to attach nonparanormal “blossoms”, with arbitrary graphs, to a collection of nonparametric trees. The tree edges are chosen to connect variables that most violate ..."
Abstract
joint Gaussianity. The non-tree edges are partitioned into disjoint groups, and assigned to tree nodes using a nonparametric partial correlation statistic. A nonparanormal blossom is then “grown” for each group using established methods based on the graphical lasso. The result is a factorization

Supplement to “Regularized rank-based estimation of high-dimensional nonparanormal graphical models.”

by Lingzhou Xue, Hui Zou , 2012
"... A sparse precision matrix can be directly translated into a sparse Gaussian graphical model under the assumption that the data follow a joint normal distribution. This neat property makes high-dimensional precision matrix estimation very appealing in many applications. However, in practice we often ..."
Abstract - Cited by 32 (5 self)
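The correspondence this abstract rests on — zeros in the precision matrix encode missing edges, i.e. conditional independences — can be checked directly. A small numpy sketch with a hypothetical 4-node chain graph (tridiagonal precision matrix; the matrix values are illustrative, not from the paper):

```python
import numpy as np

# Hypothetical 4-node chain: tridiagonal precision matrix, so each node
# is conditionally dependent only on its neighbours in the chain.
Theta = np.array([[2.0, 0.6, 0.0, 0.0],
                  [0.6, 2.0, 0.6, 0.0],
                  [0.0, 0.6, 2.0, 0.6],
                  [0.0, 0.0, 0.6, 2.0]])

Sigma = np.linalg.inv(Theta)        # covariance: generally dense
Theta_back = np.linalg.inv(Sigma)   # invert back to the precision matrix

# The marginal covariance between the end nodes is nonzero
# (dependence flows along the chain)...
print(abs(Sigma[0, 3]) > 1e-6)       # True
# ...but the corresponding precision entry is (numerically) zero,
# recovering the absent edge between non-adjacent nodes.
print(abs(Theta_back[0, 3]) < 1e-8)  # True
```

This is why a sparse precision-matrix estimate translates directly into a sparse Gaussian graphical model under joint normality, as the abstract notes.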

Time-Varying Gaussian Graphical Models of Molecular Dynamics Data

by Narges Sharif Razavian, Subhodeep Moitra, Hetunandan Kamisetty, Arvind Ramanathan, Christopher J. Langmead , 2010
"... We introduce an algorithm for learning sparse, time-varying undirected probabilistic graphical models of Molecular Dynamics (MD) data. Our method computes a maximum a posteriori (MAP) estimate of the topology and parameters of the model (i.e., structure learning) using L1-regularization of the negat ..."
Abstract
of the negative log-likelihood (aka ‘Graphical Lasso’) to ensure sparsity, and a kernel to ensure smoothly varying topology and parameters over time. The learning problem is posed as a convex optimization problem and then solved optimally using block coordinate descent. The resulting model encodes the time
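The kernel ingredient described above can be sketched as a kernel-weighted empirical covariance at each time point, to which an L1-penalized (graphical lasso) fit would then be applied. The function name, Gaussian kernel choice, and bandwidth below are illustrative assumptions, not the authors' code:

```python
import numpy as np

def kernel_weighted_cov(X, times, t, bandwidth=1.0):
    """Kernel-weighted empirical covariance at time t.

    X: (n_samples, n_features) observations; times: (n_samples,) timestamps.
    A Gaussian kernel downweights samples far from t, so estimates at nearby
    time points vary smoothly; running a sparse precision estimator on each
    weighted covariance then yields a smoothly time-varying graphical model.
    """
    w = np.exp(-0.5 * ((times - t) / bandwidth) ** 2)
    w = w / w.sum()                     # normalize the kernel weights
    mu = w @ X                          # weighted mean
    Xc = X - mu
    return (Xc * w[:, None]).T @ Xc     # weighted covariance matrix

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 3))
times = np.linspace(0.0, 10.0, 50)
S = kernel_weighted_cov(X, times, t=5.0, bandwidth=2.0)
print(S.shape)  # (3, 3)
```

The bandwidth trades off temporal resolution against the effective sample size available at each time point.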

Node-Structured Integrative Gaussian Graphical Model Guided by Pathway Information

by Sunghwan Kim, Jae-Hwan Jhong, Jungjun Lee, Ja-Yong Koo, Byungyong Lee, Sungwon Han
"... Up to date, many biological pathways related to cancer have been extensively applied thanks to outputs of burgeoning biomedical research. This leads to a new technical challenge of exploring and validating biological pathways that can characterize transcriptomic mechanisms across different disease ..."
Abstract
subtypes. In pursuit of accommodating multiple studies, the joint Gaussian graphical model was previously proposed to incorporate nonzero edge effects. However, this model is inevitably dependent on post hoc analysis in order to confirm biological significance. To circumvent this drawback, we attempt

Time-Varying Gaussian Graphical Models of Molecular Dynamics Data

by Narges Sharif Razavian, Subhodeep Moitra, Hetunandan Kamisetty, Arvind Ramanathan, Christopher James Langmead , 2010
"... We introduce an algorithm for learning sparse, time-varying undirected probabilistic graphical models of Molecular Dynamics (MD) data. Our method computes a maximum a posteriori (MAP) estimate of the topology and parameters of the model (i.e., structure learning) using L1regularization of the negati ..."
Abstract - Cited by 1 (1 self)
of the negative log-likelihood (aka ‘Graphical Lasso’) to ensure sparsity, and a kernel to ensure smoothly varying topology and parameters over time. The learning problem is posed as a convex optimization problem and then solved optimally using block coordinate descent. The resulting model encodes the time

Efficient inference in matrix-variate Gaussian models with iid observation noise

by Oliver Stegle, Joris Mooij, Neil Lawrence, Karsten Borgwardt
"... Inference in matrix-variate Gaussian models has major applications for multioutput prediction and joint learning of row and column covariances from matrixvariate data. Here, we discuss an approach for efficient inference in such models that explicitly account for iid observation noise. Computational ..."
Abstract - Cited by 15 (2 self)
Computational tractability can be retained by exploiting the Kronecker product between row and column covariance matrices. Using this framework, we show how to generalize the Graphical Lasso in order to learn a sparse inverse covariance between features while accounting for a low-rank confounding covariance
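The Kronecker trick this abstract relies on is the identity (A ⊗ B) vec(X) = vec(A X Bᵀ) (with row-major vec), which lets one apply the large joint covariance without ever materializing it. A quick numpy check, with A and B standing in for generic row and column covariance factors:

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 4, 3                      # row- and column-dimension sizes
A = rng.normal(size=(n, n))      # stands in for a row covariance factor
B = rng.normal(size=(m, m))      # stands in for a column covariance factor
X = rng.normal(size=(n, m))

# Naive route: form the full (n*m) x (n*m) Kronecker matrix and multiply.
naive = np.kron(A, B) @ X.ravel()

# Kronecker identity (row-major vec): (A kron B) vec(X) = vec(A X B^T).
# Cost drops from O((nm)^2) storage and multiply to two small matrix products.
fast = (A @ X @ B.T).ravel()

print(np.allclose(naive, fast))  # True
```

The same identity underlies efficient eigendecompositions of Kronecker-structured covariances, since A ⊗ B inherits its eigenstructure from the two small factors.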

Network inference in matrix-variate Gaussian models with non-independent noise

by Andy Dahl, Victoria Hore, Valentina Iotchkova , 2013
"... Inferring a graphical model or network from observational data from a large number of variables is a well studied problem in machine learning and computa-tional statistics. In this paper we consider a version of this problem that is relevant to the analysis of multiple phenotypes collected in geneti ..."
Abstract
in genetic studies. In such datasets we expect correlations between phenotypes and between individuals. We model observations as a sum of two matrix normal variates such that the joint covariance function is a sum of Kronecker products. This model, which generalizes the Graphical Lasso, assumes observations

Developed at and hosted by The College of Information Sciences and Technology

© 2007-2019 The Pennsylvania State University