Results 1 – 9 of 9
Graphical Markov models: overview
, 2015
Abstract

Cited by 1 (1 self)
We describe how graphical Markov models emerged in the last 40 years, based on three essential concepts that had been developed independently more than a century ago. Sequences of joint or single regressions and their regression graphs are singled out as being the subclass that is best suited for analyzing longitudinal data and for tracing developmental pathways, both in observational and in intervention studies. Interpretations are illustrated using two sets of data. Furthermore, some of the more recent, important results for sequences of regressions are summarized.

1 Some general and historical remarks on the types of model

Graphical models aim to describe in concise form the possibly complex interrelations between a set of variables, so that key properties can be read directly off a graph. The central idea is that each variable is represented by a node in a graph. Any pair of nodes may become coupled, that is, joined by an edge. Coupled nodes are also said to be adjacent. For many types of graph, a missing edge represents some form of conditional independence between the pair of variables, and an edge that is present can be interpreted as a corresponding conditional dependence. Because the conditioning set may be empty,
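The rule that a missing edge encodes a conditional independence can be made concrete for undirected graphs, where two variables are independent given a set that blocks every path between them. The following is a minimal sketch, assuming networkx is available; the variable names and the chain-shaped graph are illustrative, not taken from the paper.

```python
import networkx as nx

# Undirected graphical Markov model: nodes are variables; a missing
# edge between two nodes means they are conditionally independent
# given any set that separates them in the graph.
G = nx.Graph()
G.add_edges_from([("A", "B"), ("B", "C"), ("C", "D")])  # chain A - B - C - D

def separated(g, x, y, z):
    """True if every path from x to y passes through the set z,
    i.e. x and y are conditionally independent given z under the
    global Markov property for undirected graphs."""
    h = g.copy()
    h.remove_nodes_from(z)
    return not nx.has_path(h, x, y)

print(separated(G, "A", "D", {"B"}))   # B blocks the only path: independent
print(separated(G, "A", "D", set()))   # nothing conditioned on: dependent
```

Here the absent edge between A and D corresponds to the statement that A and D are conditionally independent given {B} (or {C}), which is exactly the kind of property the abstract says can be read directly off a graph.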
Identifying the irreducible disjoint factors of a multivariate probability distribution
Abstract
We study the problem of decomposing a multivariate probability distribution p(v), defined over a set of random variables V = {V_1, ..., V_n}, into a product of factors defined over disjoint subsets {V_{F_1}, ..., V_{F_m}}. We show that the decomposition of V into irreducible disjoint factors forms a unique partition, which corresponds to the connected components of a Bayesian or Markov network, given that it is faithful to p. Finally, we provide three generic procedures to identify these factors with O(n^2) pairwise conditional independence tests (V_i ⊥⊥ V_j | Z), under much less restrictive assumptions: i) p supports the Intersection property; ii) p supports the Composition property; iii) no assumption at all.
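The connected-components characterization in this abstract is easy to sketch: once the skeleton of the Bayesian or Markov network is known, each component is one irreducible disjoint factor. A minimal illustration, assuming networkx; the edge list is a made-up example rather than anything from the paper.

```python
import networkx as nx

# Skeleton of a (faithful) Markov network over five variables.
# V1 - V2 - V3 form one dependence cluster; V4 - V5 another.
skeleton = nx.Graph()
skeleton.add_nodes_from(["V1", "V2", "V3", "V4", "V5"])
skeleton.add_edges_from([("V1", "V2"), ("V2", "V3"), ("V4", "V5")])

# Each connected component is one irreducible disjoint factor, so
# p(v) factorizes as p(v1, v2, v3) * p(v4, v5).
factors = [sorted(c) for c in nx.connected_components(skeleton)]
print(sorted(factors))  # [['V1', 'V2', 'V3'], ['V4', 'V5']]
```

The harder part, which the paper addresses, is recovering this skeleton (or enough of it) from O(n^2) pairwise conditional independence tests without first learning the full network.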
Segregated Graphs and Marginals of Chain Graph Models
Abstract
Bayesian networks are a popular representation of asymmetric (for example, causal) relationships between random variables. Markov random fields (MRFs) are a complementary model of symmetric relationships used in computer vision, spatial modeling, and social and gene expression networks. A chain graph model under the Lauritzen-Wermuth-Frydenberg interpretation (hereafter, a chain graph model) generalizes both Bayesian networks and MRFs, and can represent asymmetric and symmetric relationships together. As in other graphical models, the set of marginals from distributions in a chain graph model induced by the presence of hidden variables forms a complex model. One recent approach to the study of marginal graphical models is to consider a well-behaved supermodel. Such a supermodel of marginals of Bayesian networks, defined only by conditional independences, and termed the ordinary Markov model, was studied at length in
Alternative Markov and Causal Properties for Acyclic Directed Mixed Graphs
Abstract
We extend Andersson-Madigan-Perlman chain graphs by (i) relaxing the semi-directed acyclicity constraint so that only directed cycles are forbidden, and (ii) allowing up to two edges between any pair of nodes. We introduce global, and ordered local and pairwise, Markov properties for the new models. We show the equivalence of these properties for strictly positive probability distributions. We also show that when the random variables are continuous, the new models can be interpreted as systems of structural equations with correlated errors. This enables us to adapt Pearl's do-calculus to them. Finally, we describe an exact algorithm for learning the new models from observational and interventional data via answer set programming.
Triangular systems for symmetric . . .
, 2009
Abstract
We introduce and study distributions of sets of binary variables that are symmetric, that is, each has equally probable levels. The joint distribution of these special types of binary variables, if generated by a recursive process of linear main effects, is essentially parametrized in terms of marginal correlations. This contrasts with the log-linear formulation of joint probabilities, in which parameters measure conditional associations given all remaining variables. The new formulation permits useful comparisons of different types of graphical Markov models and leads to a close approximation of Gaussian orthant probabilities.
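One way to see the link between symmetric binary variables, marginal correlations, and Gaussian orthant probabilities is to dichotomize a bivariate Gaussian at zero: the classic identity E[sign(Z1) sign(Z2)] = (2/pi) * arcsin(rho) then relates the Gaussian correlation rho to the marginal correlation of the resulting binary pair. The sketch below checks this by Monte Carlo; it uses only this standard identity as an illustration, not the paper's own construction, and rho, the sample size, and the seed are arbitrary choices.

```python
import math
import random

# Symmetric binary variables (levels -1/+1, each with probability 1/2)
# obtained by thresholding a bivariate Gaussian at zero.
random.seed(0)
rho = 0.6          # correlation of the underlying Gaussian pair
n = 200_000        # Monte Carlo sample size
s = 0.0
for _ in range(n):
    z1 = random.gauss(0.0, 1.0)
    # z2 is Gaussian with corr(z1, z2) = rho
    z2 = rho * z1 + math.sqrt(1.0 - rho**2) * random.gauss(0.0, 1.0)
    s += (1 if z1 > 0 else -1) * (1 if z2 > 0 else -1)

mc = s / n                              # estimated marginal correlation
exact = 2.0 / math.pi * math.asin(rho)  # (2/pi) arcsin(rho) ~ 0.41
print(round(mc, 3), round(exact, 3))
```

Because the binary levels are equiprobable, the mean of the sign product is itself the marginal correlation, which is the kind of parametrization the abstract contrasts with the log-linear one.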