Markov properties for mixed graphs, 2014
Abstract

Cited by 9 (1 self)
In this paper, we unify the Markov theory of a variety of different types of graphs used in graphical Markov models by introducing the class of loopless mixed graphs, and show that all independence models induced by m-separation on such graphs are compositional graphoids. We focus in particular on the subclass of ribbonless graphs, which as special cases include undirected graphs, bidirected graphs, and directed acyclic graphs, as well as ancestral graphs and summary graphs. We define maximality of such graphs as well as a pairwise and a global Markov property. We prove that the global and pairwise Markov properties of a maximal ribbonless graph are equivalent for any independence model that is a compositional graphoid.
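For the simplest special case named in this abstract, undirected graphs, the pairwise Markov property has a concrete Gaussian reading: a missing edge corresponds to a zero in the precision matrix, which in turn means the two variables are conditionally independent given the rest. The following sketch (our own illustration, not code from the paper) verifies this numerically for a three-node chain:

```python
import numpy as np

# Sketch: for a Gaussian distribution, the pairwise Markov property of an
# undirected graph (a special case of the loopless mixed graphs above) says
# that a zero entry in the precision matrix K encodes Xi _||_ Xj | rest.
# Graph: the chain 1 - 2 - 3, so there is no edge between nodes 1 and 3.
K = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])
Sigma = np.linalg.inv(K)  # covariance of the corresponding Gaussian

def partial_corr(Sigma, i, j, rest):
    """Partial correlation of variables i and j given those in `rest`."""
    idx = [i, j] + list(rest)
    S = Sigma[np.ix_(idx, idx)]
    P = np.linalg.inv(S)  # precision matrix of the sub-vector
    return -P[0, 1] / np.sqrt(P[0, 0] * P[1, 1])

# Missing edge 1-3: the partial correlation of X1 and X3 given X2 vanishes.
print(abs(partial_corr(Sigma, 0, 2, [1])) < 1e-10)  # → True
```

The same zero pattern would not survive marginalization over node 2, which is one motivation for the larger graph classes (ancestral, summary, ribbonless) the paper studies.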
Graphical Markov models: overview, 2015
Abstract

Cited by 1 (1 self)
We describe how graphical Markov models emerged in the last 40 years, based on three essential concepts that had been developed independently more than a century ago. Sequences of joint or single regressions and their regression graphs are singled out as being the subclass that is best suited for analyzing longitudinal data and for tracing developmental pathways, both in observational and in intervention studies. Interpretations are illustrated using two sets of data. Furthermore, some of the more recent, important results for sequences of regressions are summarized.

1 Some general and historical remarks on the types of model

Graphical models aim to describe in concise form the possibly complex interrelations between a set of variables so that key properties can be read directly off a graph. The central idea is that each variable is represented by a node in a graph. Any pair of nodes may become coupled, that is, joined by an edge. Coupled nodes are also said to be adjacent. For many types of graph, a missing edge represents some form of conditional independence between the pair of variables, and an edge present can be interpreted as a corresponding conditional dependence. Because the conditioning set may be empty,
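The edge/independence reading described above can be checked by simulation. In this sketch (our hypothetical example, not one of the paper's data sets) a sequence of single regressions Z -> X -> Y has no edge between Y and Z, which encodes Y independent of Z given X; marginally, however, Y and Z remain dependent:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Hypothetical sequence of regressions: Z -> X -> Y, no Y-Z edge.
Z = rng.normal(size=n)
X = 0.8 * Z + rng.normal(size=n)  # X regressed on Z
Y = 0.5 * X + rng.normal(size=n)  # Y regressed on X only

def partial_corr(a, b, c):
    """Sample partial correlation of a and b given c (residual method)."""
    ra = a - np.polyval(np.polyfit(c, a, 1), c)
    rb = b - np.polyval(np.polyfit(c, b, 1), c)
    return np.corrcoef(ra, rb)[0, 1]

# Edge present in the marginal Y,Z picture: clearly correlated.
print(abs(np.corrcoef(Y, Z)[0, 1]) > 0.1)   # → True
# Missing Y-Z edge: conditioning on X removes the association.
print(abs(partial_corr(Y, Z, X)) < 0.01)    # → True
```

This is exactly the kind of pathway reading (Z influences Y only through X) that makes regression graphs attractive for tracing development over time.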
Error AMP Chain Graphs, 2013
Abstract
Any regular Gaussian probability distribution that can be represented by an AMP chain graph (CG) can be expressed as a system of linear equations with correlated errors whose structure depends on the CG. However, the CG represents the errors implicitly, as no nodes in the CG correspond to the errors. We propose in this paper to add some deterministic nodes to the CG in order to represent the errors explicitly. We call the result an EAMP CG. We will show that, as desired, every AMP CG is Markov equivalent to its corresponding EAMP CG under marginalization of the error nodes. We will also show that every EAMP CG under marginalization of the error nodes is Markov equivalent to some LWF CG under marginalization of the error nodes, and that the latter is Markov equivalent to some directed acyclic graph (DAG) under marginalization of the error nodes and conditioning on some selection nodes. This is important because it implies that the independence model represented by an AMP CG can be accounted for by some data generating process that is partially observed and has selection bias. Finally, we will show that EAMP CGs are closed under marginalization. This is a desirable feature because it guarantees parsimonious models under marginalization.
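The "system of linear equations with correlated errors" view can be made concrete on a toy AMP CG (our own example, not one from the paper): X -> Y1 with an undirected edge Y1 - Y2. Under the AMP interpretation this graph encodes the marginal independence of X and Y2, while conditioning on Y1 induces a dependence, which is the flavor of selection effect mentioned in the abstract:

```python
import numpy as np

# Toy AMP CG: X -> Y1, Y1 - Y2. As a linear system with correlated errors:
#   Y1 = a*X + E1,   Y2 = E2,   with Cov(E1, E2) = r != 0.
a, r = 0.7, 0.5

# Implied covariance of (X, Y1, Y2), computed by hand from the equations:
# Var(X)=1, Cov(X,Y1)=a, Cov(X,Y2)=0, Var(Y1)=a^2+1, Cov(Y1,Y2)=r, Var(Y2)=1.
Sigma = np.array([[1.0, a,         0.0],
                  [a,   a*a + 1.0, r  ],
                  [0.0, r,         1.0]])

K = np.linalg.inv(Sigma)  # precision matrix of (X, Y1, Y2)

# AMP reading: X and Y2 are marginally independent ...
print(abs(Sigma[0, 2]) < 1e-12)  # → True
# ... but become dependent once Y1 is conditioned on (nonzero precision entry).
print(abs(K[0, 2]) > 1e-6)       # → True
```

Making the errors E1, E2 explicit as extra nodes, as the EAMP construction does, turns this implicit correlation into visible graph structure that can then be marginalized over.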