Results 1–10 of 37
Chain graph models of multivariate regression
, 906
"... type for categorical data ..."
Probability distributions with summary graph structure
, 2008
Abstract

Cited by 13 (2 self)
A joint density of many variables may satisfy a possibly large set of independence statements, called its independence structure. Often the structure of interest is representable by a graph that consists of nodes representing variables and of edges that couple node pairs. We consider joint densities of this type, generated by a stepwise process in which all variables and dependences of interest are included. Otherwise, there are no constraints on the type of variables or on the form of the generating conditional densities. For the joint density that then results after marginalising and conditioning, we derive what we name the summary graph. It is seen to capture precisely the independence structure implied by the generating process; it identifies dependences which remain undistorted by direct or indirect confounding, and it alerts to possibly severe distortions in other parametrizations. Summary graphs preserve their form after marginalising and conditioning, and they include multivariate regression chain graphs as special cases. We use operators for matrix representations of graphs to derive matrix results and translate these into special types of path.
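The abstract above mentions deriving path results from matrix representations of graphs. As a generic, minimal illustration of that idea (not the paper's specific operators): powers of a binary adjacency matrix count directed walks of a given length.

```python
import numpy as np

# Adjacency matrix of the three-node directed graph 0 -> 1 -> 2.
# Entry (i, j) of A**k counts directed walks of length k from i to j.
A = np.array([[0, 1, 0],
              [0, 0, 1],
              [0, 0, 0]])

A2 = np.linalg.matrix_power(A, 2)
print(A2[0, 2])  # 1: the single walk 0 -> 1 -> 2
```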
Marginal loglinear parameters for graphical Markov models. arXiv preprint arXiv:1105.6075, 2011
Abstract

Cited by 11 (3 self)
Marginal loglinear (MLL) models provide a flexible approach to multivariate discrete data. MLL parametrizations under linear constraints induce a wide variety of models, including models defined by conditional independences. We introduce a subclass of MLL models which correspond to Acyclic Directed Mixed Graphs (ADMGs) under the usual global Markov property. We characterize precisely the graphs for which the resulting parametrization is variation independent. The MLL approach provides the first description of ADMG models in terms of a minimal list of constraints. The parametrization is also easily adapted to sparse modelling techniques, which we illustrate using several examples of real data.
Sequences of regressions and their independences
 TEST
Abstract

Cited by 11 (3 self)
Ordered sequences of univariate or multivariate regressions provide statistical models for analysing data from randomized, possibly sequential interventions, from cohort or multi-wave panel studies, but also from cross-sectional or retrospective studies. Conditional independences are captured by what we name regression graphs, provided the generated distribution shares some properties with a joint Gaussian distribution. Regression graphs extend purely directed, acyclic graphs by two types of undirected graph, one type for components of joint responses and the other for components of the context vector variable. We review the special features and the history of regression graphs, derive criteria to read off all independences implied by a regression graph, and prove criteria for Markov equivalence, that is, for judging whether two different graphs imply the same set of independence statements. Knowledge of Markov equivalence provides alternative interpretations of a given sequence of regressions, is essential for machine learning strategies, and permits the use of the simple graphical criteria of regression graphs on graphs for which the corresponding criteria are in general more complex. Under the known conditions under which a Markov-equivalent directed acyclic graph exists for any given regression graph, we give a polynomial-time algorithm to find one such graph.
A factorization criterion for acyclic directed mixed graphs
 In UAI-09
, 2009
Abstract

Cited by 9 (3 self)
Acyclic directed mixed graphs, also known as semi-Markov models, represent the conditional independence structure induced on an observed margin by a DAG model with latent variables. In this paper we present a factorization criterion for these models that is equivalent to the global Markov property given by (the natural extension of) d-separation.
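The criterion above is stated relative to d-separation. As a rough, self-contained sketch of the classical moralization test for d-separation in a DAG (an illustration of the background concept, not the paper's factorization criterion; the example graph and variable names are made up):

```python
from collections import deque

def d_separated(parents, x, y, z):
    """Test whether x and y are d-separated given the set z in a DAG.

    Classical moralization criterion: restrict to the ancestral subgraph
    of {x, y} | z, marry co-parents, drop directions, and test ordinary
    undirected separation.  `parents` maps each node to its parent set.
    """
    # 1. Ancestral closure of the query variables.
    relevant, stack = set(), [x, y, *z]
    while stack:
        n = stack.pop()
        if n not in relevant:
            relevant.add(n)
            stack.extend(parents.get(n, ()))
    # 2. Moralize: link parent-child pairs and marry co-parents.
    adj = {n: set() for n in relevant}
    for child in relevant:
        ps = [p for p in parents.get(child, ()) if p in relevant]
        for p in ps:
            adj[child].add(p); adj[p].add(child)
        for i, p in enumerate(ps):
            for q in ps[i + 1:]:
                adj[p].add(q); adj[q].add(p)
    # 3. Ordinary separation: can y be reached from x avoiding z?
    seen, queue = {x}, deque([x])
    while queue:
        n = queue.popleft()
        for m in adj[n]:
            if m == y:
                return False
            if m not in seen and m not in z:
                seen.add(m); queue.append(m)
    return True

# Collider x -> c <- y: marginally independent, dependent given c.
collider = {"c": {"x", "y"}}
print(d_separated(collider, "x", "y", set()))   # True
print(d_separated(collider, "x", "y", {"c"}))   # False
```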
Markov properties for mixed graphs
, 2014
Abstract

Cited by 9 (1 self)
In this paper, we unify the Markov theory of a variety of different types of graphs used in graphical Markov models by introducing the class of loopless mixed graphs, and show that all independence models induced by m-separation on such graphs are compositional graphoids. We focus in particular on the subclass of ribbonless graphs, which includes as special cases undirected graphs, bidirected graphs, and directed acyclic graphs, as well as ancestral graphs and summary graphs. We define maximality of such graphs as well as a pairwise and a global Markov property. We prove that the global and pairwise Markov properties of a maximal ribbonless graph are equivalent for any independence model that is a compositional graphoid.
Traceable Regressions
 INTERNATIONAL STATISTICAL REVIEW (2012), 80, 3, 415–438
Abstract

Cited by 7 (5 self)
In this paper, we define and study the concept of traceable regressions and apply it to some examples. Traceable regressions are sequences of conditional distributions in joint or single responses for which a corresponding graph captures an independence structure and represents, in addition, conditional dependences that permit the tracing of pathways of dependence. We give the properties needed for transforming these graphs and graphical criteria to decide whether a path in the graph induces a dependence. The much stronger constraints on distributions that are faithful to a graph are compared to those needed for traceable regressions.
An Inclusion Optimal Algorithm for Chain Graph Structure Learning
 In Proceedings of the 17th International Conference on Artificial Intelligence and Statistics
, 2014
Abstract

Cited by 5 (5 self)
This paper presents and proves an extension of Meek’s conjecture to chain graphs under the Lauritzen-Wermuth-Frydenberg interpretation. The proof of the conjecture leads to the development of a structure learning algorithm that finds an inclusion optimal chain graph for any given probability distribution satisfying the composition property. Finally, the new algorithm is experimentally evaluated.
Mixed Cumulative Distribution Networks
Abstract

Cited by 5 (1 self)
Directed acyclic graphs (DAGs) are a popular framework for expressing multivariate probability distributions. Acyclic directed mixed graphs (ADMGs) are generalizations of DAGs that can succinctly capture much richer sets of conditional independencies, and are especially useful for modeling the effects of latent variables implicitly. Unfortunately, there are currently no parameterizations of general ADMGs. In this paper, we apply recent work on cumulative distribution networks and copulas to propose one general construction for ADMG models. We consider a simple parameter estimation approach and report some encouraging experimental results. Reading off independence constraints from an ADMG can be done with a procedure essentially identical to d-separation (Pearl, 1988; Richardson and Spirtes, 2002). Given a graphical structure, the challenge is to provide a procedure to parameterize models that correspond to the independence constraints of the graph. For example, bidirected edges correspond to some hidden common parent that has been marginalized; in the Gaussian case, this has an easy interpretation as constraints on the marginal covariance matrix of the remaining variables.
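The abstract above notes that, in the Gaussian case, a marginalized hidden common parent leaves its trace in the marginal covariance of its children. A minimal simulation of that fact (the coefficients `a`, `b` and the sample size are illustrative, not from the paper):

```python
import numpy as np

# Latent common parent U of X and Y, then U is marginalized out:
#     U ~ N(0, 1),  X = a*U + e_x,  Y = b*U + e_y,  e_x, e_y ~ N(0, 1).
# Marginally, Cov(X, Y) = a*b even though X and Y share no observed parent,
# which is what a bidirected edge X <-> Y encodes.
rng = np.random.default_rng(0)
a, b, n = 1.5, -2.0, 200_000
u = rng.standard_normal(n)
x = a * u + rng.standard_normal(n)
y = b * u + rng.standard_normal(n)
print(np.cov(x, y)[0, 1])  # close to a*b = -3.0
```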
Chain Graph Interpretations and their Relations
Abstract

Cited by 4 (3 self)
This paper deals with different chain graph interpretations and the relations between them in terms of representable independence models. Specifically, we study the Lauritzen-Wermuth-Frydenberg, Andersson-Madigan-Perlman and multivariate regression interpretations and present the necessary and sufficient conditions for when a chain graph of one interpretation can be perfectly translated into a chain graph of another interpretation. Moreover, we also present a feasible split for the Andersson-Madigan-Perlman interpretation with features similar to the feasible splits presented for the other two interpretations.