Results 1–10 of 18
Probability distributions with summary graph structure, 2008
Abstract

Cited by 13 (2 self)
A joint density of many variables may satisfy a possibly large set of independence statements, called its independence structure. Often the structure of interest is representable by a graph that consists of nodes representing variables and of edges that couple node pairs. We consider joint densities of this type, generated by a stepwise process in which all variables and dependences of interest are included. Otherwise, there are no constraints on the type of variables or on the form of the generating conditional densities. For the joint density that then results after marginalising and conditioning, we derive what we name the summary graph. It is seen to capture precisely the independence structure implied by the generating process, it identifies dependences which remain undistorted by direct or indirect confounding, and it alerts to possibly severe distortions in other parametrizations. Summary graphs preserve their form after marginalising and conditioning, and they include multivariate regression chain graphs as special cases. We use operators for matrix representations of graphs to derive matrix results and translate these into special types of path.
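The matrix view of graphs that the abstract alludes to can be illustrated with the most elementary matrix-path correspondence: powers of an adjacency matrix count walks. This is a generic sketch of that correspondence only, not the paper's specific edge-matrix operators:

```python
def walks(adj, k):
    """Return the matrix whose (i, j) entry counts walks of length k
    from node i to node j, i.e. the k-th power of the adjacency
    matrix -- the basic link between matrix operations on graphs
    and statements about paths."""
    n = len(adj)
    # start from the identity matrix (walks of length 0)
    result = [[1 if i == j else 0 for j in range(n)] for i in range(n)]
    for _ in range(k):
        result = [[sum(result[i][m] * adj[m][j] for m in range(n))
                   for j in range(n)] for i in range(n)]
    return result

# path graph 0 - 1 - 2: node 0 reaches node 2 only by a walk of length 2
A = [[0, 1, 0],
     [1, 0, 1],
     [0, 1, 0]]
```

For the path graph above, `walks(A, 1)[0][2]` is 0 while `walks(A, 2)[0][2]` is 1, mirroring how path statements can be read from matrix products.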
Marginal log-linear parameters for graphical Markov models
arXiv preprint arXiv:1105.6075, 2011
Abstract

Cited by 11 (3 self)
Marginal log-linear (MLL) models provide a flexible approach to multivariate discrete data. MLL parametrizations under linear constraints induce a wide variety of models, including models defined by conditional independences. We introduce a subclass of MLL models which correspond to acyclic directed mixed graphs (ADMGs) under the usual global Markov property. We characterize precisely those graphs for which the resulting parametrization is variation independent. The MLL approach provides the first description of ADMG models in terms of a minimal list of constraints. The parametrization is also easily adapted to sparse modelling techniques, which we illustrate using several examples of real data.
Sequences of regressions and their independences
 TEST
Abstract

Cited by 11 (3 self)
Ordered sequences of univariate or multivariate regressions provide statistical models for analysing data from randomized, possibly sequential interventions, from cohort or multi-wave panel studies, but also from cross-sectional or retrospective studies. Conditional independences are captured by what we name regression graphs, provided the generated distribution shares some properties with a joint Gaussian distribution. Regression graphs extend purely directed, acyclic graphs by two types of undirected graph, one type for components of joint responses and the other for components of the context vector variable. We review the special features and the history of regression graphs, derive criteria to read all implied independences off a regression graph, and prove criteria for Markov equivalence, that is, for judging whether two different graphs imply the same set of independence statements. Knowledge of Markov equivalence provides alternative interpretations of a given sequence of regressions, is essential for machine-learning strategies, and permits the use of the simple graphical criteria of regression graphs on graphs for which the corresponding criteria are in general more complex. Under the known conditions under which a Markov-equivalent directed acyclic graph exists for a given regression graph, we give a polynomial-time algorithm to find one such graph.
Triangular systems for symmetric binary variables
Electr. J. Statist., 2009
Abstract

Cited by 10 (8 self)
We introduce and study distributions of sets of binary variables that are symmetric, that is, each has equally probable levels. The joint distribution of these special types of binary variables, if generated by a recursive process of linear main effects, is essentially parametrized in terms of marginal correlations. This contrasts with the log-linear formulation of joint probabilities, in which parameters measure conditional associations given all remaining variables. The new formulation permits useful comparisons of different types of graphical Markov models and leads to a close approximation of Gaussian orthant probabilities.
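As a small numerical illustration of the kind of generating process the abstract describes (a sketch with hypothetical values, not the paper's parametrization): take X1 uniform on {-1, +1} and generate X2 with a linear main effect E[X2 | X1] = ρ·X1. Then X2 is again symmetric and corr(X1, X2) = ρ, so the dependence is indeed carried by a marginal correlation:

```python
import random

def simulate_pair(rho, n, seed=1):
    # X1 uniform on {-1, +1}; X2 generated so that
    # P(X2 = +1 | X1 = x) = (1 + rho * x) / 2,
    # i.e. a linear main effect E[X2 | X1] = rho * X1.
    rng = random.Random(seed)
    pairs = []
    for _ in range(n):
        x1 = 1 if rng.random() < 0.5 else -1
        x2 = 1 if rng.random() < (1 + rho * x1) / 2 else -1
        pairs.append((x1, x2))
    return pairs

def correlation(pairs):
    # plain sample correlation of the two coordinates
    n = len(pairs)
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    cov = sum((x - mx) * (y - my) for x, y in pairs) / n
    vx = sum((x - mx) ** 2 for x, _ in pairs) / n
    vy = sum((y - my) ** 2 for _, y in pairs) / n
    return cov / (vx * vy) ** 0.5
```

With rho = 0.6 and a large sample, the empirical correlation settles near 0.6 while both margins stay close to symmetric.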
Traceable Regressions
International Statistical Review (2012), 80, 3, 415–438
Abstract

Cited by 7 (5 self)
In this paper, we define and study the concept of traceable regressions and apply it to some examples. Traceable regressions are sequences of conditional distributions in joint or single responses for which a corresponding graph captures an independence structure and represents, in addition, conditional dependences that permit the tracing of pathways of dependence. We give the properties needed for transforming these graphs and graphical criteria to decide whether a path in the graph induces a dependence. The much stronger constraints on distributions that are faithful to a graph are compared to those needed for traceable regressions.
Log-mean linear models for binary data
 Biometrika
, 2013
Abstract

Cited by 3 (2 self)
This paper is devoted to the theory and application of a novel class of models for binary data, which we call log-mean linear (LML) models. The characterizing feature of these models is that they are specified by linear constraints on the LML parameter, defined as a log-linear expansion of the mean parameter of the multivariate Bernoulli distribution. We show that marginal independence relationships between variables can be specified by setting certain LML interactions to zero and, more specifically, that graphical models of marginal independence are LML models. LML models are code-dependent, in the sense that they are not invariant with respect to relabelling of variable values. As a consequence, they allow us to specify submodels defined by code-specific independencies, which are independencies in subpopulations of interest. This special feature of LML models has useful applications. Firstly, it provides a flexible way to specify parsimonious submodels of marginal independence models. The main advantage of this approach concerns the interpretation of the submodel, which is fully characterized by independence relationships, either marginal or code-specific. Secondly, the code-specific nature of these models can be exploited to focus on a fixed, arbitrary cell of the probability table and on the corresponding subpopulation. This leads to an innovative family of models, which we call pivotal code-specific LML models, that is especially useful when the interest of researchers is focused on a small subpopulation obtained by stratifying individuals according to some features. The application of LML models is illustrated on two datasets, one of which concerns the use of pivotal code-specific LML models in the field of personalized medicine.
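The defining transform, a log-linear (Möbius-type) expansion of the Bernoulli mean parameters μ_D = P(X_d = 1 for all d in D), is compact enough to sketch. In this illustration (hypothetical numbers, two variables only), the LML interaction of a pair vanishes exactly when the two variables are marginally independent, matching the abstract's claim that marginal independences correspond to zero LML interactions:

```python
from itertools import combinations
from math import log

def lml_parameters(mu):
    # mu maps each frozenset D of variable labels to the mean
    # parameter mu_D = P(X_d = 1 for all d in D), with mu_{} = 1.
    # The LML parameter is the Moebius inversion of log mu:
    #   gamma_D = sum over E subset of D of (-1)^(|D|-|E|) * log mu_E
    gamma = {}
    for D in mu:
        items = sorted(D)
        gamma[D] = sum((-1) ** (len(D) - k) * log(mu[frozenset(E)])
                       for k in range(len(items) + 1)
                       for E in combinations(items, k))
    return gamma

# two marginally independent binaries: mu_12 = mu_1 * mu_2
p, q = 0.3, 0.6
mu = {frozenset(): 1.0, frozenset({1}): p,
      frozenset({2}): q, frozenset({1, 2}): p * q}
```

Here gamma_{1,2} = log(pq) - log p - log q = 0, so setting that interaction to zero is the same as imposing the marginal independence.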
GRAPHICAL MARKOV MODELS
Abstract

Cited by 1 (0 self)
Graphical Markov models are multivariate statistical models which are currently under vigorous development and which combine two simple but powerful notions: generating processes in single and joint response variables, and conditional independences captured by graphs. The development of graphical Markov models started with work by Wermuth (1976, 1980) and Darroch, Lauritzen and Speed (1980), which built on early results from the 1920s and 1930s by the geneticist Sewall Wright and the probabilist Andrej Markov, as well as on results for log-linear models by Birch (1963), Goodman (1970), Bishop, Fienberg and Holland (1973) and for covariance selection by Dempster (1972). Wright used graphs, in which nodes represent variables and arrows indicate linear dependence, to describe hypotheses about stepwise processes in single responses that could have generated his data. He developed a method, called path analysis, to estimate linear dependences and to judge whether the hypotheses are compatible with his data, which he summarized in terms of simple and partial correlations. With this approach he was far ahead of his time, since corresponding ...
Graphical Markov models: overview, 2015
Abstract

Cited by 1 (1 self)
We describe how graphical Markov models emerged in the last 40 years, based on three essential concepts that had been developed independently more than a century ago. Sequences of joint or single regressions and their regression graphs are singled out as being the subclass that is best suited for analysing longitudinal data and for tracing developmental pathways, both in observational and in intervention studies. Interpretations are illustrated using two sets of data. Furthermore, some of the more recent, important results for sequences of regressions are summarized. 1. Some general and historical remarks on the types of model. Graphical models aim to describe in concise form the possibly complex interrelations between a set of variables, so that key properties can be read directly off a graph. The central idea is that each variable is represented by a node in a graph. Any pair of nodes may become coupled, that is, joined by an edge. Coupled nodes are also said to be adjacent. For many types of graph, a missing edge represents some form of conditional independence between the pair of variables, and an edge present can be interpreted as a corresponding conditional dependence. Because the conditioning set may be empty,
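The reading rule in the last sentences, a missing edge standing for a conditional independence, can be made concrete for the simplest case: an undirected graph under the global Markov property, where X ⊥ Y | Z holds whenever every path between X and Y meets Z. A minimal sketch (generic illustration, not tied to any one paper above):

```python
from collections import deque

def separated(adj, x, y, given):
    """True if every path from x to y in the undirected graph `adj`
    passes through the conditioning set `given` -- i.e. the graph
    encodes the conditional independence x _||_ y | given under the
    global Markov property for undirected graphs."""
    seen = {x}
    queue = deque([x])
    while queue:
        node = queue.popleft()
        if node == y:
            return False  # found a path avoiding `given`
        for neighbour in adj[node]:
            if neighbour not in seen and neighbour not in given:
                seen.add(neighbour)
                queue.append(neighbour)
    return True

# chain a - b - c: the missing edge between a and c encodes
# a _||_ c | b, while a and c are in general marginally dependent
chain = {"a": ["b"], "b": ["a", "c"], "c": ["b"]}
```

For the chain, `separated(chain, "a", "c", {"b"})` holds but `separated(chain, "a", "c", set())` does not: conditioning on the empty set leaves the path a - b - c open.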