Global identifiability of linear structural equation models
Ann. Statist., 2011
Marginal loglinear parameters for graphical Markov models
arXiv preprint arXiv:1105.6075, 2011
"... Marginal loglinear (MLL) models provide a flexible approach to multivariate discrete data. MLL parametrizations under linear constraints induce a wide variety of models, including models defined by conditional independences. We introduce a subclass of MLL models which correspond to Acyclic Direct ..."
Abstract

Marginal loglinear (MLL) models provide a flexible approach to multivariate discrete data. MLL parametrizations under linear constraints induce a wide variety of models, including models defined by conditional independences. We introduce a subclass of MLL models which correspond to Acyclic Directed Mixed Graphs (ADMGs) under the usual global Markov property. We characterize for precisely which graphs the resulting parametrization is variation independent. The MLL approach provides the first description of ADMG models in terms of a minimal list of constraints. The parametrization is also easily adapted to sparse modelling techniques, which we illustrate using several examples of real data.
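As a small numerical illustration of the idea that linear constraints on MLL parameters induce independence models, consider a single 2x2 table: the main effects come from one-way margins and the interaction from the joint, and the constraint "interaction = 0" carves out the independence model. The tables and the effect-coding normalization below are hypothetical choices for illustration, not taken from the paper.

```python
import numpy as np

def mll_params(p):
    """Marginal log-linear parameters for a 2x2 table p[a, b].
    The 1/2 and 1/4 scalings are one common effect-coding convention."""
    pa, pb = p.sum(axis=1), p.sum(axis=0)
    lam_a = 0.5 * np.log(pa[1] / pa[0])      # main effect of A, from margin P(A)
    lam_b = 0.5 * np.log(pb[1] / pb[0])      # main effect of B, from margin P(B)
    # Interaction: quarter of the log odds ratio, from the full joint table.
    lam_ab = 0.25 * np.log(p[1, 1] * p[0, 0] / (p[1, 0] * p[0, 1]))
    return lam_a, lam_b, lam_ab

# The linear constraint lam_ab = 0 defines the independence model A _||_ B.
dep = np.array([[0.3, 0.2], [0.1, 0.4]])     # dependent: odds ratio != 1
ind = np.outer([0.5, 0.5], [0.3, 0.7])       # product table: independent
print(mll_params(dep)[2])                    # nonzero interaction
print(abs(mll_params(ind)[2]) < 1e-12)       # True: interaction vanishes
```

The ADMG models in the paper impose such zero constraints on MLL parameters computed over many different margins at once; the single-table case above only shows the basic mechanism.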
Markov properties for mixed graphs
2014
Abstract

In this paper, we unify the Markov theory of a variety of different types of graphs used in graphical Markov models by introducing the class of loopless mixed graphs, and show that all independence models induced by m-separation on such graphs are compositional graphoids. We focus in particular on the subclass of ribbonless graphs, which includes as special cases undirected graphs, bidirected graphs, and directed acyclic graphs, as well as ancestral graphs and summary graphs. We define maximality of such graphs as well as a pairwise and a global Markov property. We prove that the global and pairwise Markov properties of a maximal ribbonless graph are equivalent for any independence model that is a compositional graphoid.
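On a graph with only directed edges (a DAG), the m-separation criterion discussed in this abstract reduces to classical d-separation, which can be checked with the standard moralization construction. A minimal sketch, assuming a `parents`-dict encoding of the DAG (the encoding and function names are illustrative, not from the paper):

```python
from itertools import combinations

def ancestors(parents, nodes):
    """All nodes in `nodes` plus their ancestors in the DAG."""
    seen, stack = set(nodes), list(nodes)
    while stack:
        for p in parents.get(stack.pop(), ()):
            if p not in seen:
                seen.add(p)
                stack.append(p)
    return seen

def d_separated(parents, x, y, z):
    """True iff x and y are d-separated given the set z in the DAG
    described by `parents` (the DAG special case of m-separation)."""
    relevant = ancestors(parents, {x, y} | set(z))
    # Moralize the ancestral subgraph: connect each node to its parents
    # and marry co-parents of a common child.
    adj = {v: set() for v in relevant}
    for v in relevant:
        ps = [p for p in parents.get(v, ()) if p in relevant]
        for p in ps:
            adj[v].add(p)
            adj[p].add(v)
        for a, b in combinations(ps, 2):
            adj[a].add(b)
            adj[b].add(a)
    # d-separated iff no path from x to y avoids the conditioning set z.
    seen, stack = {x}, [x]
    while stack:
        v = stack.pop()
        if v == y:
            return False
        for w in adj[v]:
            if w not in seen and w not in z:
                seen.add(w)
                stack.append(w)
    return True

# Collider 1 -> 3 <- 2: marginally separated, dependent given the collider.
par = {3: (1, 2)}
print(d_separated(par, 1, 2, set()))  # True
print(d_separated(par, 1, 2, {3}))    # False
```

The full m-separation criterion additionally handles bidirected and undirected edges; the moralization step above covers only the directed case.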
Half-trek criterion for generic identifiability of linear structural equation models
Ann. Statist., to appear, 2011
Abstract

A linear structural equation model relates random variables of interest and corresponding Gaussian noise terms via a linear equation system. Each such model can be represented by a mixed graph in which directed edges encode the linear equations, and bidirected edges indicate possible correlations among noise terms. We study parameter identifiability in these models, that is, we ask for conditions that ensure that the edge coefficients and correlations appearing in a linear structural equation model can be uniquely recovered from the covariance matrix of the associated normal distribution. We treat the case of generic identifiability, where unique recovery is possible for almost every choice of parameters. We give a new graphical criterion that is sufficient for generic identifiability. It improves criteria from prior work and does not require the directed part of the graph to be acyclic. We also develop a related necessary condition and examine the “gap” between sufficient and necessary conditions through simulations as well as exhaustive algebraic computations for graphs with up to five nodes.
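The identifiability question in this abstract can be made concrete on a toy example: build the covariance matrix implied by a linear SEM and recover the edge coefficients. The three-node path and all numbers below are hypothetical, and this sketch illustrates the identifiability problem itself, not the half-trek criterion.

```python
import numpy as np

# Directed path 1 -> 2 -> 3, no bidirected edges (hypothetical toy graph).
# lam[i, j] is the coefficient on edge i -> j in X = Lambda^T X + eps.
lam = np.array([[0.0,  1.5,  0.0],
                [0.0,  0.0, -0.7],
                [0.0,  0.0,  0.0]])
omega = np.diag([1.0, 2.0, 0.5])   # independent noise variances

I = np.eye(3)
# Implied covariance: Sigma = (I - Lambda)^{-T} Omega (I - Lambda)^{-1}
sigma = np.linalg.inv(I - lam).T @ omega @ np.linalg.inv(I - lam)

# With an acyclic graph and independent errors, each coefficient is a
# simple regression coefficient, so the model is identifiable:
lam12 = sigma[0, 1] / sigma[0, 0]
lam23 = sigma[1, 2] / sigma[1, 1]
print(lam12, lam23)  # recovers 1.5 and -0.7 (up to float error)
```

Adding bidirected edges (nonzero off-diagonal entries of `omega`) is what makes recovery from `sigma` nontrivial and motivates graphical criteria such as the one in this paper.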
Changing parameters by partial mappings
Statistica Sinica, 2010
Abstract

Changes between different sets of parameters are often needed in multivariate statistical modeling, such as transformations within linear regression or in exponential models. There may, for instance, be specific inference questions based on subject-matter interpretations, alternative well-fitting constrained models, compatibility judgements of seemingly distinct constrained models, or different reference priors under alternative parameterizations. We introduce and discuss a partial mapping, called partial replication, and relate it to a more complex mapping, called partial inversion. Both operations are used to decompose matrix operations, to explain recursion relations among sets of linear parameters, to change between different types of linear models, to approximate maximum-likelihood estimates in exponential family models under independence constraints, and to switch partially between sets of canonical and moment parameters in exponential family distributions or between sets of corresponding maximum-likelihood estimates. Key words and phrases: Exponential family, independence constraints, matrix op
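Partial inversion, as discussed in this abstract, is closely related to the classical sweep operator on block matrices: the chosen block is inverted while the rest is updated by a Schur complement. A minimal sketch under that reading (the block formula and sign convention below are one common choice and may not match the paper's exact definition):

```python
import numpy as np

def partial_invert(M, a):
    """Partially invert M on the index set `a`: invert the (a, a) block
    and update the rest via the Schur complement (sweep-style operator;
    this sign convention is one common choice, not necessarily the paper's)."""
    n = M.shape[0]
    a = sorted(a)
    b = [i for i in range(n) if i not in a]
    A = M[np.ix_(a, a)]
    B = M[np.ix_(a, b)]
    C = M[np.ix_(b, a)]
    D = M[np.ix_(b, b)]
    Ai = np.linalg.inv(A)
    out = np.empty_like(M, dtype=float)
    out[np.ix_(a, a)] = Ai               # inverted block
    out[np.ix_(a, b)] = Ai @ B
    out[np.ix_(b, a)] = -C @ Ai
    out[np.ix_(b, b)] = D - C @ Ai @ B   # Schur complement
    return out

rng = np.random.default_rng(0)
M = rng.standard_normal((4, 4)) + 4 * np.eye(4)
# Applying the operator twice on the same block recovers M: it is an involution.
back = partial_invert(partial_invert(M, [0, 1]), [0, 1])
print(np.allclose(back, M))  # True
```

This involution property, and the fact that the operator composes block by block, is what makes partial inversion useful for decomposing matrix operations as the abstract describes.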
Graphical Markov models
"... Graphical Markov models are multivariate statistical models which are currently under vigorous development and which combine two simple but most powerful notions, generating processes in single and joint response variables and conditional independences captured by graphs. The development of graphica ..."
Abstract

Graphical Markov models are multivariate statistical models which are currently under vigorous development and which combine two simple but most powerful notions: generating processes in single and joint response variables, and conditional independences captured by graphs. The development of graphical Markov models started with work by Wermuth (1976, 1980) and Darroch, Lauritzen and Speed (1980), which built on early results from the 1920s and 1930s by geneticist Sewall Wright and probabilist Andrej Markov, as well as on results for loglinear models by Birch (1963), Goodman (1970), Bishop, Fienberg and Holland (1973) and for covariance selection by Dempster (1972). Wright used graphs, in which nodes represent variables and arrows indicate linear dependence, to describe hypotheses about stepwise processes in single responses that could have generated his data. He developed a method, called path analysis, to estimate linear dependences and to judge whether the hypotheses are compatible with his data, which he summarized in terms of simple and partial correlations. With this approach he was far ahead of his time, since corresponding
Segregated Graphs and Marginals of Chain Graph Models
"... Abstract Bayesian networks are a popular representation of asymmetric (for example causal) relationships between random variables. Markov random fields (MRFs) are a complementary model of symmetric relationships used in computer vision, spatial modeling, and social and gene expression networks. A c ..."
Abstract
Bayesian networks are a popular representation of asymmetric (for example causal) relationships between random variables. Markov random fields (MRFs) are a complementary model of symmetric relationships used in computer vision, spatial modeling, and social and gene expression networks. A chain graph model under the Lauritzen-Wermuth-Frydenberg interpretation (hereafter a chain graph model) generalizes both Bayesian networks and MRFs, and can represent asymmetric and symmetric relationships together. As in other graphical models, the set of marginals from distributions in a chain graph model induced by the presence of hidden variables forms a complex model. One recent approach to the study of marginal graphical models is to consider a well-behaved supermodel. Such a supermodel of marginals of Bayesian networks, defined only by conditional independences, and termed the ordinary Markov model, was studied at length in