Results 11–20 of 37
Changing parameters by partial mappings
Statistica Sinica, 2010
Cited by 4 (2 self)
Abstract: Changes between different sets of parameters are often needed in multivariate statistical modeling, such as transformations within linear regression or in exponential models. There may, for instance, be specific inference questions based on subject matter interpretations, alternative well-fitting constrained models, compatibility judgements of seemingly distinct constrained models, or different reference priors under alternative parameterizations. We introduce and discuss a partial mapping, called partial replication, and relate it to a more complex mapping, called partial inversion. Both operations are used to decompose matrix operations, to explain recursion relations among sets of linear parameters, to change between different types of linear models, to approximate maximum-likelihood estimates in exponential family models under independence constraints, and to switch partially between sets of canonical and moment parameters in exponential family distributions or between sets of corresponding maximum-likelihood estimates. Key words and phrases: Exponential family, independence constraints, matrix op
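As a rough illustration of the partial inversion operator mentioned in the abstract above: sign conventions for this operator vary in the literature, so the convention and the function name below are assumptions for illustration, not taken from the paper. A minimal NumPy sketch:

```python
import numpy as np

def partial_invert(M, a):
    """Partially invert the square matrix M with respect to the index set `a`.

    Assumed sign convention: for M partitioned by a and its complement b as
    [[A, B], [C, D]], the result is
    [[A^{-1}, A^{-1} B], [-C A^{-1}, D - C A^{-1} B]],
    so that applying the operator twice on the same set recovers M.
    """
    n = M.shape[0]
    a = np.asarray(a)
    b = np.setdiff1d(np.arange(n), a)
    A = M[np.ix_(a, a)]
    B = M[np.ix_(a, b)]
    C = M[np.ix_(b, a)]
    D = M[np.ix_(b, b)]
    Ainv = np.linalg.inv(A)
    out = np.empty_like(M, dtype=float)
    out[np.ix_(a, a)] = Ainv
    out[np.ix_(a, b)] = Ainv @ B
    out[np.ix_(b, a)] = -C @ Ainv
    out[np.ix_(b, b)] = D - C @ Ainv @ B
    return out
```

Under this convention the operator is an involution on a fixed index set, and partially inverting on all indices coincides with the full matrix inverse.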
Log-mean linear models for binary data
Biometrika, 2013
Cited by 3 (2 self)
This paper is devoted to the theory and application of a novel class of models for binary data, which we call log-mean linear (LML) models. The characterizing feature of these models is that they are specified by linear constraints on the LML parameter, defined as a log-linear expansion of the mean parameter of the multivariate Bernoulli distribution. We show that marginal independence relationships between variables can be specified by setting certain LML interactions to zero and, more specifically, that graphical models of marginal independence are LML models. LML models are code-dependent, in the sense that they are not invariant with respect to relabelling of variable values. As a consequence, they allow us to specify submodels defined by code-specific independencies, which are independencies in subpopulations of interest. This special feature of LML models has useful applications. Firstly, it provides a flexible way to specify parsimonious submodels of marginal independence models. The main advantage of this approach concerns the interpretation of the submodel, which is fully characterized by independence relationships, either marginal or code-specific. Secondly, the code-specific nature of these models can be exploited to focus on a fixed, arbitrary, cell of the probability table and on the corresponding subpopulation. This leads to an innovative family of models, which we call pivotal code-specific LML models, that is especially useful when the interest of researchers is focused on a small subpopulation obtained by stratifying individuals according to some features. The application of LML models is illustrated on two datasets, one of which concerns the use of pivotal code-specific LML models in the field of personalized medicine.
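The LML parameter described above can be sketched concretely. One common way to write the log-linear expansion of the mean parameter is a Möbius (alternating-sum) transform of the log mean parameters mu(D) = P(X_j = 1 for all j in D); the representation and function below are an illustrative assumption, not code from the paper:

```python
import itertools
import numpy as np

def lml_parameter(p):
    """Compute a log-mean linear (LML) parameter of a multivariate Bernoulli
    distribution from its joint probability table `p` (a 2 x ... x 2 array,
    p[i_1, ..., i_d] = P(X_1 = i_1, ..., X_d = i_d)).

    For each subset D of variables, mu(D) = P(X_j = 1 for all j in D), and
    gamma(D) = sum over E subset of D of (-1)^{|D| - |E|} log mu(E).
    """
    d = p.ndim
    sets = [frozenset(s) for r in range(d + 1)
            for s in itertools.combinations(range(d), r)]
    mu = {}
    for D in sets:
        idx = tuple(1 if j in D else slice(None) for j in range(d))
        mu[D] = p[idx].sum()   # P(X_D = 1); mu(empty set) = 1
    gamma = {}
    for D in sets:
        gamma[D] = sum((-1) ** (len(D) - len(E)) * np.log(mu[E])
                       for E in sets if E <= D)
    return gamma
```

For a product (marginally independent) distribution, all interactions of order two and higher vanish, matching the abstract's point that marginal independencies correspond to zero LML interactions.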
Iterative conditional fitting for discrete chain graph models
In COMPSTAT 2008 – Proceedings in Computational Statistics, 93–104, 2008
Cited by 2 (1 self)
Abstract. ‘Iterative conditional fitting’ is a recently proposed algorithm that can be used for maximization of the likelihood function in marginal independence models for categorical data. This paper describes a modification of this algorithm, which allows one to compute maximum likelihood estimates in a class of chain graph models for categorical data. The considered discrete chain graph models are defined using conditional independence relations arising in recursive multivariate regressions with correlated errors. This Markov interpretation of the chain graph is consistent with treating the graph as a path diagram and differs from other interpretations known as the LWF and AMP Markov properties.
Marginal AMP Chain Graphs
International Journal of Approximate Reasoning, 2014
Cited by 2 (2 self)
We present a new family of models based on graphs that may have undirected, directed and bidirected edges. We name these new models marginal AMP (MAMP) chain graphs because each of them is Markov equivalent to some AMP chain graph under marginalization of some of its nodes. However, MAMP chain graphs subsume not only AMP chain graphs but also multivariate regression chain graphs. We describe global and pairwise Markov properties for MAMP chain graphs and prove their equivalence for compositional graphoids. We also characterize when two MAMP chain graphs are Markov equivalent. For Gaussian probability distributions, we show that every MAMP chain graph is Markov equivalent to some directed acyclic graph with deterministic nodes under marginalization and conditioning on some of its nodes. This is important because it implies that the independence model represented by a MAMP chain graph can be accounted for by some data-generating process that is partially observed and has selection bias. Finally, we modify MAMP chain graphs so that they are closed under marginalization for Gaussian probability distributions. This is a desirable feature because it guarantees parsimonious models under marginalization.
GRAPHICAL MARKOV MODELS
Cited by 1 (0 self)
Graphical Markov models are multivariate statistical models which are currently under vigorous development and which combine two simple but most powerful notions: generating processes in single and joint response variables, and conditional independences captured by graphs. The development of graphical Markov models started with work by Wermuth (1976, 1980) and Darroch, Lauritzen and Speed (1980), which built on early results from the 1920s and 1930s by the geneticist Sewall Wright and the probabilist Andrej Markov, as well as on results for log-linear models by Birch (1963), Goodman (1970), Bishop, Fienberg and Holland (1973) and for covariance selection by Dempster (1972). Wright used graphs, in which nodes represent variables and arrows indicate linear dependence, to describe hypotheses about stepwise processes in single responses that could have generated his data. He developed a method, called path analysis, to estimate linear dependences and to judge whether the hypotheses are compatible with his data, which he summarized in terms of simple and partial correlations. With this approach he was far ahead of his time, since corresponding
Graphical Markov models: overview, 2015
Cited by 1 (1 self)
Abstract: We describe how graphical Markov models emerged in the last 40 years, based on three essential concepts that had been developed independently more than a century ago. Sequences of joint or single regressions and their regression graphs are singled out as the subclass that is best suited for analyzing longitudinal data and for tracing developmental pathways, both in observational and in intervention studies. Interpretations are illustrated using two sets of data. Furthermore, some of the more recent, important results for sequences of regressions are summarized.
1 Some general and historical remarks on the types of model
Graphical models aim to describe in concise form the possibly complex interrelations between a set of variables so that key properties can be read directly off a graph. The central idea is that each variable is represented by a node in a graph. Any pair of nodes may become coupled, that is, joined by an edge. Coupled nodes are also said to be adjacent. For many types of graph, a missing edge represents some form of conditional independence between the pair of variables and an edge present can be interpreted as a corresponding conditional dependence. Because the conditioning set may be empty,
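The idea that a missing edge encodes a conditional independence can be made concrete in the Gaussian case, where a zero entry of the concentration (inverse covariance) matrix corresponds to a missing edge in an undirected graph. A small illustrative sketch (the numbers are made up for the example):

```python
import numpy as np

# Concentration (inverse covariance) matrix of three Gaussian variables.
# The zero at (0, 1) means: no edge between nodes 0 and 1, i.e.
# X0 is independent of X1 given X2.
K = np.array([[2.0, 0.0, 0.5],
              [0.0, 2.0, 0.5],
              [0.5, 0.5, 2.0]])
Sigma = np.linalg.inv(K)

# Marginally, X0 and X1 are still correlated (via X2) ...
cov01 = Sigma[0, 1]
# ... but their partial covariance given X2 vanishes, matching K[0, 1] == 0.
partial01 = Sigma[0, 1] - Sigma[0, 2] * Sigma[1, 2] / Sigma[2, 2]
```

This is the covariance-selection picture due to Dempster (1972) that the undirected graphical models mentioned above build on.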
Dichotomization-invariant log-mean linear parameterization for discrete graphical models
, 2013
"... of marginal independence ..."
Every LWF and AMP Chain Graph Originates From a Set of Causal Models. ArXiv e-prints
, 2013
"... ar ..."