
## Two Multivariate Generalizations of Pointwise Mutual Information



### Citations

10532 | A mathematical theory of communication
- Shannon
- 1948

Citation Context: ...explore two multivariate generalizations of pointwise mutual information, and explore their usefulness and nature in the extraction of subject verb object triples. 1 Introduction Mutual information (Shannon and Weaver, 1949) is a measure of mutual dependence between two random variables. The measure – and more specifically its instantiation for specific outcomes called pointwise mutual information (PMI) – has proven to ...

1112 | Word association norms, mutual information, and lexicography
- Church, Hanks
- 1989

Citation Context: ...outcomes called pointwise mutual information (PMI) – has proven to be a useful association measure in numerous natural language processing applications. Since its introduction into the NLP community (Church and Hanks, 1990), it has been used in order to tackle or improve upon several NLP problems, including collocation extraction (ibid.) and word space models (Pantel and Lin, 2002). In its original form, it is restrict...
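The pointwise mutual information discussed in this context compares the joint probability of two specific outcomes with the product of their marginals. A minimal illustrative sketch in Python (a hypothetical helper, not code from Church and Hanks):

```python
import math

def pmi(p_xy, p_x, p_y):
    """Pointwise mutual information of two specific outcomes x and y:
    log p(x, y) / (p(x) p(y)). Positive when the pair co-occurs more
    often than chance, zero under independence, negative otherwise."""
    return math.log(p_xy / (p_x * p_y))

# Under independence p(x, y) = p(x) p(y), so PMI is 0.
```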

292 | Discovering word senses from text
- Pantel, Lin
- 2002

Citation Context: ...introduction into the NLP community (Church and Hanks, 1990), it has been used in order to tackle or improve upon several NLP problems, including collocation extraction (ibid.) and word space models (Pantel and Lin, 2002). In its original form, it is restricted to the analysis of two-way co-occurrences. NLP problems, however, need not be restricted to two-way co-occurrences; often, a particular problem can be more na...

155 | Transmission of information
- Fano
- 1949

Citation Context: ...the equation have been swapped. For the three-variable case, this gives exactly the same outcome except for a change in sign. The swap is necessary in order to ensure a proper set-theoretic measure (Fano, 1961; Reza, 1994). SI1(x, y, z) = log [p(x, y) / (p(x) p(y))] − log [p(z) p(x, y, z) / (p(x, z) p(y, z))] = log [p(x, y) p(y, z) p(x, z) / (p(x) p(y) p(z) p(x, y, z))] (8) Interaction information – as well as specific interacti...
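Equation (8) quoted in this context defines SI1 as the difference between pmi(x, y) and the conditional pmi(x, y | z), and then rewrites it in closed form. A sketch of both forms in Python (illustrative function names, not the authors' implementation), which makes it easy to check that the two sides of the equation agree:

```python
import math

def si1(p_x, p_y, p_z, p_xy, p_xz, p_yz, p_xyz):
    """SI1(x, y, z) = pmi(x, y) - pmi(x, y | z), as in equation (8)."""
    pmi_xy = math.log(p_xy / (p_x * p_y))
    pmi_xy_given_z = math.log((p_z * p_xyz) / (p_xz * p_yz))
    return pmi_xy - pmi_xy_given_z

def si1_closed(p_x, p_y, p_z, p_xy, p_xz, p_yz, p_xyz):
    """Closed form of equation (8):
    log [p(x,y) p(y,z) p(x,z)] / [p(x) p(y) p(z) p(x,y,z)]."""
    return math.log((p_xy * p_yz * p_xz) / (p_x * p_y * p_z * p_xyz))
```

Under full independence every factor cancels and both forms return 0.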

129 | Distributional Memory: A General Framework for Corpus-based Semantics
- Baroni, Lenci
- 2010

Citation Context: ...ticular problem can be more naturally tackled when formulated as a multi-way problem. Notably, the framework of tensor decomposition, that has recently permeated into the NLP community (Turney, 2007; Baroni and Lenci, 2010; Giesbrecht, 2010; Van de Cruys, 2010), analyzes language issues as multi-way co-occurrences. Up till now, little attention has been devoted to the weighting of such multi-way co-occurrences (which, fo...

116 | Multivariate Information Transmission
- McGill
- 1954

Citation Context: ...nce in the corpus. As with PMI, the value for the global case ought to be the expected value for all the instantiations of the specific measure. 3.2.1 Interaction information Interaction information (McGill, 1954) – also called co-information (Bell, 2003) – is based on the notion of conditional mutual information. Conditional mutual information is the mutual information of two random variables conditioned on ...
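Interaction information as described in this context is built from conditional mutual information: I(X;Y;Z) = I(X;Y|Z) − I(X;Y), the change in mutual information between X and Y when conditioning on Z. A self-contained sketch for a discrete joint distribution (a hypothetical helper assuming the input is a dict mapping (x, y, z) to a probability; not the authors' code):

```python
import math
from collections import defaultdict

def interaction_information(joint):
    """McGill's interaction information I(X;Y;Z) = I(X;Y|Z) - I(X;Y)
    for a discrete joint distribution {(x, y, z): probability}."""
    px = defaultdict(float); py = defaultdict(float); pz = defaultdict(float)
    pxy = defaultdict(float); pxz = defaultdict(float); pyz = defaultdict(float)
    # Accumulate one-way and two-way marginals from the joint.
    for (x, y, z), p in joint.items():
        px[x] += p; py[y] += p; pz[z] += p
        pxy[x, y] += p; pxz[x, z] += p; pyz[y, z] += p
    # I(X;Y): expectation of pmi(x, y) under the joint distribution.
    i_xy = sum(p * math.log(pxy[x, y] / (px[x] * py[y]))
               for (x, y, z), p in joint.items() if p > 0)
    # I(X;Y|Z): expectation of the conditional pmi(x, y | z).
    i_xy_given_z = sum(p * math.log(pz[z] * p / (pxz[x, z] * pyz[y, z]))
                       for (x, y, z), p in joint.items() if p > 0)
    return i_xy_given_z - i_xy
```

For three independent variables the value is 0. For Z = X XOR Y with fair, independent X and Y, the pair (X, Y) carries no mutual information on its own, but conditioning on Z reveals one bit, so the interaction information is log 2.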

110 | An Introduction to Information Theory
- Reza
- 1961

Citation Context: ...n have been swapped. For the three-variable case, this gives exactly the same outcome except for a change in sign. The swap is necessary in order to ensure a proper set-theoretic measure (Fano, 1961; Reza, 1994). SI1(x, y, z) = log [p(x, y) / (p(x) p(y))] − log [p(z) p(x, y, z) / (p(x, z) p(y, z))] = log [p(x, y) p(y, z) p(x, z) / (p(x) p(y) p(z) p(x, y, z))] (8) Interaction information – as well as specific interaction informati...

45 | The Multi-information function as a tool for measuring stochastic dependence
- Studeny, Vejnarova
- 1998

17 | Data-driven Identification of Fixed Expressions and Their Modifiability. Rijksuniversiteit Groningen dissertation
- Villada Moirón, Begoña
- 2005

Citation Context: ...ree variables, this gives the following equation: SI2(x, y, z) = log [p(x, y, z) / (p(x) p(y) p(z))] (11) Note that this measure has been used in NLP tasks before, notably for collocation extraction (Villada Moirón, 2005). 4 Application In this section, we explore the performance of the measures defined above in an NLP context, viz. the extraction of salient subject verb object triples. This research has been carried...
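Equation (11) quoted in this context, SI2(x, y, z) = log p(x, y, z) / (p(x) p(y) p(z)), is the pointwise analogue of multi-information (total correlation): it compares the joint probability of the triple against full independence. A minimal illustrative sketch (hypothetical helper, not the authors' code):

```python
import math

def si2(p_x, p_y, p_z, p_xyz):
    """SI2(x, y, z) = log p(x, y, z) / (p(x) p(y) p(z)), as in
    equation (11): zero when x, y, z are jointly independent,
    positive when the triple co-occurs more often than independence
    predicts, negative when it co-occurs less often."""
    return math.log(p_xyz / (p_x * p_y * p_z))
```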

12 | Empirical evaluation of four tensor decomposition algorithms
- Turney
- 2007

Citation Context: ...; often, a particular problem can be more naturally tackled when formulated as a multi-way problem. Notably, the framework of tensor decomposition, that has recently permeated into the NLP community (Turney, 2007; Baroni and Lenci, 2010; Giesbrecht, 2010; Van de Cruys, 2010), analyzes language issues as multi-way co-occurrences. Up till now, little attention has been devoted to the weighting of such multi-way ...

3 | A non-negative tensor factorization model for selectional preference induction
- Van de Cruys
- 2010