Results 1 - 10 of 41
Interpretation as Abduction
, 1990
"... An approach to abductive inference developed in the TACITUS project has resulted in a dramatic simplification of how the problem of interpreting texts is conceptualized. Its use in solving the local pragmatics problems of reference, compound nominals, syntactic ambiguity, and metonymy is described ..."
Abstract - Cited by 687 (38 self)
An approach to abductive inference developed in the TACITUS project has resulted in a dramatic simplification of how the problem of interpreting texts is conceptualized. Its use in solving the local pragmatics problems of reference, compound nominals, syntactic ambiguity, and metonymy is described and illustrated. It also suggests an elegant and thorough integration of syntax, semantics, and pragmatics.
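The abstract's central idea, interpretation as finding the best abductive proof of the text's logical form, can be illustrated with a minimal cost-based abduction sketch. The axioms, predicate names, and cost factors below are toy assumptions for illustration, not the TACITUS system's actual knowledge base or cost scheme.

```python
# Minimal sketch of cost-based ("weighted") abduction: interpret an
# observation by finding the least-cost set of assumptions from which
# it can be proved. Axioms, predicates, and costs are toy assumptions.

# Horn-style axioms: head -> list of alternative bodies.
AXIOMS = {
    "car(x)": [["vehicle(x)", "has_engine(x)"]],
    "vehicle(x)": [["owned(x)"]],
}

def best_proof(goal, assume_cost=10.0):
    """Return (cost, assumptions) for the cheapest way to establish `goal`:
    either assume it outright at `assume_cost`, or backward-chain through
    an axiom, proving every body atom at a discounted assumption cost."""
    best = (assume_cost, {goal})           # option 1: assume the literal
    for body in AXIOMS.get(goal, []):
        cost, assumed = 0.0, set()
        for atom in body:                  # option 2: prove each body atom
            c, a = best_proof(atom, assume_cost * 0.6)
            cost += c
            assumed |= a
        if cost < best[0]:
            best = (cost, assumed)
    return best

cost, assumptions = best_proof("car(x)")
# Backward-chaining is cheaper (9.6) than assuming car(x) outright (10.0),
# so the "interpretation" is the assumption set {owned(x), has_engine(x)}.
```

The discount factor on `assume_cost` plays the role of the weights in weighted abduction: assumptions made deeper in a proof are cheaper, so the search prefers explanations over brute assumption of the observation.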
Designing Statistical Language Learners: Experiments on Noun Compounds
, 1995
"... Statistical language learning research takes the view that many traditional natural language processing tasks can be solved by training probabilistic models of language on a sufficient volume of training data. The design of statistical language learners therefore involves answering two questions: (i ..."
Abstract - Cited by 95 (0 self)
Statistical language learning research takes the view that many traditional natural language processing tasks can be solved by training probabilistic models of language on a sufficient volume of training data. The design of statistical language learners therefore involves answering two questions: (i) Which of the multitude of possible language models will most accurately reflect the properties necessary to a given task? (ii) What will constitute a sufficient volume of training data? Regarding the first question, though a variety of successful models have been discovered, the space of possible designs remains largely unexplored. Regarding the second, exploration of the design space has so far proceeded without an adequate answer. The goal of this thesis is to advance the exploration of the statistical language learning design space. In pursuit of that goal, the thesis makes two main theoretical contributions: it identifies a new class of designs by providing a novel theory of statistical natural language processing, and it presents the foundations for a predictive theory of data requirements to assist in future design explorations. The first of these contributions is called the meaning distributions theory. This theory ...
Collaborating on Referring Expressions
, 1991
"... This paper presents a computational model of how conversational participants collaborate in making referring expressions. The model is based on the planning paradigm. It employs plans for constructing and recognizing referring expressions and meta-plans for constructing and recognizing clarific ..."
Abstract - Cited by 83 (9 self)
This paper presents a computational model of how conversational participants collaborate in making referring expressions. The model is based on the planning paradigm. It employs plans for constructing and recognizing referring expressions and meta-plans for constructing and recognizing clarifications. This allows the model to account for the generation and understanding both of referring expressions and of their clarifications in a uniform framework using a single knowledge base.
CogNiac: A Discourse Processing Engine
, 1995
"... COGNIAC: A DISCOURSE PROCESSING ENGINE Frederick Breckenridge Baldwin Aravind Joshi Ellen Prince In spoken and written language, anaphora occurs when one phrase points to another, where "points to" means that the two phrases denote the same thing in one's mind, as in the relationship ..."
Abstract - Cited by 24 (5 self)
Frederick Breckenridge Baldwin, Aravind Joshi, Ellen Prince. In spoken and written language, anaphora occurs when one phrase points to another, where "points to" means that the two phrases denote the same thing in one's mind, as in the relationship between GREGOR SAMSA and he in the following: As GREGOR SAMSA awoke one morning from uneasy dreams he found himself transformed in his bed into a gigantic insect. Kafka, The Metamorphosis. Much of the difficulty of developing a computer program to resolve anaphors amounts to picking the right antecedent when there are many to choose from. The dissertation describes an approach that is particularly suitable for applications that require large coverage and high accuracy. These results are achieved by endowing the system with the ability to notice that it cannot make a good choice in certain circumstances, coupled with simple and efficient language technologies to structure the prior discourse. The significan...
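The key property the abstract describes, a resolver that abstains when it cannot make a good choice, can be sketched in a few lines. The single rule and the tiny gender lexicon below are illustrative assumptions, not CogNiac's actual rule set.

```python
# Toy illustration of the CogNiac strategy: apply a high-precision rule
# (here, a unique gender-compatible antecedent in the prior discourse)
# and explicitly abstain otherwise. Lexicon and rule are toy assumptions.

GENDER = {"Gregor": "masc", "Grete": "fem", "Josef": "masc"}

def resolve(pronoun, candidates):
    """Return the antecedent if exactly one candidate is compatible with
    the pronoun; otherwise return None, i.e. the system 'notices that it
    cannot make a good choice' rather than guessing."""
    wanted = {"he": "masc", "him": "masc", "she": "fem", "her": "fem"}[pronoun]
    compatible = [c for c in candidates if GENDER.get(c) == wanted]
    return compatible[0] if len(compatible) == 1 else None

resolve("he", ["Gregor", "Grete"])  # unique masc candidate -> "Gregor"
resolve("he", ["Gregor", "Josef"])  # ambiguous -> None (abstain)
```

Abstaining on the ambiguous cases is what trades recall for the high precision the dissertation targets.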
Incremental Interpretation
- Artificial Intelligence
, 1991
"... We present a system for the incremental interpretation of natural-language utterances in context. The main goal of the work is to account for the influences of context on interpretation, while preserving compositionality to the extent possible. To achieve this goal, we introduce a representational d ..."
Abstract - Cited by 21 (0 self)
We present a system for the incremental interpretation of natural-language utterances in context. The main goal of the work is to account for the influences of context on interpretation, while preserving compositionality to the extent possible. To achieve this goal, we introduce a representational device, conditional interpretations, and a rule system for constructing them. Conditional interpretations represent the potential contributions of phrases to the interpretation of an utterance. The rules specify how phrase interpretations are combined and how they are elaborated with respect to context. The control structure defined by the rules determines the points in the interpretation process at which sufficient information becomes available to carry out specific inferential interpretation steps, such as determining the plausibility of particular referential connections or modifier attachments. We have implemented these ideas in Candide, a system for interactive acquisition of procedural ...
Word Sense Disambiguation and Text Segmentation Based On Lexical Cohesion
, 1994
"... In this paper, we describe how word sense biguity can be resolved wit}t the aid of lexical heston. By checking lexical ('()hesioll between tim current word and lexical chains in the o,(ler of the sMicncc, in tandem with generation of lexical chains we realize incremental word se,se disant hig ..."
Abstract - Cited by 20 (1 self)
In this paper, we describe how word sense ambiguity can be resolved with the aid of lexical cohesion. By checking lexical cohesion between the current word and lexical chains in the order of the sentence, in tandem with generation of lexical chains, we realize incremental word sense disambiguation based on contextual information that lexical chains reveal. Next, we describe how segment boundaries of a text can be determined with the aid of lexical cohesion. We can measure the plausibility of each point in the text as a segment boundary by computing a degree of agreement of the start and end points of lexical chains.
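The two uses of lexical chains described above can be sketched as follows. The sense inventory and the flat category-based "relatedness" test are stand-in assumptions; the paper builds chains from thesaurus relations, not from a hand-written lexicon.

```python
# Toy sketch of lexical-chain-based disambiguation and segmentation.
# SENSES maps each word to candidate senses tagged with a coarse category;
# both the inventory and the relatedness test are illustrative assumptions.

SENSES = {
    "loan": [("loan/finance", "money")],
    "deposit": [("deposit/finance", "money")],
    "bank": [("bank/finance", "money"), ("bank/river", "water")],
    "river": [("river", "water")],
}

def disambiguate(words):
    """Incrementally build chains (one per category here): each word takes
    the sense whose category already has the strongest chain, then joins it."""
    chains, choices = {}, []
    for i, w in enumerate(words):
        sense, cat = max(SENSES[w], key=lambda s: len(chains.get(s[1], [])))
        chains.setdefault(cat, []).append(i)
        choices.append(sense)
    return choices, chains

def boundary_score(chains, point):
    """Plausibility of `point` as a segment boundary: the number of chains
    that end before it or start at/after it (a degree-of-agreement proxy)."""
    return sum(1 for span in chains.values()
               if max(span) < point or min(span) >= point)

choices, chains = disambiguate(["loan", "deposit", "bank", "river"])
# The finance chain built from "loan"/"deposit" selects "bank/finance",
# and position 3 scores highly as a boundary (both chains agree there).
```

The incremental aspect is that each word is disambiguated as it is read, using only the chains built so far, which is what lets the same pass drive both disambiguation and segmentation.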
Issues in the choice of a source for natural language generation
- Computational Linguistics
, 1993
"... The most vexing question in natural language generation is 'what is the source'--what do speakers start from when they begin to compose an utterance? Theories of generation in the literature differ markedly in their assumptions. A few start with an unanalyzed body of numerical data (e.g. B ..."
Abstract - Cited by 19 (0 self)
The most vexing question in natural language generation is 'what is the source'--what do speakers start from when they begin to compose an utterance? Theories of generation in the literature differ markedly in their assumptions. A few start with an unanalyzed body of numerical data (e.g. Bourbeau et al. 1990; Kukich 1988). Most start with the structured objects that are used by a particular reasoning system or simulator and are cast in that system's representational formalism (e.g. Hovy 1990; Meteer 1992; Rösner 1988). A growing number of systems, largely focused on problems in machine translation or grammatical theory, take their input to be logical formulae based on lexical predicates (e.g. Wedekind 1988; Shieber et al. 1990). The lack of a consistent answer to the question of the generator's source has been at the heart of the problem of how to make research on generation intelligible and engaging for the rest of the computational linguistics community, and has complicated efforts to evaluate alternative treatments even for people in the field. Nevertheless, a source cannot be imposed by fiat. Differences in what information is assumed to be available, its relative decomposition when compared to the "packaging" available in
A Framework for Fast Incremental Interpretation during Speech Decoding
"... This paper describes a framework for incorporating referential semantic information from a world model or ontology directly into a probabilistic language model of the sort commonly used in speech recognition, where it can be probabilistically weighted together with phonological and syntactic factors ..."
Abstract - Cited by 17 (4 self)
This paper describes a framework for incorporating referential semantic information from a world model or ontology directly into a probabilistic language model of the sort commonly used in speech recognition, where it can be probabilistically weighted together with phonological and syntactic factors as an integral part of the decoding process. Introducing world model referents into the decoding search greatly increases the search space, but by using a single integrated phonological, syntactic, and referential semantic language model, the decoder is able to incrementally prune this search based on probabilities associated with these combined contexts. The result is a single unified referential semantic probability model which brings several kinds of context to bear in speech decoding, and performs accurate recognition in real time on large domains in the absence of example in-domain training sentences.
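The weighting-and-pruning idea in the abstract can be sketched as a beam search whose hypothesis score sums phonological, syntactic, and referential log-probabilities. All component scores below are made-up numbers standing in for real model probabilities, and the two-word decode is purely illustrative.

```python
# Sketch of combining referential-semantic fit with phonological and
# syntactic scores inside one beam search, so referentially implausible
# hypotheses are pruned early. Scores and vocabulary are toy assumptions.

def decode(steps, beam_width=2):
    """`steps` is a list of dicts mapping candidate words to their
    (phon, syn, ref) log-probabilities; hypotheses are extended with the
    summed log score and pruned to `beam_width` at every step."""
    beam = [([], 0.0)]
    for cands in steps:
        extended = [(hyp + [w], score + phon + syn + ref)
                    for hyp, score in beam
                    for w, (phon, syn, ref) in cands.items()]
        beam = sorted(extended, key=lambda h: h[1], reverse=True)[:beam_width]
    return beam[0]

steps = [
    {"the": (-0.1, -0.1, -0.1), "a": (-0.2, -0.1, -0.1)},
    # "file" matches the world model's referents; "phial" does not,
    # so its referential term (-3.0) pushes it out of the beam:
    {"file": (-0.5, -0.3, -0.2), "phial": (-0.4, -0.3, -3.0)},
]
best_words, best_score = decode(steps)
```

Because the referential term is added at each step rather than after decoding, hypotheses the world model disfavors are dropped before they multiply, which is the search-space reduction the paper relies on.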
Incremental Interpretation: Applications, Theory, and Relationship to Dynamic Semantics
- In Proceedings of COLING 94
, 1994
"... Why should computers interpret language incrementally? In recent years psycholinguistic evidence for incremental interpretation has become more and more compelling, suggesting that humans perform semantic interpretation before constituent boundaries, possibly word by word. However, possible computat ..."
Abstract - Cited by 14 (3 self)
Why should computers interpret language incrementally? In recent years psycholinguistic evidence for incremental interpretation has become more and more compelling, suggesting that humans perform semantic interpretation before constituent boundaries, possibly word by word. However, possible computational applications have received less attention. In this paper we consider various potential applications, in particular graphical interaction and dialogue. We then review the theoretical and computational tools available for mapping from fragments of sentences to fully scoped semantic representations. Finally, we tease apart the relationship between dynamic semantics and incremental interpretation.
Understanding Natural Language Descriptions of Physical Phenomena
, 2004
"... The fact that human readers can learn about the physical world from textual descriptions leads to a number of interesting questions about the connections between our conceptual understanding of the physical world and how it is reflected in natural language. This thesis investigates some forms in whi ..."
Abstract - Cited by 13 (1 self)
The fact that human readers can learn about the physical world from textual descriptions leads to a number of interesting questions about the connections between our conceptual understanding of the physical world and how it is reflected in natural language. This thesis investigates some forms in which information about physical phenomena is typically expressed in natural language and how this knowledge can be used to construct models of the underlying physical processes. Based on an analysis of the representations of physical quantities in natural language and common, recurring syntactic patterns, we implemented a system that uses Qualitative Process (QP) Theory to guide the semantic interpretation process to capture information about physical phenomena found in natural language text. We have recast QP Theory in terms of frame semantics as FrameNet-compatible representations (QP frames) and use an extendable, controlled subset of English to capture QP specific information from natural language descriptions. In addition to general background knowledge based on a subset of the Cyc knowledge base and the lexical information supplied by a syntactic parser, the semantics of QP Theory are used in rules that guide the semantic interpretation process and the construction of QP Frames. The thesis illustrates that QP Theory, as an established theoretical framework for handling continuous parameters and causation, can provide an essential component of ...