Results 1 - 10 of 66
Interpretation as Abduction, 1990
"... An approach to abductive inference developed in the TACITUS project has resulted in a dramatic simplification of how the problem of interpreting texts is conceptualized. Its use in solving the local pragmatics problems of reference, compound nominals, syntactic ambiguity, and metonymy is described ..."
Abstract - Cited by 687 (38 self)
An approach to abductive inference developed in the TACITUS project has resulted in a dramatic simplification of how the problem of interpreting texts is conceptualized. Its use in solving the local pragmatics problems of reference, compound nominals, syntactic ambiguity, and metonymy is described and illustrated. It also suggests an elegant and thorough integration of syntax, semantics, and pragmatics.
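As a concrete illustration of the idea (not the TACITUS system itself), the Python sketch below treats interpretation as finding the cheapest abductive account of an observed literal: each atom can either be assumed at a fixed cost or derived by backward-chaining through Horn-style rules. The rules, atoms, and costs are invented placeholders, and the cost scheme is a simplification of weighted abduction.

"""Toy cost-based abduction: account for an observation either by assuming
atoms at a price or by backward-chaining through Horn-style rules, and keep
the cheapest option. Rules, atoms, and costs are invented placeholders."""

# Propositional Horn-style rules: head -> list of alternative bodies.
RULES = {
    "lube_oil_alarm": [["lube_oil_pressure_low"]],
    "lube_oil_pressure_low": [["pump_failure"], ["oil_leak"]],
}

# Cost of simply assuming an atom instead of deriving it.
ASSUME_COST = {
    "lube_oil_alarm": 10.0,          # leaving the observation unexplained is costly
    "lube_oil_pressure_low": 5.0,
    "pump_failure": 2.0,
    "oil_leak": 3.0,
}


def best_proof(atom, depth=0, max_depth=5):
    """Return (cost, assumptions) for the cheapest account of `atom`."""
    best = (ASSUME_COST[atom], {atom})            # option 1: just assume it
    if depth >= max_depth:
        return best
    for body in RULES.get(atom, []):              # option 2: derive it from a rule body
        cost, assumed = 0.0, set()
        for literal in body:
            c, a = best_proof(literal, depth + 1, max_depth)
            cost, assumed = cost + c, assumed | a
        if cost < best[0]:
            best = (cost, assumed)
    return best


if __name__ == "__main__":
    cost, assumptions = best_proof("lube_oil_alarm")
    print(f"cheapest interpretation: assume {sorted(assumptions)} at cost {cost}")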
The Complexity of Logic-Based Abduction, 1993
"... Abduction is an important form of nonmonotonic reasoning allowing one to find explanations for certain symptoms or manifestations. When the application domain is described by a logical theory, we speak about logic-based abduction. Candidates for abductive explanations are usually subjected to minima ..."
Abstract - Cited by 193 (28 self)
Abduction is an important form of nonmonotonic reasoning allowing one to find explanations for certain symptoms or manifestations. When the application domain is described by a logical theory, we speak of logic-based abduction. Candidates for abductive explanations are usually subjected to minimality criteria such as subset-minimality, minimal cardinality, minimal weight, or minimality under prioritization of individual hypotheses. This paper presents a comprehensive complexity analysis of relevant decision and search problems related to abduction on propositional theories. Our results indicate that abduction is harder than deduction. In particular, we show that with the most basic forms of abduction the relevant decision problems are complete for complexity classes at the second level of the polynomial hierarchy, while the use of prioritization raises the complexity to the third level in certain cases.
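The flavour of these problems can be seen in a brute-force sketch: candidate explanations are subsets of the hypothesis set, each must be checked for consistency with the theory and for entailment of the manifestation, and only the subset-minimal survivors are kept. The theory, hypotheses, and manifestation below are invented, and both the subset enumeration and the model-checking entailment test are exponential.

"""Naive propositional abduction: enumerate subsets of hypotheses, keep the
subset-minimal ones that are consistent with the theory and entail the
manifestation. Purely illustrative brute force; everything is exponential."""

from itertools import chain, combinations, product

VARS = ["broken_pump", "leak", "low_pressure", "alarm"]

# Theory as CNF clauses: each clause is a set of (variable, polarity) literals.
THEORY = [
    {("broken_pump", False), ("low_pressure", True)},   # broken_pump -> low_pressure
    {("leak", False), ("low_pressure", True)},          # leak -> low_pressure
    {("low_pressure", False), ("alarm", True)},         # low_pressure -> alarm
]

HYPOTHESES = ["broken_pump", "leak"]
MANIFESTATION = ("alarm", True)


def models(clauses):
    """Yield every truth assignment (dict) satisfying all clauses."""
    for bits in product([False, True], repeat=len(VARS)):
        assignment = dict(zip(VARS, bits))
        if all(any(assignment[v] == pol for v, pol in cl) for cl in clauses):
            yield assignment


def entails(clauses, literal):
    """True if every model of the clauses satisfies the literal."""
    var, pol = literal
    return all(m[var] == pol for m in models(clauses))


def explanations():
    subsets = chain.from_iterable(
        combinations(HYPOTHESES, r) for r in range(len(HYPOTHESES) + 1))
    good = []
    for e in subsets:
        clauses = THEORY + [{(h, True)} for h in e]
        consistent = any(True for _ in models(clauses))
        if consistent and entails(clauses, MANIFESTATION):
            good.append(set(e))
    # keep only the subset-minimal explanations
    return [e for e in good if not any(other < e for other in good)]


if __name__ == "__main__":
    print("subset-minimal explanations:", explanations())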
Ontological Semantics, 2004
"... This book introduces ontological semantics, a comprehensive approach to the treatment of text meaning by computer. Ontological semantics is an integrated complex of theories, methodologies, descriptions and implementations. In ontological semantics, a theory is viewed as a set of statements determin ..."
Abstract - Cited by 128 (38 self)
This book introduces ontological semantics, a comprehensive approach to the treatment of text meaning by computer. Ontological semantics is an integrated complex of theories, methodologies, descriptions and implementations. In ontological semantics, a theory is viewed as a set of statements determining the format of descriptions of the phenomena with which the theory deals. A theory is associated with a methodology used to obtain the descriptions. Implementations are computer systems that use the descriptions to solve specific problems in text processing. Implementations of ontological semantics are combined with other processing systems to produce applications, such as information extraction or machine translation. The theory of ontological semantics is built as a society of microtheories covering such diverse ground as specific language phenomena, world knowledge organization, processing heuristics and issues relating to knowledge representation and implementation system architecture. The theory briefly sketched above is a top-level microtheory, the ontological semantics theory per se. Descriptions in ontological semantics include text meaning representations, lexical entries, ontological concepts and instances as well as procedures for manipulating texts and their meanings. Methodologies in ontological semantics are sets of techniques and instructions for acquiring and
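To make the kinds of descriptions mentioned above tangible, the sketch below represents a tiny ontology of concepts, a lexicon mapping words to concepts, and a text meaning representation built from them. All names, slots, and formats are invented placeholders, not the book's actual specifications.

"""Sketch of the kinds of descriptions discussed here: a tiny ontology of
concepts, a lexicon mapping words to concepts, and a text meaning
representation (TMR) instantiated from them. All names, slots, and formats
are invented placeholders, not the book's actual specifications."""

from dataclasses import dataclass, field


@dataclass
class Concept:
    name: str
    parent: str = ""
    properties: dict = field(default_factory=dict)


# A toy ontology and lexicon.
ONTOLOGY = {
    "EVENT": Concept("EVENT"),
    "INGEST": Concept("INGEST", parent="EVENT",
                      properties={"AGENT": "ANIMAL", "THEME": "FOOD"}),
}
LEXICON = {"eat": "INGEST", "devour": "INGEST"}


def build_tmr(verb: str, agent: str, theme: str) -> dict:
    """Instantiate a (drastically simplified) text meaning representation."""
    concept = ONTOLOGY[LEXICON[verb]]
    return {"head": concept.name, "AGENT": agent, "THEME": theme}


if __name__ == "__main__":
    # "John ate an apple."
    print(build_tmr("eat", agent="JOHN", theme="APPLE"))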
Designing Statistical Language Learners: Experiments on Noun Compounds, 1995
"... Statistical language learning research takes the view that many traditional natural language processing tasks can be solved by training probabilistic models of language on a sufficient volume of training data. The design of statistical language learners therefore involves answering two questions: (i ..."
Abstract - Cited by 95 (0 self)
Statistical language learning research takes the view that many traditional natural language processing tasks can be solved by training probabilistic models of language on a sufficient volume of training data. The design of statistical language learners therefore involves answering two questions: (i) Which of the multitude of possible language models will most accurately reflect the properties necessary to a given task? (ii) What will constitute a sufficient volume of training data? Regarding the first question, though a variety of successful models have been discovered, the space of possible designs remains largely unexplored. Regarding the second, exploration of the design space has so far proceeded without an adequate answer. The goal of this thesis is to advance the exploration of the statistical language learning design space. In pursuit of that goal, the thesis makes two main theoretical contributions: it identifies a new class of designs by providing a novel theory of statistical natural language processing, and it presents the foundations for a predictive theory of data requirements to assist in future design explorations. The first of these contributions is called the meaning distributions theory. This theory
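As one concrete example of a probabilistic design decision for the noun compound task named in the title, the sketch below brackets a three-noun compound by comparing association estimates under an adjacency-style and a dependency-style model. The counts are invented stand-ins for training data, and this illustrates the general approach rather than the thesis's own models.

"""Illustrative noun compound bracketing: choose between [[w1 w2] w3] and
[w1 [w2 w3]] by comparing association estimates under two model designs.
The counts below are invented stand-ins for corpus statistics."""

# Invented pair counts standing in for training data.
COUNTS = {("computer", "science"): 120,
          ("science", "department"): 30,
          ("computer", "department"): 5}
TOTAL_PAIRS = 10_000


def assoc(w1, w2):
    """Crude association estimate: relative frequency of the noun pair."""
    return COUNTS.get((w1, w2), 0) / TOTAL_PAIRS


def bracket(w1, w2, w3, model="dependency"):
    """Return the preferred bracketing of the three-noun compound."""
    if model == "adjacency":                      # compare the two adjacent pairs
        left, right = assoc(w1, w2), assoc(w2, w3)
    else:                                         # dependency-style: what does w1 modify?
        left, right = assoc(w1, w2), assoc(w1, w3)
    return (f"[[{w1} {w2}] {w3}]" if left >= right
            else f"[{w1} [{w2} {w3}]]")


if __name__ == "__main__":
    print(bracket("computer", "science", "department"))
    print(bracket("computer", "science", "department", model="adjacency"))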
On the Role of Coherence in Abductive Explanation - AAAI-90
"... Abduction is an important inference process underlying much of human intelligent activities, including text understanding, plan recognition, disease diagnosis, and physical device diagnosis. In this paper, we describe some problems encountered using abduction to understand text, and present some sol ..."
Abstract - Cited by 54 (7 self)
Abduction is an important inference process underlying many human intelligent activities, including text understanding, plan recognition, disease diagnosis, and physical device diagnosis. In this paper, we describe some problems encountered when using abduction to understand text, and present some solutions to overcome these problems. The solutions we propose center on the use of a different criterion, called explanatory coherence, as the primary measure to evaluate the quality of an explanation. In addition, explanatory coherence plays an important role in the construction of explanations, both in determining the appropriate level of specificity of a preferred explanation, and in guiding the heuristic search to efficiently compute explanations of sufficiently high quality.
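To make the contrast concrete, the sketch below scores candidate explanations by a simple connectivity measure over explanatory links rather than by the number of assumptions alone; the propositions, links, and scoring function are invented for illustration and are not the coherence criterion defined in the paper.

"""Toy comparison of candidate explanations by a connectivity-style score
rather than by the number of assumptions. Links, propositions, and the
scoring function are invented; they are not the paper's coherence metric."""

# Explanatory links among propositions (antecedent -> what it helps explain).
LINKS = {
    ("went_to_restaurant", "ordered_food"),
    ("ordered_food", "ate_meal"),
    ("ate_meal", "paid_bill"),
    ("won_lottery", "paid_bill"),
}

OBSERVATIONS = {"ate_meal", "paid_bill"}

CANDIDATES = {
    "restaurant script": {"went_to_restaurant", "ordered_food"},
    "isolated fact": {"won_lottery"},
}


def coherence(hypotheses):
    """Count explanatory links lying entirely inside hypotheses + observations."""
    relevant = hypotheses | OBSERVATIONS
    return sum(1 for a, b in LINKS if a in relevant and b in relevant)


if __name__ == "__main__":
    # The larger but better-connected explanation scores higher here.
    for name, hyps in sorted(CANDIDATES.items()):
        print(f"{name}: {len(hyps)} assumption(s), coherence {coherence(hyps)}")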
Focusing Construction and Selection of Abductive Hypotheses - In IJCAI '93, 1993
"... Many abductive understanding systems explain novel situations by a chaining process that is neutral to explainer needs beyond generating some plausible explanation for the event being explained. This paper examines the relationship of standard models of abductive understanding to the case-based exp ..."
Abstract - Cited by 26 (0 self)
Many abductive understanding systems explain novel situations by a chaining process that is neutral to explainer needs beyond generating some plausible explanation for the event being explained. This paper examines the relationship of standard models of abductive understanding to the case-based explanation model. In case-based explanation, construction and selection of abductive hypotheses are focused by specific explanations of prior episodes and by goal-based criteria reflecting current information needs. The case-based method is inspired by observations of human explanation of anomalous events during everyday understanding, and this paper focuses on the method's contributions to the problems of building good explanations in everyday domains. We identify five central issues, compare how those issues are addressed in traditional and case-based explanation models, and discuss motivations for using the case-based approach to facilitate generation of plausible and useful explanations in...
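The focusing idea can be sketched as retrieval plus a goal filter: rather than chaining from scratch, look up a stored explanation indexed by a similar prior anomaly and keep it only if it serves the explainer's current goal. The case library, features, and similarity measure below are invented placeholders.

"""Toy case-based explanation retrieval: pick the stored explanation whose
indexed anomaly best overlaps the new one and which serves the current goal.
Cases, features, and goals here are invented placeholders."""

CASE_LIBRARY = [
    {"anomaly": {"car", "wont_start", "cold_morning"},
     "explanation": "battery drained overnight",
     "serves_goals": {"repair"}},
    {"anomaly": {"car", "wont_start", "empty_gauge"},
     "explanation": "fuel tank is empty",
     "serves_goals": {"repair", "prevention"}},
]


def retrieve(anomaly_features, goal):
    """Return the best-matching stored explanation that serves the goal."""
    def score(case):
        if goal not in case["serves_goals"]:
            return -1                       # goal-based filter before similarity
        return len(case["anomaly"] & anomaly_features)
    best = max(CASE_LIBRARY, key=score)
    return best["explanation"] if score(best) > 0 else None


if __name__ == "__main__":
    print(retrieve({"car", "wont_start", "cold_morning"}, goal="repair"))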
A Conceptual Reasoning Approach to Textual Ellipsis - In Proceedings of the 12th European Conference on Artificial Intelligence, 1996
"... We present a hybrid text understanding methodology for the resolution of textual ellipsis. It integrates conceptual criteria (based on the well-formedness and conceptual strength of role chains in a terminological knowledge base) and functional constraints reecting the utterances' information s ..."
Abstract - Cited by 26 (16 self)
We present a hybrid text understanding methodology for the resolution of textual ellipsis. It integrates conceptual criteria (based on the well-formedness and conceptual strength of role chains in a terminological knowledge base) and functional constraints reflecting the utterances' information structure (based on the distinction between context-bound and unbound discourse elements). The methodological framework for text ellipsis resolution is the centering model, which has been adapted to these constraints.
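A minimal sketch of the two ingredients follows: a role-chain check over a tiny terminological knowledge base, and a centering-style preference that tries the most highly ranked entities of the previous utterance first. The concepts, roles, and ranking below are invented and much simpler than the knowledge base and centering adaptation the paper relies on.

"""Minimal sketch of resolving a textual ellipsis: connect the concept of the
elliptical noun phrase to a preceding discourse entity through a chain of
conceptual roles, preferring the most highly ranked antecedent of the
previous utterance. The mini knowledge base and centering list are invented."""

# Tiny terminological KB: concept -> {role: filler concept}.
KB = {
    "NOTEBOOK": {"HAS-PART": "BATTERY"},
    "BATTERY": {"HAS-PROPERTY": "CAPACITY"},
    "COMPANY": {"PRODUCES": "NOTEBOOK"},
}


def role_chain(frm, to, depth=3):
    """Return a list of roles linking `frm` to `to`, or None if no short chain."""
    if frm == to:
        return []
    if depth == 0:
        return None
    for role, filler in KB.get(frm, {}).items():
        rest = role_chain(filler, to, depth - 1)
        if rest is not None:
            return [role] + rest
    return None


def resolve_ellipsis(elliptical_concept, forward_centers):
    """Pick the first (most preferred) antecedent reachable by a role chain."""
    for antecedent in forward_centers:        # ranked entities of the last utterance
        chain = role_chain(antecedent, elliptical_concept)
        if chain:
            return antecedent, chain
    return None, None


if __name__ == "__main__":
    # "The company sells a notebook. The battery lasts two hours."
    antecedent, chain = resolve_ellipsis("BATTERY", ["NOTEBOOK", "COMPANY"])
    print(f"'battery' attaches to {antecedent} via {chain}")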
Evaluation of Explanatory Hypotheses, 1991
"... Abduction is often viewed as inference to the "best" explanation. However, the evaluation of the goodness of candidate hypotheses remains an open problem. Most artificial intelligence research addressing this problem has concentrated on syntactic criteria, applied uniformly regardless of t ..."
Abstract - Cited by 20 (9 self)
Abduction is often viewed as inference to the "best" explanation. However, the evaluation of the goodness of candidate hypotheses remains an open problem. Most artificial intelligence research addressing this problem has concentrated on syntactic criteria, applied uniformly regardless of the explainer's intended use for the explanation. We demonstrate that syntactic approaches are insufficient to capture important differences in explanations, and propose instead that choice of the "best" explanation should be based on explanations' utility for the explainer's purpose. We describe two classes of goals motivating explanation: knowledge goals reflecting internal desires for information, and goals to accomplish tasks in the external world. We describe how these goals impose requirements on explanations, and discuss how we apply those requirements to evaluate hypotheses in two computer story understanding systems. In order to learn from experience, a reasoner must be able to explain what...
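A small sketch of the contrast drawn here: a purely syntactic criterion (prefer the smallest hypothesis) can pick a different "best" explanation than a criterion based on whether the explanation answers the explainer's current question. The candidate explanations, goals, and annotations below are invented.

"""Toy contrast between a syntactic criterion (smallest explanation) and a
goal-based one (explanation most useful for the explainer's purpose).
Hypotheses, goals, and the usefulness table are invented."""

CANDIDATES = {
    "the dog knocked over the vase":
        {"size": 1, "answers": {"what_happened"}},
    "the dog was left alone and, bored, knocked over the vase":
        {"size": 3, "answers": {"what_happened", "how_to_prevent_recurrence"}},
}


def best_by_size(candidates):
    """Syntactic criterion: prefer the explanation with the fewest assumptions."""
    return min(candidates, key=lambda c: candidates[c]["size"])


def best_for_goal(candidates, goal):
    """Goal-based criterion: prefer the smallest explanation that answers the goal."""
    useful = [c for c in candidates if goal in candidates[c]["answers"]]
    return min(useful, key=lambda c: candidates[c]["size"]) if useful else None


if __name__ == "__main__":
    print("syntactic choice: ", best_by_size(CANDIDATES))
    print("goal-based choice:",
          best_for_goal(CANDIDATES, goal="how_to_prevent_recurrence"))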
Concurrent lexicalized dependency parsing: the ParseTalk model - COLING '94: Proc. 15th Intl. Conf. on Computational Linguistics, 1994
"... Abstract. A grammar model for concurrent, object-oriented natural language parsing is introduced. Complete lexical distribution of grammatical knowledge is achieved building upon the head-oriented notions of valency and dependency, while inheritance mechanisms are used to capture lexical generalizat ..."
Abstract - Cited by 18 (9 self)
A grammar model for concurrent, object-oriented natural language parsing is introduced. Complete lexical distribution of grammatical knowledge is achieved by building upon the head-oriented notions of valency and dependency, while inheritance mechanisms are used to capture lexical generalizations. The underlying concurrent computation model relies upon the actor paradigm. We consider message-passing protocols for establishing dependency relations and for handling ambiguity.
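The lexicalized, message-passing idea can be sketched with word objects that each carry a valency frame and answer head-search requests from other words; a dependency link is established when a request matches an open slot. The categories, frames, and the sequential "message" loop below are invented simplifications and do not reproduce the ParseTalk actor protocol.

"""Sketch of lexicalized, message-passing dependency parsing: each word
object carries a valency frame and answers head-search requests from the
other words. Categories, frames, and the protocol are invented
simplifications, not the ParseTalk actor protocol."""

from dataclasses import dataclass, field


@dataclass
class WordActor:
    form: str
    category: str
    valency: dict = field(default_factory=dict)    # role -> category it governs
    dependents: dict = field(default_factory=dict)

    def receive_head_request(self, other: "WordActor") -> bool:
        """Accept `other` as a dependent if an open valency slot matches it."""
        for role, category in self.valency.items():
            if category == other.category and role not in self.dependents:
                self.dependents[role] = other.form
                return True
        return False


def parse(words):
    """Each word sends a head-search message to the other words in turn.
    (A real parser would also use word order and agreement; this toy relies
    on sentence order alone.)"""
    for dependent in words:
        for head in words:
            if head is not dependent and head.receive_head_request(dependent):
                break
    return {w.form: w.dependents for w in words if w.dependents}


if __name__ == "__main__":
    sentence = [
        WordActor("Anna", "noun"),
        WordActor("reads", "verb", valency={"subject": "noun", "object": "noun"}),
        WordActor("books", "noun"),
    ]
    print(parse(sentence))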