Results 1–10 of 37
A Survey on Knowledge Compilation
, 1998
Abstract

Cited by 119 (4 self)
In this paper we survey recent results in knowledge compilation of propositional knowledge bases. We first define and limit the scope of such a technique, then we survey exact and approximate knowledge compilation methods. We include a discussion of compilation for nonmonotonic knowledge bases. Keywords: Knowledge Representation, Efficiency of Reasoning
The comparative linguistics of knowledge representation
 In Proc. of IJCAI’95
, 1995
Abstract

Cited by 70 (2 self)
We develop a methodology for comparing knowledge representation formalisms in terms of their "representational succinctness," that is, their ability to express knowledge situations relatively efficiently. We use this framework to compare many important formalisms for knowledge base representation: propositional logic, default logic, circumscription, and model preference defaults; and, at a lower level, Horn formulas, characteristic models, decision trees, disjunctive normal form, and conjunctive normal form. We also show that adding new variables improves the effective expressibility of certain knowledge representation formalisms.
Is Intractability of Non-Monotonic Reasoning a Real Drawback?
 Artificial Intelligence
, 1996
Abstract

Cited by 48 (9 self)
Several studies about computational complexity of nonmonotonic reasoning (NMR) showed that nonmonotonic inference is significantly harder than classical, monotonic inference. This contrasts with the general idea that NMR can be used to make knowledge representation and reasoning simpler, not harder. In this paper we show that, to some extent, NMR fulfills the representation goal. In particular, we prove that nonmonotonic formalisms such as circumscription and default logic allow for a much more compact and natural representation of propositional knowledge than propositional calculus. Proofs are based on a suitable definition of compilable inference problem, and on nonuniform complexity classes. Some results about intractability of circumscription and default logic can therefore be interpreted as the price one has to pay for having such an extra-compact representation. On the other hand, intractability of inference and compactness of representation are not equivalent notions: we ex...
The Size of a Revised Knowledge Base
 Artificial Intelligence
, 1995
Abstract

Cited by 46 (19 self)
In this paper we address a specific computational aspect of belief revision: the size of the propositional formula obtained by revising a formula with a new one. In particular, we focus on the size of the smallest formula equivalent to the revised knowledge base. The main result of this paper is that not all formalizations of belief revision are equal from this point of view. For some of them we show that the revised knowledge base can be expressed with a formula admitting a polynomial-space representation (we call these "compactability" results). On the other hand, we are able to prove that for others the revised knowledge base does not always admit a polynomial-space representation, unless the polynomial hierarchy collapses at a sufficiently low level ("non-compactability" results). The time complexity of query answering for the revised knowledge base definitely has an impact on being able to represent the result of the revision compactly. Nevertheless form...
Preprocessing of Intractable Problems
 Information and Computation
, 1997
Abstract

Cited by 46 (15 self)
Some computationally hard problems, e.g., deduction in logical knowledge bases, are such that part of an instance is known well before the rest of it, and remains the same for several subsequent instances of the problem. In these cases, it is meaningful to preprocess this known part offline so as to simplify the remaining online problem. In this paper we investigate such a technique in the context of intractable, i.e., NP-hard, problems. Recent results in the literature show that not all NP-hard problems behave in the same way: for some of them preprocessing yields polynomial-time online simplified problems (we call them compilable), while for others there is strong evidence that this should not happen. Our primary goal is to provide a sound methodology that can be used either to prove or disprove that a problem is compilable. To this end, we define new models of computation, complexity classes, and reductions. We find complete problems for such classes, completeness meaning...
Knowledge Representation with Logic Programs
 DEPT. OF CS OF THE UNIVERSITY OF KOBLENZ-LANDAU
, 1996
Abstract

Cited by 38 (6 self)
In this tutorial overview, which resulted from a lecture course given by the authors at
Non Monotonic Reasoning
, 1997
Abstract

Cited by 34 (1 self)
These are the proceedings of the 11th Nonmonotonic Reasoning Workshop. The aim of this series
Space Efficiency of Propositional Knowledge Representation Formalisms
 IN PROCEEDINGS OF THE FIFTH INTERNATIONAL CONFERENCE ON THE PRINCIPLES OF KNOWLEDGE REPRESENTATION AND REASONING (KR'96)
, 2000
Abstract

Cited by 31 (5 self)
We investigate the space efficiency of a Propositional Knowledge Representation (PKR) formalism. Intuitively, the space efficiency of a formalism F in representing a certain piece of knowledge is the size of the shortest formula of F that represents it. In this paper we assume that knowledge is either a set of propositional interpretations (models) or a set of propositional formulae (theorems). We provide a formal way of talking about the relative ability of PKR formalisms to compactly represent a set of models or a set of theorems. We introduce two new compactness measures and the corresponding classes, and show that the relative space efficiency of a PKR formalism in representing models/theorems is directly related to such classes. In particular, we consider formalisms for nonmonotonic reasoning, such as circumscription and default logic, as well as belief revision operators and the stable model semantics for logic programs with negation. One interesting result is that formalisms ...
Reducing Belief Revision to Circumscription (and vice versa)
, 2002
Abstract

Cited by 22 (4 self)
Nonmonotonic formalisms and belief revision operators have been introduced as useful tools to describe and reason about evolving scenarios. Both approaches have been proven effective in a number of different situations. However, little is known about their relationship. Previous work by Winslett has shown some correlations between a specific operator and circumscription. In this paper we greatly extend Winslett’s work by establishing new relations between circumscription and a large number of belief revision operators. This highlights similarities and differences between these formalisms. Furthermore, these connections provide us with the possibility of importing results in one field into the other one.
Compiling Propositional Weighted Bases
, 2004
Abstract

Cited by 21 (7 self)
In this paper, we investigate the extent to which knowledge compilation can be used to improve model checking and inference from propositional weighted bases. We first focus on the compilability issue for both problems, deriving mainly non-compilability results in the case where preferences are subject to change. Then, we present a general notion of C-normal weighted base that is parametrized by a tractable class C for the clausal entailment problem. We show how every weighted base can be turned ("compiled") into a query-equivalent C-normal base whenever C is a complete class for propositional logic. Both negative and positive results are presented. On the one hand, complexity results are identified, showing that the inference problem from a C-normal weighted base is as difficult as in the general case when the prime implicates, Horn cover, or renamable Horn cover classes are targeted. On the other hand, we show that both the model checking and the (clausal) inference problem become tractable for a suitable choice of the target class. Moreover, we show that the set of all preferred models of such a normal weighted base can be computed in time polynomial in the output size, and as a consequence model checking is also tractable for such bases. Finally, we sketch how our results can be used in model-based diagnosis in order to compute the most likely diagnoses of a system.