Results 1–10 of 119
A Knowledge Compilation Map
Journal of Artificial Intelligence Research, 2002
Cited by 219 (33 self)
We propose a perspective on knowledge compilation which calls for analyzing different compilation approaches according to two key dimensions: the succinctness of the target compilation language, and the class of queries and transformations that the language supports in polytime.
Decomposable negation normal form
Journal of the ACM, 2001
Cited by 128 (17 self)
Knowledge compilation has been emerging recently as a new direction of research for dealing with the computational intractability of general propositional reasoning. According to this approach, the reasoning process is split into two phases: an offline compilation phase and an online query-answering phase. In the offline phase, the propositional theory is compiled into some target language, which is typically a tractable one. In the online phase, the compiled target is used to efficiently answer a (potentially) exponential number of queries. The main motivation behind knowledge compilation is to push as much of the computational overhead as possible into the offline phase, in order to amortize that overhead over all online queries. Another motivation behind compilation is to produce very simple online reasoning systems, which can be embedded cost-effectively into primitive computational platforms, such as those found in consumer electronics. One of the key aspects of any compilation approach is the target language into which the propositional theory is compiled. Previous target languages included Horn theories, prime implicates/implicants and ordered binary decision diagrams (OBDDs). We propose in this paper a new target compilation language, known as decomposable negation normal form (DNNF), and present a number of its properties that make it of interest to the broad community. Specifically, we ...
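As an illustration of why decomposability matters, here is a minimal sketch (with an assumed tuple encoding of circuits, not code from the paper): because the children of every AND node in a DNNF circuit mention disjoint sets of variables, an AND is satisfiable exactly when each child is satisfiable independently, which yields a linear-time satisfiability test.

```python
# Hypothetical node encoding: ('lit', var, sign), ('true',), ('false',),
# ('and', [children]), ('or', [children]).

def is_satisfiable(node):
    kind = node[0]
    if kind == 'true':
        return True
    if kind == 'false':
        return False
    if kind == 'lit':          # a lone literal is always satisfiable
        return True
    kids = node[1]
    if kind == 'and':          # decomposable: children share no variables,
        return all(is_satisfiable(k) for k in kids)   # so check independently
    if kind == 'or':
        return any(is_satisfiable(k) for k in kids)
    raise ValueError(f"unknown node kind: {kind}")

# (A or B) and (C or not C) -- decomposable: {A, B} and {C} are disjoint
circuit = ('and', [('or', [('lit', 'A', True), ('lit', 'B', True)]),
                   ('or', [('lit', 'C', True), ('lit', 'C', False)])])
print(is_satisfiable(circuit))   # True
```

For a general (non-decomposable) NNF circuit this recursion would be unsound, since the children of an AND could constrain the same variable in conflicting ways.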
On the compilability and expressive power of propositional planning formalisms
1998
Cited by 89 (11 self)
The recent approaches of extending the GRAPHPLAN algorithm to handle more expressive planning formalisms raise the question of what the formal meaning of “expressive power” is. We formalize the intuition that expressive power is a measure of how concisely planning domains and plans can be expressed in a particular formalism by introducing the notion of “compilation schemes” between planning formalisms. Using this notion, we analyze the expressiveness of a large family of propositional planning formalisms, ranging from basic STRIPS to a formalism with conditional effects, partial state specifications, and propositional formulae in the preconditions. One of the results is that conditional effects cannot be compiled away if plan size should grow only linearly but can be compiled away if we allow for polynomial growth of the resulting plans. This result confirms that the recently proposed extensions to the GRAPHPLAN algorithm concerning conditional effects are optimal with respect to the “compilability” framework. Another result is that general propositional formulae cannot be compiled into conditional effects if the plan size should be preserved linearly. This implies that allowing general propositional formulae in preconditions and effect conditions adds another level of difficulty in generating a plan.
Constraint propagation
Handbook of Constraint Programming, 2006
Cited by 76 (5 self)
Constraint propagation is a form of inference, not search, and as such is more “satisfying”, both technically and aesthetically. —E.C. Freuder, 2005. Constraint reasoning involves various types of techniques to tackle the inherent ...
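For concreteness, one classic propagation technique is AC-3 arc consistency, which repeatedly removes values that have no support under a binary constraint. The following is an illustrative sketch assuming a simple dict-based encoding of domains and constraints; it is not code from the handbook chapter.

```python
from collections import deque

def ac3(domains, constraints):
    """domains: {var: set of values}; constraints: {(x, y): binary predicate}."""
    arcs = set(constraints) | {(y, x) for (x, y) in constraints}

    def allowed(x, y, vx, vy):
        # look up the constraint in whichever orientation it was declared
        if (x, y) in constraints:
            return constraints[(x, y)](vx, vy)
        return constraints[(y, x)](vy, vx)

    queue = deque(arcs)
    while queue:
        x, y = queue.popleft()
        # prune values of x with no support in y's current domain
        pruned = {vx for vx in domains[x]
                  if not any(allowed(x, y, vx, vy) for vy in domains[y])}
        if pruned:
            domains[x] -= pruned
            # x's domain shrank: re-examine every other arc pointing at x
            queue.extend((z, w) for (z, w) in arcs if w == x and z != y)
    return all(domains[v] for v in domains)   # False if some domain emptied

# x < y with both domains {1, 2, 3}: propagation prunes x=3 and y=1
doms = {'x': {1, 2, 3}, 'y': {1, 2, 3}}
ok = ac3(doms, {('x', 'y'): lambda a, b: a < b})
print(ok, sorted(doms['x']), sorted(doms['y']))   # True [1, 2] [2, 3]
```

Note how propagation alone narrows the domains without any search: every remaining value participates in at least one solution of the binary constraint.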
On the Tractable Counting of Theory Models and its Application to Truth Maintenance and Belief Revision
Journal of Applied Non-Classical Logics, 2000
Cited by 58 (18 self)
We address the problem of counting the models of a propositional theory, under incremental changes to the theory. Specifically, we show that if a propositional theory Δ is in a special form that we call smooth, deterministic, decomposable negation normal form (sd-DNNF), then for any consistent set of literals S, we can simultaneously count, in time linear in the size of Δ, the models of: Δ ∪ S; Δ ∪ S ∪ {l}, for every literal l ∉ S; Δ ∪ (S \ {l}), for every literal l ∈ S; and Δ ∪ (S \ {l}) ∪ {¬l}, for every literal l ∈ S.
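The tractable counting the abstract refers to rests on a standard bottom-up pass over such a circuit, sketched below with an assumed tuple encoding (not the authors' implementation): determinism makes an OR's disjuncts mutually exclusive, so their model counts add; decomposability makes an AND's conjuncts variable-disjoint, so their counts multiply; smoothness ensures the children of each OR mention the same variables, so the sums are well-defined.

```python
# Hypothetical node encoding: ('lit', var, sign), ('and', [kids]), ('or', [kids]).

def model_count(node):
    kind = node[0]
    if kind == 'lit':
        return 1                  # one model over the literal's own variable
    kids = node[1]
    if kind == 'and':             # decomposable: disjoint variables, multiply
        out = 1
        for k in kids:
            out *= model_count(k)
        return out
    if kind == 'or':              # deterministic: mutually exclusive, add
        return sum(model_count(k) for k in kids)
    raise ValueError(f"unknown node kind: {kind}")

# XOR(A, B) as a smooth d-DNNF: (A and not B) or (not A and B)
xor = ('or', [('and', [('lit', 'A', True),  ('lit', 'B', False)]),
              ('and', [('lit', 'A', False), ('lit', 'B', True)])])
print(model_count(xor))   # 2 models over {A, B}
```

Each pass is linear in the circuit size, which is what makes recounting after small changes to the set of literals S cheap.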
Integrity and Change in Modular Ontologies
2003
Cited by 49 (12 self)
The benefits of modular representations are well known from many areas of computer science. In this paper, we concentrate on the benefits of modular ontologies with respect to local containment of terminological reasoning. We define an architecture for modular ontologies that supports local reasoning by compiling implied subsumption relations.
Principles and applications of continual computation
Artificial Intelligence Journal
Cited by 47 (10 self)
Automated problem solving is typically viewed as the allocation of computational resources to solve one or more problems passed to a reasoning system. In response to each problem received, effort is applied in real time to generate a solution, and problem solving ends when a solution is rendered. We examine continual computation, reasoning policies that capture a broader conception of problem solving by considering the proactive allocation of computational resources to potential future challenges. We explore policies for allocating idle time for several settings and present applications that highlight opportunities for harnessing continual computation in real-world tasks.
Preprocessing of Intractable Problems
Information and Computation, 1997
Cited by 46 (15 self)
Some computationally hard problems (e.g., deduction in logical knowledge bases) are such that part of an instance is known well before the rest of it, and remains the same for several subsequent instances of the problem. In these cases, it is meaningful to preprocess this known part offline so as to simplify the remaining online problem. In this paper we investigate such a technique in the context of intractable, i.e., NP-hard, problems. Recent results in the literature show that not all NP-hard problems behave in the same way: for some of them, preprocessing yields polynomial-time online simplified problems (we call them compilable), while for others there is strong evidence that this should not happen. Our primary goal is to provide a sound methodology that can be used either to prove or disprove that a problem is compilable. To this end, we define new models of computation, complexity classes, and reductions. We find complete problems for such classes, completeness meaning ...
A Perspective on Knowledge Compilation
In Proc. International Joint Conference on Artificial Intelligence (IJCAI), 2001
Cited by 30 (8 self)
We provide a perspective on knowledge compilation which calls for analyzing different compilation approaches according to two key dimensions: the succinctness of the target compilation language, and the class of queries and transformations that the language supports in polytime. We argue that such analysis is necessary for placing new compilation approaches within the context of existing ones.
Distance-based merging: A general framework and some complexity results
2001
Cited by 27 (9 self)
The importance of belief merging is reflected by the abundance of literature about it in recent years. In the following, a model for belief merging based on distances is introduced; many merging operators pointed out so far can be recovered as specific instances of this model. We investigate the computational aspects of such distance-based operators and give two general results showing that the complexity of inference for them is at the first level of the polynomial hierarchy (under very weak assumptions). Then some specific distance-based operators are considered and their complexity is identified. Finally, distance-based merging operators are investigated from the logical point of view.