Results 1–10 of 61
Relating Defeasible and Normal Logic Programming through Transformation Properties
, 2001
"... This paper relates the Defeasible Logic Programming (DeLP ) framework and its semantics SEM DeLP to classical logic programming frameworks. In DeLP we distinguish between two different sorts of rules: strict and defeasible rules. Negative literals (A) in these rules are considered to represent cl ..."
Abstract

Cited by 76 (31 self)
This paper relates the Defeasible Logic Programming (DeLP) framework and its semantics SEM_DeLP to classical logic programming frameworks. In DeLP we distinguish between two different sorts of rules: strict and defeasible rules. Negative literals (∼A) in these rules are considered to represent classical negation. In contrast to this, in normal logic programming (NLP), there is only one kind of rule, but the meaning of negative literals (not A) is different: they represent a kind of negation as failure, and thereby introduce defeasibility. Various semantics have been defined for NLP, notably the well-founded semantics WFS and the stable semantics Stable. In this paper we consider the transformation properties for NLP introduced by Brass and Dix and suitably adjusted for the DeLP framework. We show which transformation properties are satisfied, thereby identifying aspects in which NLP and DeLP differ. We contend that the transformation rules presented in this paper can be...
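The contrast the abstract draws between negation as failure and classical negation can be made concrete with the stable-model construction for NLP. The sketch below is illustrative only and is not taken from the paper; the rule encoding and function names are my own.

```python
# Minimal sketch of negation as failure ("not A") via the
# Gelfond-Lifschitz reduct. A rule is (head, positive_body, negated_body);
# atoms are plain strings. Not the DeLP framework -- just plain NLP.

def least_model(positive_rules):
    """Least model of a negation-free program, by naive fixpoint iteration."""
    model, changed = set(), True
    while changed:
        changed = False
        for head, body in positive_rules:
            if body <= model and head not in model:
                model.add(head)
                changed = True
    return model

def is_stable(program, candidate):
    """Is `candidate` a stable model? Reduce w.r.t. it, then compare."""
    reduct = [(h, pos) for h, pos, neg in program if not (neg & candidate)]
    return least_model(reduct) == candidate

# p :- not q.   q :- not p.
prog = [("p", set(), {"q"}), ("q", set(), {"p"})]
print(is_stable(prog, {"p"}))        # True
print(is_stable(prog, {"p", "q"}))   # False
```

Both {p} and {q} are stable models here, which is exactly the defeasibility that "not" introduces: each conclusion stands only as long as the failure assumption behind it is not overturned.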
Engineering an Incremental ASP Solver
"... Abstract. Many realworld applications, like planning or model checking, comprise a parameter reflecting the size of a solution. In a propositional formalism like Answer Set Programming (ASP), such problems can only be dealt with in a bounded way, considering one problem instance after another by gr ..."
Abstract

Cited by 51 (19 self)
Abstract. Many real-world applications, like planning or model checking, comprise a parameter reflecting the size of a solution. In a propositional formalism like Answer Set Programming (ASP), such problems can only be dealt with in a bounded way, considering one problem instance after another by gradually increasing the bound on the solution size. We thus propose an incremental approach to both grounding and solving in ASP. Our goal is to avoid redundancy by gradually processing the extensions to a problem rather than repeatedly reprocessing the entire (extended) problem. We start by furnishing a formal framework capturing our incremental approach in terms of module theory. In turn, we take advantage of this framework for guiding the successive treatment of program slices during grounding and solving. Finally, we describe the first integrated incremental ASP system, iclingo, and provide an experimental evaluation.
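The bounded, step-by-step regime the abstract describes can be mimicked in miniature. The following toy loop is not iclingo's API; it only illustrates the incremental idea of expanding ("grounding") and checking ("solving") just the newly added slice at each bound, instead of reprocessing everything.

```python
# Toy iterative-deepening search over a solution-size bound.
# Each step expands only the frontier (the new slice), never the
# states already seen -- the incremental idea in caricature.

def incremental_solve(initial, successors, is_goal, max_steps=20):
    frontier = {initial}
    seen = {initial}
    for step in range(max_steps + 1):
        # "solving" phase: check the current slice for a solution
        for state in frontier:
            if is_goal(state):
                return step, state
        # "grounding" phase: expand only the newly added slice
        frontier = {s for st in frontier for s in successors(st)} - seen
        seen |= frontier
    return None

# Example: reach 10 from 1 using +1 or *2 moves; 4 moves suffice.
result = incremental_solve(1, lambda n: {n + 1, n * 2}, lambda n: n == 10)
print(result)  # (4, 10)
```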
Transformation-Based Bottom-Up Computation of the Well-Founded Model
, 1997
"... . We present a bottomup algorithm for the computation of the wellfounded model of nondisjunctive logic programs. Our method is based on the elementary program transformations studied by Brass and Dix [6, 7]. However, their "residual program" can grow to exponential size, whereas for fu ..."
Abstract

Cited by 51 (4 self)
We present a bottom-up algorithm for the computation of the well-founded model of non-disjunctive logic programs. Our method is based on the elementary program transformations studied by Brass and Dix [6, 7]. However, their "residual program" can grow to exponential size, whereas for function-free programs our "program remainder" is always polynomial in the size, i.e. the number of tuples, of the extensional database (EDB). As in the SLG-resolution of Chen and Warren [11, 12, 13], we delay not only negative but also positive literals if they depend on delayed negative literals. When disregarding goal-directedness, which needs additional concepts, our approach can be seen as a simplified bottom-up version of SLG-resolution applicable to range-restricted Datalog programs. Since our approach is also closely related to the alternating fixpoint procedure [27, 28], it can possibly serve as a basis for an integration of the resolution-based, fixpoint-based, and transformation-based ev...
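The alternating fixpoint procedure mentioned at the end of the abstract admits a compact sketch. This is a generic illustration of the well-founded model for propositional programs (the rule encoding and function names are assumptions of mine, not the paper's algorithm):

```python
# Alternating-fixpoint computation of the well-founded model.
# A rule is (head, positive_body, negated_body); atoms are strings.

def least_model(positive_rules):
    """Least model of a negation-free program by fixpoint iteration."""
    model, changed = set(), True
    while changed:
        changed = False
        for head, body in positive_rules:
            if body <= model and head not in model:
                model.add(head)
                changed = True
    return model

def gamma(program, assumed):
    """Gelfond-Lifschitz operator: reduct w.r.t. `assumed`, then least model."""
    reduct = [(h, pos) for h, pos, neg in program if not (neg & assumed)]
    return least_model(reduct)

def well_founded(program):
    """Return (true atoms, false atoms); everything else is undefined."""
    true = set()
    while True:                      # iterate the squared (monotone) operator
        nxt = gamma(program, gamma(program, true))
        if nxt == true:
            break
        true = nxt
    possible = gamma(program, true)  # atoms that are not false
    atoms = {h for h, _, _ in program}
    return true, atoms - possible

# p :- not q.   q :- not p.   r :- not r.   s.
prog = [("p", set(), {"q"}), ("q", set(), {"p"}),
        ("r", set(), {"r"}), ("s", set(), set())]
print(well_founded(prog))  # s is true; p, q, r stay undefined
```

The mutual cycle through p and q and the odd loop on r both come out undefined, which is the hallmark of the three-valued well-founded semantics.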
Logic programming revisited: logic programs as inductive definitions
 ACM Transactions on Computational Logic
, 2001
"... Logic programming has been introduced as programming in the Horn clause subset of first order logic. This view breaks down for the negation as failure inference rule. To overcome the problem, one line of research has been to view a logic program as a set of iffdefinitions. A second approach was to ..."
Abstract

Cited by 51 (28 self)
Logic programming has been introduced as programming in the Horn clause subset of first-order logic. This view breaks down for the negation-as-failure inference rule. To overcome the problem, one line of research has been to view a logic program as a set of iff-definitions. A second approach was to identify a unique canonical, preferred, or intended model among the models of the program and to appeal to common sense to validate the choice of such a model. Another line of research developed the view of logic programming as a nonmonotonic reasoning formalism strongly related to Default Logic and Autoepistemic Logic. These competing approaches have resulted in some confusion about the declarative meaning of logic programming. This paper investigates the problem and proposes an alternative epistemological foundation for the canonical model approach, which is not based on common sense but on a solid mathematical information principle. The thesis is developed that logic programming can be understood as a natural and general logic of inductive definitions. In particular, logic programs with negation represent nonmonotone inductive definitions. It is argued that this thesis results in an alternative justification of the well-founded model as the unique intended model of the logic program. In addition, it equips logic programs with an easy-to-comprehend meaning.
Simplifying logic programs under uniform and strong equivalence
 In LPNMR’04
, 2004
"... Abstract. We consider the simplification of logic programs under the stablemodel semantics, with respect to the notions of strong and uniform equivalence between logic programs, respectively. Both notions have recently been considered for nonmonotonic logic programs (the latter dates back to the 198 ..."
Abstract

Cited by 42 (21 self)
Abstract. We consider the simplification of logic programs under the stable-model semantics, with respect to the notions of strong and uniform equivalence between logic programs, respectively. Both notions have recently been considered for nonmonotonic logic programs (the latter dates back to the 1980s, though) and provide semantic foundations for optimizing programs with input. Extending previous work, we investigate syntactic and semantic rules for program transformation, based on proper notions of consequence. We furthermore provide encodings of these notions in answer-set programming, and give characterizations of programs which are semantically equivalent to positive and Horn programs, respectively. Finally, we investigate the complexity of program simplification and determining semantic equivalence, showing that the problems range between coNP and Π^P_2 complexity, and we present some tractable cases.
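For tiny propositional programs, strong equivalence can be tested by brute force via the SE-model characterization known from the literature: two programs are strongly equivalent iff they have the same SE-models (pairs (X, Y) with X ⊆ Y, Y a classical model of the program, and X a model of its reduct). A sketch with my own rule encoding, not the paper's encodings:

```python
from itertools import chain, combinations

# A rule is (head, positive_body, negated_body); atoms are strings.

def satisfies(rules, interp):
    """Classical satisfaction: every applicable rule's head must hold."""
    return all(h in interp for h, pos, neg in rules
               if pos <= interp and not (neg & interp))

def reduct(rules, y):
    """Gelfond-Lifschitz reduct w.r.t. interpretation y."""
    return [(h, pos, set()) for h, pos, neg in rules if not (neg & y)]

def se_models(rules, atoms):
    subsets = [set(s) for s in chain.from_iterable(
        combinations(sorted(atoms), r) for r in range(len(atoms) + 1))]
    return {(frozenset(x), frozenset(y))
            for y in subsets if satisfies(rules, y)
            for x in subsets if x <= y and satisfies(reduct(rules, y), x)}

def strongly_equivalent(p, q, atoms):
    return se_models(p, atoms) == se_models(q, atoms)

# The tautology "p :- p." may be dropped under strong equivalence:
print(strongly_equivalent([("p", {"p"}, set())], [], {"p"}))   # True
# But "p :- not q." and the fact "p." are not interchangeable:
print(strongly_equivalent([("p", set(), {"q"})],
                          [("p", set(), set())], {"p", "q"}))  # False
```

Enumerating all subset pairs is exponential, consistent with the coNP-hardness the abstract reports; this is a specification, not a practical procedure.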
Knowledge Representation with Logic Programs
DEPT. OF CS OF THE UNIVERSITY OF KOBLENZ-LANDAU
, 1996
"... In this tutorialoverview, which resulted from a lecture course given by the authors at ..."
Abstract

Cited by 38 (6 self)
In this tutorial-overview, which resulted from a lecture course given by the authors at
Characterizations of the Disjunctive Well-founded Semantics: Confluent Calculi and Iterated GCWA
 Journal of Automated Reasoning
, 1997
"... . Recently Brass and Dix have introduced the semantics DWFS for general disjunctive logic programs. The interesting feature of this approach is that it is both semantically and prooftheoretically founded. Any program \Phi is associated a normalform res(\Phi), called the residual program, by a non ..."
Abstract

Cited by 37 (9 self)
Recently Brass and Dix have introduced the semantics D-WFS for general disjunctive logic programs. The interesting feature of this approach is that it is both semantically and proof-theoretically founded. Every program Φ is associated with a normal form res(Φ), called the residual program, by a nontrivial bottom-up construction using least fixpoints of two monotonic operators. We show in this paper that the original calculus, consisting of some simple transformations, has a very strong and appealing property: it is confluent and terminating. This means that all the transformations can be applied in any order: we always arrive at an irreducible program (no more transformation is applicable) and this program is already uniquely determined. Moreover, it coincides with the normal form res(Φ) of the program we started with. The semantics D-WFS can be read off from res(Φ) immediately. No proper subset of the calculus has these properties; only when we restrict to certain subclasse...
Semantic Forgetting in Answer Set Programming
, 2008
"... The notion of forgetting, also known as variable elimination, has been investigated extensively in the context of classical logic, but less so in (nonmonotonic) logic programming and nonmonotonic reasoning. The few approaches that exist are based on syntactic modifications of a program at hand. In t ..."
Abstract

Cited by 29 (9 self)
The notion of forgetting, also known as variable elimination, has been investigated extensively in the context of classical logic, but less so in (nonmonotonic) logic programming and nonmonotonic reasoning. The few approaches that exist are based on syntactic modifications of a program at hand. In this paper, we establish a declarative theory of forgetting for disjunctive logic programs under answer set semantics that is fully based on semantic grounds. The suitability of this theory is justified by a number of desirable properties. In particular, one of our results shows that our notion of forgetting can be entirely captured by classical forgetting. We present several algorithms for computing a representation of the result of forgetting, and provide a characterization of the computational complexity of reasoning from a logic program under forgetting. As applications of our approach, we present a fairly general framework for resolving conflicts in inconsistent knowledge bases that are represented by disjunctive logic programs, and we show how the semantics of inheritance logic programs and update logic programs from the literature can be characterized through forgetting. The basic idea of the conflict resolution framework is to weaken the preferences of each agent by forgetting certain knowledge that causes inconsistency. In particular, we show how to use the notion of forgetting to provide an elegant solution for preference elicitation in disjunctive logic programming.
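Classical propositional forgetting, which the abstract says can capture the authors' semantic notion, is simply forget(F, p) = F[p ↦ ⊤] ∨ F[p ↦ ⊥]. A minimal sketch, with formulas encoded as Python predicates over assignment dicts (an encoding chosen here for brevity; the paper works with disjunctive logic programs):

```python
from itertools import product

def forget(formula, var):
    """Classical forgetting: disjoin both instantiations of `var`."""
    def result(assignment):
        a_true = dict(assignment, **{var: True})
        a_false = dict(assignment, **{var: False})
        return formula(a_true) or formula(a_false)
    return result

# F = (p or q) and (not p or r); forgetting p should leave (q or r).
F = lambda a: (a["p"] or a["q"]) and (not a["p"] or a["r"])
G = forget(F, "p")
print(all(G({"p": x, "q": y, "r": z}) == (y or z)
          for x, y, z in product([False, True], repeat=3)))  # True
```

The result no longer depends on p, while every consequence of F in the remaining vocabulary is preserved, which is what makes forgetting a natural tool for eliminating the atoms that cause conflicts.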
Heterogeneous Active Agents, III: Polynomially Implementable Agents
 Artificial Intelligence
, 2000
"... In [17], two of the authors have introduced techniques to build agents on top of arbitrary data structures, and to "agentize" new/existing programs. They provided a series of successively more sophisticated semantics for such agent systems, and showed that as these semantics become epis ..."
Abstract

Cited by 26 (7 self)
In [17], two of the authors have introduced techniques to build agents on top of arbitrary data structures, and to "agentize" new/existing programs. They provided a series of successively more sophisticated semantics for such agent systems, and showed that as these semantics become epistemically more desirable, a computational price may need to be paid. In this paper, we identify a class of agents that are called weakly regular; this is done by first identifying a fragment of agent programs [17] called weakly regular agent programs (WRAPs for short).
Prolegomena to Logic Programming for Non-Monotonic Reasoning
"... The present prolegomena consist, as all indeed do, in a critical discussion serving to introduce and interpret the extended works that follow in this book. As a result, the book is not a mere collection of excellent papers in their own specialty, but provides also the basics of the motivation, b ..."
Abstract

Cited by 26 (16 self)
The present prolegomena consist, as all indeed do, in a critical discussion serving to introduce and interpret the extended works that follow in this book. As a result, the book is not a mere collection of excellent papers in their own specialty, but provides also the basics of the motivation, background history, important themes, bridges to other areas, and a common technical platform of the principal formalisms and approaches, augmented with examples. In the