Results 1–10 of 80
Semi-Stable Semantics
, 2003
Abstract
Cited by 93 (13 self)
In this paper, we examine an argument-based semantics called semi-stable semantics. Semi-stable semantics is quite close to traditional stable semantics in the sense that every stable extension is also a semi-stable extension. One of the advantages of semi-stable semantics is that there always exists at least one semi-stable extension. Furthermore, if there also exists at least one stable extension, then the semi-stable extensions coincide with the stable extensions. These and other properties make semi-stable semantics an attractive alternative to the more traditional stable semantics, which until now has been widely used in fields such as logic programming and answer set programming.
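As a toy illustration of the notions this abstract mentions (not code from the paper), the following sketch enumerates the semi-stable extensions of a small Dung argumentation framework by brute force, using the standard definitions: a semi-stable extension is a complete extension S whose range S ∪ S⁺ (S together with the arguments S attacks) is maximal under set inclusion. All names here are made up for the example.

```python
from itertools import combinations

def semistable_extensions(args, attacks):
    """Brute-force semi-stable extensions of a finite Dung framework.

    attacks is a set of (attacker, attacked) pairs. Exponential in
    the number of arguments; for illustration only.
    """
    args = sorted(args)

    def attacked_by(s):
        return {b for (a, b) in attacks if a in s}

    def conflict_free(s):
        return not any(a in s and b in s for (a, b) in attacks)

    def defends(s, a):
        # every attacker of a is counter-attacked by some member of s
        attackers = [x for (x, y) in attacks if y == a]
        return all(any((d, x) in attacks for d in s) for x in attackers)

    def complete(s):
        # admissible, and containing every argument it defends
        return (conflict_free(s)
                and all(defends(s, a) for a in s)
                and all(a in s for a in args if defends(s, a)))

    completes = [set(c) for r in range(len(args) + 1)
                 for c in combinations(args, r) if complete(set(c))]
    rng = {i: s | attacked_by(s) for i, s in enumerate(completes)}
    # keep complete extensions whose range is not strictly included
    # in the range of another complete extension
    return [s for i, s in enumerate(completes)
            if not any(rng[i] < rng[j] for j in rng)]

# Example: a and b attack each other, b attacks c, c attacks itself.
af_args = {"a", "b", "c"}
af_attacks = {("a", "b"), ("b", "a"), ("b", "c"), ("c", "c")}
```

On this framework the complete extensions are {}, {a} and {b}; only {b} has a maximal range {a, b, c}, so it is the unique semi-stable (here also stable) extension.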
Algorithms for paraconsistent reasoning with OWL
 In: Proceedings of the 4th European Semantic Web Conference (ESWC’07). LNCS
, 2007
Abstract
Cited by 48 (21 self)
In an open, constantly changing and collaborative environment like the forthcoming Semantic Web, it is reasonable to expect that knowledge sources will contain noise and inaccuracies. Practical reasoning techniques for ontologies will therefore have to be tolerant to this kind of data, including the ability to handle inconsistencies in a meaningful way. For this purpose, we employ paraconsistent reasoning based on four-valued logic, which is a classical method for dealing with inconsistencies in knowledge bases. Its transfer to OWL DL, however, necessitates fundamental design choices in dealing with class inclusion, which has resulted in differing proposals for paraconsistent description logics in the literature. In this paper, we build on one of the more general approaches, which due to its flexibility appears to be most promising for further investigations. We present two algorithms suitable for implementation: one based on a preprocessing step before invoking a classical OWL reasoner, the other based on a modification of the KAON2 transformation algorithms. We also report on our implementation, called ParOWL.
General Patterns for Nonmonotonic Reasoning: From Basic Entailments to Plausible Relations
 Logic Journal of the IGPL
, 1998
Abstract
Cited by 23 (5 self)
This paper has two goals. First, we develop frameworks for logical systems which are able to reflect not only nonmonotonic patterns of reasoning, but also paraconsistent reasoning. Our second goal is to have a better understanding of the conditions that a useful relation for nonmonotonic reasoning should satisfy. For this we consider a sequence of generalizations of the pioneering works of Gabbay, Kraus, Lehmann, Magidor and Makinson. These generalizations allow the use of monotonic non-classical logics as the underlying logic upon which nonmonotonic reasoning may be based. Our sequence of frameworks culminates in what we call (following Lehmann) plausible, nonmonotonic, multiple-conclusion consequence relations (which are based on a given monotonic one). Our study yields intuitive justifications for conditions that have been proposed in previous frameworks and also clarifies the connections among some of these systems. In addition, we present a general method for constructing plausible...
Paraconsistent Declarative Semantics for Extended Logic Programs
 Annals of Mathematics and Artificial Intelligence
, 2002
Abstract
Cited by 20 (3 self)
We introduce a fixpoint semantics for logic programs with two kinds of negation:...
Bilattices and Paraconsistency
, 1997
Abstract
Cited by 19 (7 self)
Bilattices are algebraic structures that were introduced by Ginsberg, and further examined by Fitting, as a general framework for many applications in computer science. In this paper we consider their applicability for computerized reasoning in general, and for reasoning with inconsistent data in particular.
1 Background
A great deal of research has been devoted in the last twenty years to constructing plausible paraconsistent systems. One of the pioneering works towards this purpose was that of Belnap, who introduced his well-known four-valued logic [Be77a, Be77b]. The idea is that in addition to the classical values t, f, two additional truth-values are introduced for intuitively representing incomplete knowledge. One, denoted here by ⊥, represents lack of knowledge; the other, denoted by ⊤, represents "over"-knowledge (conflicts). These four elements form a structure called FOUR (see Figure 1). The basic idea is that this structure is "two-dimensional"; each "dimension" corresponds to another ...
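The two dimensions of FOUR mentioned above can be made concrete with a small sketch (my own illustration, not code from the paper): encoding each of Belnap's four values as a pair (evidence-for, evidence-against) gives both the truth order (conjunction/disjunction) and the knowledge order (consensus/gullibility) as componentwise operations.

```python
# Belnap's FOUR, each value a pair (evidence-for, evidence-against):
# BOTTOM = no information, TOP = conflicting information.
BOTTOM, TRUE, FALSE, TOP = (0, 0), (1, 0), (0, 1), (1, 1)

def neg(a):
    # negation swaps evidence for and against
    return (a[1], a[0])

def conj(a, b):
    # meet in the truth order
    return (a[0] & b[0], a[1] | b[1])

def disj(a, b):
    # join in the truth order
    return (a[0] | b[0], a[1] & b[1])

def gullibility(a, b):
    # join in the knowledge order: accept all evidence from both sources
    return (a[0] | b[0], a[1] | b[1])

def consensus(a, b):
    # meet in the knowledge order: keep only evidence both sources share
    return (a[0] & b[0], a[1] & b[1])
```

Combining contradictory reports in the knowledge order yields ⊤ (gullibility) or ⊥ (consensus), while negation leaves both ⊤ and ⊥ fixed, which is exactly what makes the logic paraconsistent.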
Preference modelling
 State of the Art in Multiple Criteria Decision Analysis
, 2005
Abstract
Cited by 19 (1 self)
This paper provides the reader with a presentation of the fundamental notions of preference modelling as well as some recent results in this field. Preference modelling is an inevitable step in a variety of fields: economics, sociology, psychology, mathematical programming, even medicine, archaeology, and obviously decision analysis. Our notation and some basic definitions, such as those of binary relations, their properties and ordered sets, are presented at the beginning of the paper. We start by discussing different reasons for constructing a model of preference. We then go through a number of issues that influence the construction of preference models. Different formalisations besides classical logic, such as fuzzy sets and non-classical logics, become necessary. We then present different types of preference structures reflecting the behavior of a decision-maker: classical, extended and valued ones. It is relevant to have a numerical representation of preferences: functional representations, value functions. The concepts of thresholds and minimal representation are also introduced in this section. In section 7, we briefly explore the concept of deontic logic (logic of preference) and other formalisms associated with "compact representation of preferences" introduced for special purposes. We end the paper with some concluding remarks.
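The basic definitions the abstract refers to (binary relations and their properties) are easy to make operational; the following sketch, my own illustration rather than anything from the survey, checks the properties that characterize the classical preference model, a weak order.

```python
def is_reflexive(rel, xs):
    """Every element is related to itself."""
    return all((x, x) in rel for x in xs)

def is_complete(rel, xs):
    """Any two elements are comparable in at least one direction."""
    return all((x, y) in rel or (y, x) in rel for x in xs for y in xs)

def is_transitive(rel, xs):
    """x R y and y R z imply x R z."""
    return all((x, z) in rel
               for x in xs for y in xs for z in xs
               if (x, y) in rel and (y, z) in rel)

def is_weak_order(rel, xs):
    # complete + transitive: the classical model of preference
    return is_complete(rel, xs) and is_transitive(rel, xs)

# "at least as good as" over numeric alternatives is a weak order
alts = [1, 2, 3]
at_least = {(x, y) for x in alts for y in alts if x >= y}
```

A relation that leaves two alternatives incomparable (the starting point for the extended preference structures the paper discusses) fails the completeness test.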
A Simple and Expressive Semantic Framework for Policy Composition in Access Control
, 2007
Abstract
Cited by 16 (5 self)
In defining large, complex access control policies, one would like to compose sub-policies, perhaps authored by different organizations, into a single global policy. Existing policy composition approaches tend to be ad hoc, and do not explain whether too many or too few policy combinators have been defined. We define an access control policy as a four-valued predicate that maps accesses to either grant, deny, conflict, or unspecified. These correspond to the four elements of the Belnap bilattice. Functions on this bilattice are then extended to policies to serve as policy combinators. We argue that this approach provides a simple and natural semantic framework for policy composition, with a minimal but functionally complete set of policy combinators. We define derived, higher-level operators that are convenient for the specification of access control policies, and enable the decoupling of conflict resolution from policy composition. Finally, we propose a basic query language and show that it can reduce important analyses (e.g. conflict analysis) to checks of policy refinement.
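A minimal sketch of the idea (my own toy code, not the paper's formalism): policies map accesses to one of the four Belnap decisions, composition is the pointwise knowledge-order join, and conflict resolution is a separate, derived operator applied afterwards. The policy names below are invented for the example.

```python
GRANT, DENY, UNSPEC, CONFLICT = "grant", "deny", "unspecified", "conflict"

def join(d1, d2):
    """Knowledge-order join on the four decisions (Belnap's ⊕)."""
    if d1 == d2 or d2 == UNSPEC:
        return d1
    if d1 == UNSPEC:
        return d2
    return CONFLICT  # grant vs deny, or anything joined with conflict

def compose(p1, p2):
    """Pointwise composition; disagreements surface as CONFLICT."""
    return lambda access: join(p1(access), p2(access))

def deny_overrides(p):
    """Derived operator: resolve conflicts and gaps to DENY,
    decoupling conflict resolution from composition."""
    return lambda access: (DENY if p(access) in (CONFLICT, UNSPEC)
                           else p(access))

# two hypothetical sub-policies authored independently
hr_policy = lambda a: GRANT if a == "read" else UNSPEC
sec_policy = lambda a: DENY if a in ("read", "write") else UNSPEC

merged = compose(hr_policy, sec_policy)
final = deny_overrides(merged)
```

Composition records that the two authors disagree on "read" rather than silently picking a winner; only the explicit `deny_overrides` wrapper decides how conflicts are resolved.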
Measuring inconsistency for description logics based on paraconsistent semantics
 In: Khaled Mellouli, editor, Proceedings of the 9th European Conference on Symbolic and Quantitative Approaches to Reasoning with Uncertainty (ECSQARU 2007), Hammamet, Tunisia, October 31 – November 2, 2007
, 2007
Abstract
Cited by 14 (11 self)
Abstract. In this paper, we present an approach for measuring inconsistency in a knowledge base. We first define the degree of inconsistency using a fourvalued semantics for the description logicALC. Then an ordering over knowledge bases is given by considering their inconsistency degrees. Our measure of inconsistency can provide important information for inconsistency handling. 1
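To convey the flavor of such a degree-based measure on a much simpler setting than ALC (this is my own toy, not the paper's definition), one can take a set of propositional literals and compute the fraction of atoms that receive Belnap's ⊤, i.e. are asserted both positively and negatively.

```python
def inconsistency_degree(literals):
    """Toy measure: fraction of atoms asserted both positively and
    negatively ("-p" is the negation of "p"), i.e. atoms mapped to
    Belnap's TOP in the induced four-valued interpretation.
    NOT the ALC measure from the paper."""
    atoms = {lit.lstrip("-") for lit in literals}
    conflicted = {a for a in atoms
                  if a in literals and "-" + a in literals}
    return len(conflicted) / len(atoms) if atoms else 0.0
```

Knowledge bases can then be ordered by this number: a base with degree 0.5 is judged more inconsistent than one with degree 0.0, which is the kind of ordering the abstract describes.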
An epistemic foundation of stable model semantics
, 2003
Abstract
Cited by 13 (3 self)
The stable model semantics has become a dominating approach for the management of negation in logic programming. It relies mainly on the closed world assumption to complete the available knowledge, and its formulation has its founding root in the so-called Gelfond-Lifschitz transform. The primary goal of this work is to present an alternative, epistemic-based characterization of the stable model semantics to the Gelfond-Lifschitz transform. In particular, we show that the stable model semantics can be defined entirely as an extension of the Kripke-Kleene semantics and, thus, (i) does rely on the classical management of negation; and (ii) does not require any program transformation. Indeed, we show that the closed world assumption can be seen as an additional source for 'falsehood' to be added cumulatively to the Kripke-Kleene semantics. Our approach is purely algebraic and can abstract from the particular formalism of choice as it is based on monotone operators (under the knowledge order) over bilattices only.
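For reference, the Gelfond-Lifschitz transform that this abstract contrasts with can be sketched for propositional programs in a few lines (a standard textbook construction, not the paper's alternative characterization; the rule encoding is my own): guess a set of atoms, build the reduct, and keep the guess if it equals the least model of the reduct.

```python
from itertools import chain, combinations

def reduct(program, model):
    """Gelfond-Lifschitz reduct: drop rules whose negative body
    intersects the model; strip negation from the remaining rules.
    A rule (head, pos, neg) encodes  head :- pos, not neg."""
    return [(h, pos) for (h, pos, neg) in program
            if not (set(neg) & model)]

def least_model(definite):
    """Least model of a negation-free program by fixpoint iteration."""
    m, changed = set(), True
    while changed:
        changed = False
        for h, pos in definite:
            if set(pos) <= m and h not in m:
                m.add(h)
                changed = True
    return m

def stable_models(program, atoms):
    """Brute force: a guess is stable iff it is the least model
    of its own reduct. Exponential; illustration only."""
    guesses = map(set, chain.from_iterable(
        combinations(atoms, r) for r in range(len(atoms) + 1)))
    return [m for m in guesses if least_model(reduct(program, m)) == m]

# p :- not q.   q :- not p.
prog = [("p", [], ["q"]), ("q", [], ["p"])]
models = stable_models(prog, ["p", "q"])
```

The classic two-rule program above has exactly the two stable models {p} and {q}; the epistemic characterization in the paper recovers the same models without performing the reduct step.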