Results 1–10 of 116
Computing Discrete Fréchet Distance
1994
Cited by 51 (0 self)
Abstract:
The Fréchet distance between two curves in a metric space is a measure of the similarity between the curves. We present a discrete variation of this measure. It provides ...
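The coupling-distance recurrence behind the discrete Fréchet distance is short enough to sketch. The following is an illustrative Python rendering of the standard dynamic program (function name and point representation are my own, not code from the paper): curves are lists of points, and each cell keeps the cheapest coupling that pairs the two prefixes.

```python
import math
from functools import lru_cache

def discrete_frechet(P, Q):
    """Discrete Fréchet (coupling) distance between polygonal curves P and Q,
    each a list of (x, y) tuples, via the standard O(len(P) * len(Q)) DP."""
    @lru_cache(maxsize=None)
    def c(i, j):
        dij = math.dist(P[i], Q[j])  # cost of pairing P[i] with Q[j]
        if i == 0 and j == 0:
            return dij
        if i == 0:
            return max(c(0, j - 1), dij)   # can only advance along Q
        if j == 0:
            return max(c(i - 1, 0), dij)   # can only advance along P
        # advance along P, along Q, or along both; keep the cheapest coupling
        return max(min(c(i - 1, j), c(i - 1, j - 1), c(i, j - 1)), dij)

    return c(len(P) - 1, len(Q) - 1)
```

For two parallel polylines sampled at matching points, the distance is simply the gap between them, since the optimal coupling pairs corresponding samples.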
Debugging and semantic clarification by pinpointing
2005
Cited by 49 (4 self)
Abstract:
Ontologies are the backbone of the Semantic Web, as they allow one to share vocabulary in a semantically sound way. For ontologies specified in OWL or a related web ontology language, Description Logic reasoners can often detect logical contradictions. Unfortunately, there are two drawbacks: they lack support for debugging incoherence in ontologies, and they can only be applied to reasonably expressive ontologies (containing at least some sort of negation). In this paper, we attempt to close these gaps using a technique called pinpointing. In pinpointing we identify minimal sets of axioms which need to be removed or ignored to make an ontology coherent. We then show how pinpointing can be used for debugging of web ontologies in two typical cases. More unusual is the application of pinpointing to the semantic clarification of underspecified web ontologies, which we experimentally evaluate on a number of well-known web ontologies. Our findings are encouraging: even though semantic ambiguity remains an issue, we show that pinpointing can be useful for debugging, and that it can significantly improve the quality of our semantic enrichment in a fully automatic way.
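The core of pinpointing, identifying minimal sets of axioms whose removal restores coherence, can be illustrated without a Description Logic reasoner. The sketch below is my own toy encoding, not the authors' algorithm: "axioms" are numeric constraints on a single integer, and consistency is checked by brute force over a small domain.

```python
from itertools import combinations

def minimal_conflict_sets(axioms, consistent):
    """Enumerate minimal inconsistent subsets of `axioms` by increasing size.
    `consistent` is an oracle over a set of axioms. Exponential brute force:
    fine for illustration, not for real ontologies."""
    minimal = []
    for size in range(1, len(axioms) + 1):
        for combo in combinations(axioms, size):
            s = frozenset(combo)
            # any superset of an already-found minimal conflict is not minimal
            if any(m <= s for m in minimal):
                continue
            if not consistent(s):
                minimal.append(s)
    return minimal

# Toy "ontology": axioms are named constraints on one integer x.
axioms = (
    ("x > 0", lambda x: x > 0),
    ("x < 0", lambda x: x < 0),
    ("x < 5", lambda x: x < 5),
)

def consistent(subset):
    return any(all(pred(x) for _, pred in subset) for x in range(-10, 11))

conflicts = minimal_conflict_sets(axioms, consistent)
# Removing or ignoring one axiom per minimal conflict makes the base coherent.
```

Here the only minimal conflict is {x > 0, x < 0}; dropping either member leaves a coherent base, which is exactly the repair step pinpointing supports.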
Approximating OWL DL ontologies
In: Proc. of AAAI 2007 (In Press), 2006
Cited by 47 (15 self)
Abstract:
In this poster, we propose to recast the idea of knowledge compilation into approximating OWL DL ontologies with DL-Lite ontologies, against which query answering has only polynomial data complexity. We identify a useful category of queries for which our approach also guarantees completeness. Furthermore, we report on the implementation of our approach in the ONTOSEARCH2 system.
How Hard is it to Revise a Belief Base?
1996
Cited by 43 (0 self)
Abstract:
If a new piece of information contradicts our previously held beliefs, we have to revise our beliefs. This problem of belief revision arises in a number of areas in Computer Science and Artificial Intelligence, e.g., in updating logical databases, in hypothetical reasoning, and in machine learning. Most of the research in this area is influenced by work in philosophical logic, in particular by Gärdenfors and his colleagues, who developed the theory of belief revision. Here we focus on the computational aspects of this theory, surveying results that address the issue of the computational complexity of belief revision.
Measuring inconsistency in knowledge via quasi-classical models
In: Proceedings of the National Conference on Artificial Intelligence (AAAI’02), 2002
Cited by 39 (16 self)
Abstract:
The language for describing inconsistency is underdeveloped. If a knowledgebase (a set of formulae) is inconsistent, we need more illuminating ways to say how inconsistent it is, or to say whether one knowledgebase is “more inconsistent” than another. To address this, we provide a general characterization of inconsistency based on quasi-classical logic (a form of paraconsistent logic with a more expressive semantics than Belnap’s four-valued logic, and one which, unlike other paraconsistent logics, allows the connectives to appear to behave as classical connectives). We analyse inconsistent knowledge by considering the conflicts arising in the minimal quasi-classical models for that knowledge. This is used for a measure of coherence for each knowledgebase, and for a preference ordering, called the compromise relation, over knowledgebases. In this paper, we formalize this framework and consider applications in managing heterogeneous sources of knowledge.
Approaches to measuring inconsistent information
In: Inconsistency Tolerance, Volume 3300 of Lecture Notes in Computer Science, 2005
Cited by 36 (9 self)
Abstract:
Measures of the quantity of information have been studied extensively for more than fifty years. The seminal work on information theory is by Shannon [67]. This work, based on probability theory, can be used in a logical setting when the worlds are the possible events. It is also the basis of Lozinskii’s work [48] for defining the quantity of information of a formula (or knowledgebase) in propositional logic. But this definition is not suitable when the knowledgebase is inconsistent. In this case, it has no classical model, so we have no “event” to count. This is a shortcoming, since in practical applications (e.g. databases) it often happens that the knowledgebase is not consistent. And it is definitely not true that all inconsistent knowledgebases contain the same (null) amount of information, as given by “classical information theory”. As explored for several years in the paraconsistent logic community, two inconsistent knowledgebases can lead to very different conclusions, showing that they do not convey the same information. There has been some ...
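The Shannon-style measure this abstract alludes to can be sketched for propositional logic: over n atoms, a consistent knowledgebase carries n − log₂(#models) bits, and an inconsistent one has no models left to count. The clause encoding and function name below are my own illustration, not Lozinskii's exact formulation.

```python
import math
from itertools import product

def information_content(clauses, atoms):
    """Shannon-style information of a CNF knowledgebase: n - log2(#models).
    `clauses` is a list of clauses; each clause is a list of
    (atom, wanted_truth_value) literals. Returns None for an inconsistent
    base, where the classical measure breaks down."""
    n = len(atoms)
    count = 0
    for values in product([False, True], repeat=n):
        model = dict(zip(atoms, values))
        if all(any(model[a] == v for a, v in clause) for clause in clauses):
            count += 1
    if count == 0:
        return None  # no classical model: no "event" to count
    return n - math.log2(count)
```

An empty base rules nothing out and carries 0 bits; pinning down one of two atoms carries 1 bit; a contradictory base falls outside the measure entirely, which is the gap the chapter surveys.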
Semantics and complexity of abduction from default theories (Extended Abstract)
In: Artificial Intelligence, 1997
Cited by 28 (2 self)
Abstract:
Since logical knowledge representation is commonly based on non-classical formalisms like default logic, autoepistemic logic, or circumscription, it is necessary to perform abductive reasoning from theories of non-classical logics. In this paper, we investigate how abduction can be performed from theories in default logic. Different modes of abduction are plausible, based on credulous and skeptical default reasoning; they appear useful for different applications such as diagnosis and planning. Moreover, we analyze the complexity of the main abductive reasoning tasks. They are intractable in the general case; we also present known classes of default theories for which abduction is tractable.
On the approximation of instance level update and erasure in description logics
In: Proc. of AAAI 2007, 2007
Cited by 25 (5 self)
Abstract:
A Description Logic knowledge base consists of two components, called TBox and ABox, where the former expresses general knowledge about the concepts and their relationships, and the latter describes the properties of instances of concepts. We address the problem of how to deal with changes to a Description Logic knowledge base when these changes affect only its ABox. We consider two types of changes, namely update and erasure, and we characterize the semantics of these operations on the basis of the approaches proposed by Winslett and by Katsuno and Mendelzon. It is well known that, in general, Description Logics are not closed with respect to updates, in the sense that the set of models corresponding to an update applied to a knowledge base in a Description Logic L may not be expressible by ABoxes in L. We show that this is true also for erasure. To deal with this problem, we introduce the notion of best approximation of an update (erasure) in a DL L, with the goal of characterizing the L ABoxes that capture the update (erasure) at best. We then focus on DL-LiteF, a tractable Description Logic, and present polynomial algorithms for computing the best approximation of updates and erasures in this logic, which shows that the nice computational properties of DL-LiteF are retained in dealing with the evolution of the ABox.
Measuring Inconsistency through Minimal Inconsistent Sets
Cited by 24 (3 self)
Abstract:
In this paper, we explore the links between measures of inconsistency for a belief base and the minimal inconsistent subsets of that belief base. The minimal inconsistent subsets can be considered the relevant part of the base to take into account when evaluating the amount of inconsistency. We define a very natural inconsistency value from these minimal inconsistent sets. Then we show that the inconsistency value we obtain is a particular Shapley Inconsistency Value, and we provide a complete axiomatization of this value in terms of five simple and intuitive axioms. Defining this Shapley Inconsistency Value using the notion of minimal inconsistent subsets allows us to look forward to a viable implementation of this value using SAT solvers.
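A natural inconsistency value of the kind this abstract describes distributes blame across formulae: each minimal inconsistent set M contributes 1/|M| to every formula it contains, so small conflicts weigh more heavily. A minimal sketch, assuming the minimal inconsistent subsets have already been computed (e.g. with a SAT solver, as the abstract suggests); the function name and toy data are mine:

```python
def mi_inconsistency_value(mi_sets, formula):
    """Blame assigned to `formula`: each minimal inconsistent set containing it
    contributes 1 / |set|. Formulae outside every conflict score 0."""
    return sum(1.0 / len(m) for m in mi_sets if formula in m)

# Two minimal inconsistent subsets over formulae named a..d.
mi_sets = [frozenset({"a", "b"}), frozenset({"a", "c", "d"})]
blame = {f: mi_inconsistency_value(mi_sets, f) for f in "abcd"}
# "a" sits in both conflicts (1/2 + 1/3); "b" only in the two-element one (1/2).
```

A pleasant sanity check on this weighting: the per-formula values always sum to the number of minimal inconsistent sets, since each set distributes exactly one unit of blame among its members.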