Results 1 - 10 of 958
Data Integration: A Theoretical Perspective
- Symposium on Principles of Database Systems, 2002
Abstract - Cited by 965 (45 self)
Data integration is the problem of combining data residing at different sources and providing the user with a unified view of these data. The problem of designing data integration systems is important in current real-world applications, and is characterized by a number of issues that are interesting from a theoretical point of view. This document presents an overview of the material to be presented in a tutorial on data integration. The tutorial focuses on some of the theoretical issues that are relevant for data integration. Special attention will be devoted to the following aspects: modeling a data integration application, processing queries in data integration, dealing with inconsistent data sources, and reasoning on queries.
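The modeling aspect mentioned above is commonly framed in terms of global-as-view (GAV) or local-as-view (LAV) mappings. A minimal GAV sketch, where every global relation is defined as a view over the sources and query answering reduces to unfolding (all schema and table names below are invented for illustration):

```python
# Minimal sketch of global-as-view (GAV) query unfolding, a standard
# technique in data integration. All relation and table names here are
# hypothetical examples, not from any real system.

# GAV mapping: each global relation is defined by a query over sources.
MAPPING = {
    "Movie": "SELECT title, year FROM src1_films",
    "Review": "SELECT title, score FROM src2_ratings",
}

def unfold(global_relation: str) -> str:
    """Replace a global relation by the source query that defines it."""
    return "(" + MAPPING[global_relation] + ")"

def answer_query(relations):
    """Unfold a conjunctive query over global relations into a join over
    source views (natural join on shared column names is assumed)."""
    views = [f"{unfold(r)} AS {r}" for r in relations]
    return "SELECT * FROM " + " NATURAL JOIN ".join(views)

sql = answer_query(["Movie", "Review"])
```

Unfolding is what makes GAV query processing conceptually simple; the LAV setting instead requires answering queries using views, which is where much of the theoretical difficulty discussed in the tutorial lies.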
A software framework for matchmaking based on semantic web technology
2003
Abstract - Cited by 398 (5 self)
An important objective of the Semantic Web is to make Electronic Commerce interactions more flexible and automated. To achieve this, standardisation of ontologies, message content and message protocols will be necessary. In this paper we investigate how Semantic Web and Web Services technologies can be used to support service advertisement and discovery in e-commerce. In particular, we describe the design and implementation of a service matchmaking prototype which uses a DAML-S based ontology and a Description Logic reasoner to compare ontology-based service descriptions. By representing the semantics of service descriptions, the matchmaker enables the behaviour of an intelligent agent to approach more closely that of a human user trying to locate suitable web services. We also present the results of initial experiments testing the performance of this prototype implementation in a realistic agent-based e-commerce scenario.
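The subsumption-based comparison described above is usually reported as a degree of match. A toy sketch of that idea (the concept hierarchy is invented, and the degree names follow common DL-matchmaker conventions rather than this prototype's actual code):

```python
# Toy sketch of DL-style service matchmaking. The ontology below is
# invented for illustration. The match degree between an advertised
# output concept and a requested one is derived from subsumption:
# exact, plugin (advert more specific), subsumes (advert more general),
# or fail.

PARENT = {  # tiny subsumption hierarchy: child concept -> parent
    "Sedan": "Car", "SUV": "Car", "Car": "Vehicle", "Truck": "Vehicle",
}

def ancestors(c):
    """Yield every concept above c in the hierarchy."""
    while c in PARENT:
        c = PARENT[c]
        yield c

def subsumes(general, specific):
    return general == specific or general in ancestors(specific)

def match_degree(advertised, requested):
    if advertised == requested:
        return "exact"
    if subsumes(requested, advertised):
        return "plugin"     # advert is more specific than the request
    if subsumes(advertised, requested):
        return "subsumes"   # advert is more general than the request
    return "fail"
```

A real matchmaker delegates the `subsumes` check to a Description Logic reasoner over the full service ontology instead of a hand-written parent table.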
Swoogle: A search and metadata engine for the semantic web
- In Proceedings of the Thirteenth ACM Conference on Information and Knowledge Management, 2004
Abstract - Cited by 284 (26 self)
Swoogle is a crawler-based indexing and retrieval system for Semantic Web documents, i.e., RDF or OWL documents. It analyzes the documents it discovers to compute useful metadata properties and relationships between them. The documents are also indexed using an information retrieval system that can use either character N-grams or URIs as terms to find documents matching a user’s query or to compute the similarity among a set of documents. One of the interesting properties computed for each Semantic Web document is its rank, a measure of the document’s importance on the Semantic Web.
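The character-N-gram indexing mentioned above can be sketched as a simple in-memory inverted index (a simplification for illustration, not Swoogle's implementation):

```python
# Sketch of character-N-gram indexing: each document is reduced to its
# set of character trigrams, and queries are ranked by trigram overlap.
# The index structure and document ids are invented for illustration.
from collections import defaultdict

def ngrams(text, n=3):
    text = text.lower()
    return {text[i:i + n] for i in range(len(text) - n + 1)}

class NGramIndex:
    def __init__(self, n=3):
        self.n = n
        self.postings = defaultdict(set)  # n-gram -> set of doc ids

    def add(self, doc_id, text):
        for g in ngrams(text, self.n):
            self.postings[g].add(doc_id)

    def search(self, query):
        """Rank documents by how many query n-grams they contain."""
        scores = defaultdict(int)
        for g in ngrams(query, self.n):
            for doc in self.postings[g]:
                scores[doc] += 1
        return sorted(scores, key=scores.get, reverse=True)

idx = NGramIndex()
idx.add("d1", "semantic web ontology")
idx.add("d2", "relational database systems")
```

N-gram terms make the index robust to the unusual tokens found in RDF documents (URIs, namespace prefixes), at the cost of a larger vocabulary than word-level indexing.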
DL-Lite: Tractable description logics for ontologies
- In Proc. of AAAI 2005
Abstract - Cited by 211 (49 self)
We propose a new Description Logic, called DL-Lite, specifically tailored to capture basic ontology languages, while keeping low complexity of reasoning. Reasoning here means not only computing subsumption between concepts, and checking satisfiability of the whole knowledge base, but also answering complex queries (in particular, conjunctive queries) over the set of instances maintained in secondary storage. We show that in DL-Lite the usual DL reasoning tasks are polynomial in the size of the TBox, and query answering is polynomial in the size of the ABox (i.e., in data complexity). To the best of our knowledge, this is the first result of polynomial data complexity for query answering over DL knowledge bases. A notable feature of our logic is to allow for a separation between TBox and ABox reasoning during query evaluation: the part of the process requiring TBox reasoning is independent of the ABox, and the part of the process requiring access to the ABox can be carried out by an SQL engine, thus taking advantage of the query optimization strategies provided by current DBMSs.
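The separation described above, where TBox reasoning produces a rewriting that an SQL engine then evaluates over the ABox, can be illustrated on a toy fragment with atomic concept inclusions only (names are invented; the actual DL-Lite rewriting algorithm also handles roles and existential quantification):

```python
# Illustrative sketch of TBox-driven query rewriting in the style of
# DL-Lite query answering. Restricted to atomic concept inclusions;
# all concept and table names are hypothetical.

# TBox: atomic inclusions, e.g. Professor is subsumed by Teacher.
TBOX = [("Professor", "Teacher"), ("Lecturer", "Teacher")]

def rewrite(concept):
    """Return every concept whose instances answer the query ?x:concept,
    i.e. the concept itself plus all concepts the TBox places below it."""
    result, frontier = {concept}, [concept]
    while frontier:
        c = frontier.pop()
        for sub, sup in TBOX:
            if sup == c and sub not in result:
                result.add(sub)
                frontier.append(sub)
    return result

def to_sql(concept):
    """Compile the rewriting into a UNION over per-concept tables, so
    the ABox can live in an ordinary relational database untouched."""
    parts = sorted(f"SELECT id FROM {c}" for c in rewrite(concept))
    return " UNION ".join(parts)
```

The key point the abstract makes is visible even in this sketch: the rewriting depends only on the TBox, so its size is independent of the data, and the data-dependent work is exactly the SQL evaluation.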
The DL-Lite family and relations
- Journal of Artificial Intelligence Research (JAIR), 2009
Abstract - Cited by 201 (70 self)
The recently introduced series of description logics under the common moniker ‘DL-Lite’ has attracted the attention of the description logic and semantic web communities due to the low computational complexity of inference, on the one hand, and the ability to represent conceptual modeling formalisms, on the other. The main aim of this article is to carry out a thorough and systematic investigation of inference in extensions of the original DL-Lite logics along five axes: (i) adding Boolean connectives and (ii) number restrictions to concept constructs, (iii) allowing role hierarchies, (iv) allowing role disjointness, symmetry, asymmetry, reflexivity, irreflexivity and transitivity constraints, and (v) adopting or dropping the unique name assumption. We analyze the combined complexity of satisfiability for the resulting logics, as well as the data complexity of instance checking and answering positive existential queries. Our approach is based on embedding DL-Lite logics in suitable fragments of one-variable first-order logic, which provides useful insights into their properties and, in particular, their computational behavior.
Automated analysis of feature models 20 years later: A literature review
- Information Systems, 2010
Abstract - Cited by 186 (20 self)
Software product line engineering is about producing a set of related products that share more commonalities than variabilities. Feature models are widely used for variability and commonality management in software product lines. Feature models are information models in which a set of products is represented as sets of features in a single model. The automated analysis of feature models deals with the computer-aided extraction of information from feature models. The literature on this topic has contributed a set of operations, techniques, tools and empirical results which had not been surveyed until now. This paper provides a comprehensive literature review on the automated analysis of feature models 20 years after their invention. It contributes by bringing together previously disparate streams of work to help shed light on this thriving area. We also present a conceptual framework to understand the different proposals as well as to categorise future contributions. We finally discuss the different studies and propose some challenges to be faced in the future.
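The kind of analysis operation surveyed above (e.g. counting the valid products of a model) can be illustrated on a deliberately tiny, invented feature model; real tools encode the model for SAT, CSP or BDD solvers rather than enumerating configurations:

```python
# Minimal sketch of automated feature-model analysis. The feature model
# below (screen mandatory, gps excludes media) is invented for
# illustration; brute-force enumeration stands in for a SAT solver.
from itertools import product as cartesian

FEATURES = ["screen", "gps", "media"]

def valid(cfg):
    mandatory = cfg["screen"]                     # screen is mandatory
    excludes = not (cfg["gps"] and cfg["media"])  # gps excludes media
    return mandatory and excludes

def all_products():
    """Enumerate every feature selection satisfying the model."""
    return [
        cfg
        for bits in cartesian([False, True], repeat=len(FEATURES))
        for cfg in [dict(zip(FEATURES, bits))]
        if valid(cfg)
    ]
```

Operations such as "number of products", "dead feature detection" or "valid configuration" all reduce to queries against this boolean encoding, which is why solver-backed approaches dominate the surveyed literature.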
Automatic Composition of e-Services that Export their Behavior
- In 1st Intl. Conference on Service Oriented Computing, 2003
Abstract - Cited by 180 (22 self)
The main focus of this paper is on automatic e-Service composition. We start by developing a framework in which the exported behavior of an e-Service is described in terms of its possible executions (execution trees). Then we specialize the framework to the case in which such exported behavior (i.e., the execution tree of the e-Service) is represented by a finite state machine. In this specific setting, we analyze the complexity of synthesizing a composition, and develop sound and complete algorithms to check the existence of a composition and to return one such composition if it exists. To the best of our knowledge, our work is the first attempt to provide an algorithm for the automatic synthesis of e-Service compositions that is both proven correct and accompanied by a computational complexity characterization.
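The existence check for a composition can be sketched as a fixpoint computation over the product of the target machine and the component machines: a joint state survives only if every action the target offers can be delegated to some component that ends up in a surviving joint state. This is a simplified reconstruction under toy assumptions (deterministic machines, dictionary-encoded transitions, invented example services), not the paper's algorithm verbatim:

```python
# Sketch of a composition-existence check for FSM-described services.
# Each FSM is {"start": s, "delta": {state: {action: next_state}}}.
# All machines and names below are invented for illustration.
from itertools import product as cart

def composition_exists(target, components):
    """Decide whether the components can jointly realize every run of
    the (deterministic) target FSM, by a greatest-fixpoint pruning of
    joint states from which some target action cannot be delegated."""
    t_states = list(target["delta"])
    c_states = [list(c["delta"]) for c in components]
    good = set(cart(t_states, *c_states))  # optimistically all joint states
    changed = True
    while changed:
        changed = False
        for js in list(good):
            t, rest = js[0], js[1:]
            for action, t_next in target["delta"][t].items():
                # some component must perform `action` and land in a
                # joint state that itself survives
                ok = any(
                    action in components[i]["delta"][rest[i]]
                    and (t_next,) + rest[:i]
                        + (components[i]["delta"][rest[i]][action],)
                        + rest[i + 1:] in good
                    for i in range(len(components))
                )
                if not ok:
                    good.discard(js)
                    changed = True
                    break
    start = (target["start"],) + tuple(c["start"] for c in components)
    return start in good

# Invented example: the target alternates freely between actions a and
# b; one component can only perform a, the other only b.
tgt = {"start": "t0", "delta": {"t0": {"a": "t0", "b": "t0"}}}
comp_a = {"start": "s0", "delta": {"s0": {"a": "s0"}}}
comp_b = {"start": "u0", "delta": {"u0": {"b": "u0"}}}
```

Extracting an actual orchestrator from the surviving states (which component serves which action where) is the synthesis step the paper's algorithms additionally provide.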
Modular Reuse of Ontologies: Theory and Practice
- JAIR, 2008
Abstract - Cited by 139 (22 self)
In this paper, we propose a set of tasks that are relevant for the modular reuse of ontologies. In order to formalize these tasks as reasoning problems, we introduce the notions of conservative extension, safety and module for a very general class of logic-based ontology languages. We investigate the general properties of and relationships between these notions, and study the relationships between the relevant reasoning problems we have previously identified. To study the computability of these problems, we consider, in particular, Description Logics (DLs), which provide the formal underpinning of the W3C Web Ontology Language (OWL), and show that all the problems we consider are undecidable or algorithmically unsolvable for the description logic underlying OWL DL. In order to achieve a practical solution, we identify conditions sufficient for an ontology to reuse a set of symbols “safely”, that is, without changing their meaning. We provide the notion of a safety class, which characterizes any sufficient condition for safety, and identify a family of safety classes, called locality, which enjoys a collection of desirable properties. We use the notion of a safety class to extract modules from ontologies, and we provide various modularization algorithms that are appropriate to the properties of the particular safety class in use. Finally, we show the practical benefits of our safety checking and module extraction algorithms.
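The locality idea behind module extraction can be illustrated on ontologies restricted to atomic inclusions, a drastic simplification of the paper's setting (the ontology and signature below are invented):

```python
# Toy illustration of locality-based module extraction, restricted to
# atomic inclusions A ⊑ B. Such an axiom is "local" w.r.t. a signature
# when A is outside the signature: interpreting A as the empty concept
# then satisfies the axiom trivially, so it cannot affect the meaning
# of the signature's symbols. The module collects non-local axioms,
# enlarging the signature until a fixpoint is reached.

def extract_module(axioms, signature):
    sig = set(signature)
    module = set()
    changed = True
    while changed:
        changed = False
        for sub, sup in axioms:
            if (sub, sup) not in module and sub in sig:  # not local
                module.add((sub, sup))
                sig.add(sup)  # the axiom pulls its superclass into sig
                changed = True
    return module

# Invented example ontology:
ONTOLOGY = [("Cat", "Mammal"), ("Mammal", "Animal"), ("Rock", "Thing")]
```

For full OWL axioms, syntactic locality tests the axiom's structure against the signature rather than a single subclass name, but the fixpoint shape of the extraction is the same.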
Hypertableau Reasoning for Description Logics
- Journal of Artificial Intelligence Research, 2007
Abstract - Cited by 132 (26 self)
We present a novel reasoning calculus for the description logic SHOIQ+, a knowledge representation formalism with applications in areas such as the Semantic Web. Unnecessary nondeterminism and the construction of large models are two primary sources of inefficiency in the tableau-based reasoning calculi used in state-of-the-art reasoners. In order to reduce nondeterminism, we base our calculus on hypertableau and hyperresolution calculi, which we extend with a blocking condition to ensure termination. In order to reduce the size of the constructed models, we introduce anywhere pairwise blocking. We also present an improved nominal introduction rule that ensures termination in the presence of nominals, inverse roles, and number restrictions, a combination of DL constructs that has proven notoriously difficult to handle. Our implementation shows significant performance improvements over state-of-the-art reasoners on several well-known ontologies.