Results 1-10 of 107
Learning Trees and Rules with Set-valued Features
, 1996
Abstract

Cited by 210 (2 self)
In most learning systems examples are represented as fixed-length "feature vectors", the components of which are either real numbers or nominal values. We propose an extension of the feature-vector representation that allows the value of a feature to be a set of strings; for instance, to represent a small white and black dog with the nominal features size and species and the set-valued feature color, one might use a feature vector with size=small, species=canis-familiaris and color={white, black}. Since we make no assumptions about the number of possible set elements, this extension of the traditional feature-vector representation is closely connected to Blum's "infinite attribute" representation. We argue that many decision tree and rule learning algorithms can be easily extended to set-valued features. We also show by example that many real-world learning problems can be efficiently and naturally represented with set-valued features; in particular, text categorization problems and probl...
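The abstract's dog example can be sketched directly: a minimal illustration (not the paper's code; the `matches` helper and the ⊇-style test for set-valued conditions are this sketch's assumptions) of how a rule condition might treat nominal features by equality and set-valued features by containment.

```python
# Sketch of the set-valued feature-vector idea: each example maps feature
# names to either a nominal value (string) or a set of strings.

def matches(condition, example):
    """Hypothetical rule test: a nominal condition requires equality,
    a set-valued condition requires the example's set to contain it."""
    for feat, required in condition.items():
        value = example[feat]
        if isinstance(required, set):
            if not required <= value:   # set-valued: containment test
                return False
        elif value != required:         # nominal: equality test
            return False
    return True

dog = {"size": "small",
       "species": "canis-familiaris",
       "color": {"white", "black"}}

rule = {"size": "small", "color": {"white"}}
print(matches(rule, dog))   # True: the dog's color set contains "white"
```

Because set elements are unbounded strings, a learner never has to enumerate the attribute space up front, which is the connection to Blum's infinite-attribute model mentioned above.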
Computing Least Common Subsumers in Description Logics with Existential Restrictions
, 1999
Abstract

Cited by 119 (29 self)
Computing the least common subsumer (lcs) is an inference task that can be used to support the "bottom-up" construction of knowledge bases for KR systems based on description logics. Previous work on how to compute the lcs has concentrated on description logics that allow for universal value restrictions, but not for existential restrictions. The main new contribution of this paper is the treatment of description logics with existential restrictions. Our approach for computing the lcs is based on an appropriate representation of concept descriptions by certain trees, and a characterization of subsumption by homomorphisms between these trees. The lcs operation then corresponds to the product operation on trees.
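The "lcs = product of description trees" idea admits a compact sketch. Below, a concept description is modeled as a tree whose node carries a set of concept names and whose edges are (role, subtree) pairs; the data structures and the example concepts are hypothetical illustrations under EL-style assumptions, not the paper's implementation.

```python
# Hypothetical description tree: (set_of_concept_names, [(role, subtree), ...])

def product(t1, t2):
    """Product of two description trees: intersect the node labels and
    recurse on every pair of outgoing edges that share a role name."""
    labels1, edges1 = t1
    labels2, edges2 = t2
    edges = [(r1, product(c1, c2))
             for (r1, c1) in edges1
             for (r2, c2) in edges2
             if r1 == r2]
    return (labels1 & labels2, edges)

# C = Person AND (EXISTS child. Doctor)
# D = Person AND Rich AND (EXISTS child. (Doctor AND Tall))
C = ({"Person"}, [("child", ({"Doctor"}, []))])
D = ({"Person", "Rich"}, [("child", ({"Doctor", "Tall"}, []))])

print(product(C, D))   # ({'Person'}, [('child', ({'Doctor'}, []))])
```

The result reads back as Person AND (EXISTS child. Doctor): the most specific description subsuming both inputs, mirroring the product operation described in the abstract.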
Least Common Subsumers and Most Specific Concepts in a Description Logic with Existential Restrictions and Terminological Cycles
, 2003
Abstract

Cited by 94 (18 self)
Computing least common subsumers (lcs) and most specific concepts (msc) are inference tasks that can support the bottom-up construction of knowledge bases in description logics. In description logics with existential restrictions, the most specific concept need not exist if one restricts the attention to concept descriptions or acyclic TBoxes. In this paper, we extend the notions lcs and msc to cyclic TBoxes. For the description logic EL (which allows for conjunctions, existential restrictions, and the top-concept), we show that the lcs and msc always exist and can be computed in polynomial time if we interpret cyclic definitions with greatest fixpoint semantics.
Computing the Least Common Subsumer w.r.t. a Background Terminology
 Journal of Applied Logic
, 2004
Abstract

Cited by 50 (10 self)
Methods for computing the least common subsumer (lcs) are usually restricted to rather inexpressive DLs, whereas existing knowledge bases are written in very expressive DLs. In order to allow the user to reuse concepts defined in such terminologies and still support the definition of new concepts by computing the lcs, we extend the notion of the lcs of concept descriptions to the notion of the lcs w.r.t. a background terminology.
Rewriting concepts using terminologies
 Proceedings of the Seventh International Conference on Knowledge Representation and Reasoning (KR2000
, 2000
Abstract

Cited by 45 (6 self)
The problem of rewriting a concept given a terminology can informally be stated as follows: given a terminology T (i.e., a set of concept definitions) and a concept description C that does not contain concept names defined in T, can this description be rewritten into a "related better" description E by using (some of) the names defined in T? In this paper, we first introduce a general framework for the rewriting problem in description logics, and then concentrate on one specific instance of the framework, namely the minimal rewriting problem (where "better" means shorter, and "related" means equivalent). We investigate the complexity of the decision problem induced by the minimal rewriting problem for the languages FL0, ALN, ALE, and ALC, and then introduce an algorithm for computing (minimal) rewritings for the language ALE. (In the full paper, a similar algorithm is also developed for ALN.) Finally, we sketch other interesting instances of the framework.
Specific-to-General Learning for Temporal Events with Application to Learning . . .
 Journal of Artificial Intelligence Research
, 2002
Abstract

Cited by 36 (4 self)
We develop, analyze, and evaluate a novel, supervised, specific-to-general learner for a simple temporal logic and use the resulting algorithm to learn visual event definitions from video sequences. First, we introduce a simple, propositional, temporal, event-description language called AMA that is sufficiently expressive to represent many events yet sufficiently restrictive to support learning. We then give algorithms, along with lower and upper complexity bounds, for the subsumption and generalization problems for AMA formulas. We present a positive-examples-only specific-to-general learning method based on these algorithms. We also present a polynomial-time-computable "syntactic" subsumption test that implies semantic subsumption without being equivalent to it. A generalization algorithm based on syntactic subsumption can be used in place of semantic generalization to improve the asymptotic complexity of the resulting learning algorithm. Finally...
A refinement operator based learning algorithm for the ALC description logic
, 2007
Abstract

Cited by 33 (15 self)
With the advent of the Semantic Web, description logics have become one of the most prominent paradigms for knowledge representation and reasoning. Progress in research and applications, however, faces a bottleneck due to the lack of available knowledge bases, and it is paramount that suitable automated methods for their acquisition be developed. In this paper, we provide the first learning algorithm based on refinement operators for the most fundamental description logic ALC. We develop the algorithm from thorough theoretical foundations and report on a prototype implementation.
CLASSIC Learning
 In Proceedings of the Seventh Annual ACM Conference on Computational Learning Theory
, 1991
Abstract

Cited by 32 (1 self)
Description logics, also called terminological logics, are commonly used in knowledge-based systems to describe objects and their relationships. We investigate the learnability of a typical description logic, Classic, and show that Classic sentences are learnable in polynomial time in the exact learning model using equivalence queries and membership queries (which are, in essence, "subsumption queries"; we show a prediction hardness result for the more traditional membership queries that convey information about specific individuals). We show that membership queries alone are insufficient for polynomial time learning of Classic sentences. Combined with earlier negative results (Cohen & Hirsh, 1994a) showing that, given standard complexity theoretic assumptions, equivalence queries alone are insufficient (or random examples alone in the PAC setting are insufficient), this shows that both sources of information are necessary for efficient learning in that neither type alone is sufficie...
Statistical Schema Induction
 In Proceedings of the 8th Extended Semantic Web Conference: Linked Open Data Track
, 2011
Abstract

Cited by 32 (0 self)
While the realization of the Semantic Web as once envisioned by Tim Berners-Lee remains in a distant future, the Web of Data has already become a reality. Billions of RDF statements on the Internet, facts about a variety of different domains, are ready to be used by semantic applications. Some of these applications, however, crucially hinge on the availability of expressive schemas suitable for logical inference that yields nontrivial conclusions. In this paper, we present a statistical approach to the induction of expressive schemas from large RDF repositories. We describe in detail the implementation of this approach and report on an evaluation that we conducted using several data sets including DBpedia.