Results 1–10 of 42
Nonmonotonic Reasoning, Conditional Objects and Possibility Theory
Artificial Intelligence, 1997
Abstract

Cited by 78 (22 self)
This short paper relates the conditional object-based and possibility theory-based approaches for reasoning with conditional statements pervaded with exceptions to other methods in nonmonotonic reasoning which have been independently proposed: namely, Lehmann's preferential and rational closure entailments, which obey normative postulates, the infinitesimal probability approach, and the conditional (modal) logics-based approach. All these methods are shown to be equivalent with respect to their capabilities for reasoning with conditional knowledge, although they are based on different modeling frameworks. It thus provides a unified understanding of nonmonotonic consequence relations. More particularly, conditional objects, a purely qualitative counterpart to conditional probabilities, offer a very simple semantics, based on a 3-valued calculus, for the preferential entailment, while in the purely ordinal setting of possibility theory both the preferential and the rational closure entai...
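The 3-valued calculus of conditional objects mentioned in this abstract can be illustrated concretely: a conditional object "b given a" is true in a world where a and b both hold, false where a holds but b fails, and inapplicable where a fails. A minimal sketch (the names `TruthValue` and `conditional` are illustrative, not the paper's notation):

```python
from enum import Enum

class TruthValue(Enum):
    TRUE = 1
    FALSE = 0
    INAPPLICABLE = 2  # the third truth value: the rule does not apply

def conditional(antecedent: bool, consequent: bool) -> TruthValue:
    """Evaluate the conditional object 'consequent | antecedent' in one world."""
    if not antecedent:
        return TruthValue.INAPPLICABLE
    return TruthValue.TRUE if consequent else TruthValue.FALSE

# "birds fly", i.e. fly | bird, evaluated in three worlds:
print(conditional(True, True))    # a flying bird     -> TruthValue.TRUE
print(conditional(True, False))   # a penguin         -> TruthValue.FALSE
print(conditional(False, False))  # not a bird at all -> TruthValue.INAPPLICABLE
```

The third value is what distinguishes conditional objects from material implication, which would count every non-bird as confirming "birds fly".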
What Are Fuzzy Rules and How to Use Them
Fuzzy Sets and Systems, 1996
Abstract

Cited by 58 (14 self)
Fuzzy rules have been advocated as a key tool for expressing pieces of knowledge in "fuzzy logic". However, there does not exist a unique kind of fuzzy rules, nor is there only one type of "fuzzy logic". This diversity has caused many a misunderstanding in the literature of fuzzy control. The paper is a survey of different possible semantics for a fuzzy rule and shows how they can be captured in the framework of fuzzy set and possibility theory. It is pointed out that the interpretation of fuzzy rules dictates the way the fuzzy rules should be combined. The various kinds of fuzzy rules considered in the paper (gradual rules, certainty rules, possibility rules, and others) have different inference behaviors and correspond to various intended uses and applications. The representation of fuzzy unless-rules is briefly investigated on the basis of their intended meaning. The problem of defining and checking the coherence of a block of parallel fuzzy rules is also briefly addressed. This iss...
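Two of the rule types named in this abstract can be contrasted with a small numeric sketch: certainty rules are commonly modeled by the Kleene-Dienes implication and gradual rules by the Gödel implication (the function names are mine; the survey discusses several further semantics):

```python
def kleene_dienes(a: float, b: float) -> float:
    """Certainty-rule reading: 'the more x is A, the more certain y is B'."""
    return max(1.0 - a, b)

def goedel(a: float, b: float) -> float:
    """Gradual-rule reading: 'the more x is A, the more y is B'."""
    return 1.0 if a <= b else b

# The two semantics diverge on the same membership degrees:
a, b = 0.3, 0.6
print(kleene_dienes(a, b))  # 0.7
print(goedel(a, b))         # 1.0
```

The divergence illustrates the survey's point: which implication is appropriate depends on the intended meaning of the rule, and that choice in turn dictates how parallel rules should be combined.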
A systematic approach to the assessment of fuzzy association rules
Data Mining and Knowledge Discovery, 2006
Abstract

Cited by 43 (6 self)
In order to allow for the analysis of data sets including numerical attributes, several generalizations of association rule mining based on fuzzy sets have been proposed in the literature. While the formal specification of fuzzy associations is more or less straightforward, the assessment of such rules by means of appropriate quality measures is less obvious. Particularly, it assumes an understanding of the semantic meaning of a fuzzy rule. This aspect has been ignored by most existing proposals, which must therefore be considered as ad hoc to some extent. In this paper, we develop a systematic approach to the assessment of fuzzy association rules. To this end, we proceed from the idea of partitioning the data stored in a database into examples of a given rule, counterexamples, and irrelevant data. Evaluation measures are then derived from the cardinalities of the corresponding subsets. The problem of finding a proper partition has a rather obvious solution for standard association rules but becomes less trivial in the fuzzy case. Our results not only provide a sound justification for commonly used measures but also suggest a means for constructing meaningful alternatives.
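The partition idea in this abstract can be made concrete for a fuzzy rule A => B: each record contributes a degree of being an example (A and B), a counterexample (A and not B), or irrelevant (not A). The min/1-x choices for conjunction and negation and the measure definitions below are one common instantiation for illustration, not the paper's exact proposal:

```python
def partition(mu_a: float, mu_b: float):
    """Degrees to which one record is an example, a counterexample,
    or irrelevant for the fuzzy rule A => B."""
    example = min(mu_a, mu_b)               # A and B
    counterexample = min(mu_a, 1.0 - mu_b)  # A and not B
    irrelevant = 1.0 - mu_a                 # not A
    return example, counterexample, irrelevant

def support_confidence(records):
    """Evaluation measures derived from the cardinalities of the parts."""
    ex = sum(partition(a, b)[0] for a, b in records)
    cx = sum(partition(a, b)[1] for a, b in records)
    return ex / len(records), ex / (ex + cx)

data = [(0.9, 0.8), (0.7, 0.2), (0.1, 0.9)]  # (mu_A, mu_B) per record
sup, conf = support_confidence(data)
print(round(sup, 3), round(conf, 3))  # 0.367 0.524
```

With crisp memberships (0 or 1), this reduces to the classical support and confidence counts, which is the kind of justification the paper aims for.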
Probabilistic Default Reasoning with Conditional Constraints
Ann. Math. Artif. Intell., 2000
Abstract

Cited by 39 (18 self)
We present an approach to reasoning from statistical and subjective knowledge, which is based on a combination of probabilistic reasoning from conditional constraints with approaches to default reasoning from conditional knowledge bases. More precisely, we introduce the notions of z-, lexicographic, and conditional entailment for conditional constraints, which are probabilistic generalizations of Pearl's entailment in system Z, Lehmann's lexicographic entailment, and Geffner's conditional entailment, respectively. We show that the new formalisms have nice properties. In particular, they show a behavior similar to reference-class reasoning in a number of uncontroversial examples. The new formalisms, however, also avoid many drawbacks of reference-class reasoning. More precisely, they can handle complex scenarios and even purely probabilistic subjective knowledge as input. Moreover, conclusions are drawn in a global way from all the available knowledge as a whole. We then show that the new formalisms also have nice general nonmonotonic properties. In detail, the new notions of z-, lexicographic, and conditional entailment have similar properties as their classical counterparts. In particular, they all satisfy the rationality postulates proposed by Kraus, Lehmann, and Magidor, and they have some general irrelevance and direct inference properties. Moreover, the new notions of z- and lexicographic entailment satisfy the property of rational monotonicity. Furthermore, the new notions of z-, lexicographic, and conditional entailment are proper generalizations of both their classical counterparts and the classical notion of logical entailment for conditional constraints. Finally, we provide algorithms for reasoning under the new formalisms, and we analyze their computational com...
Representing partial ignorance
IEEE Trans. on Systems, Man and Cybernetics, 1996
Abstract

Cited by 36 (11 self)
Ignorance is precious, for once lost it can never be regained. This paper advocates the use of non-purely probabilistic approaches to higher-order uncertainty. One of the major arguments of Bayesian probability proponents is that representing uncertainty is always decision-driven and, as a consequence, uncertainty should be represented by probability. Here we argue that representing partial ignorance is not always decision-driven. Other reasoning tasks, such as belief revision for instance, are more naturally carried out at the purely cognitive level. Conceiving knowledge representation and decision-making as separate concerns opens the way to non-purely probabilistic representations of incomplete knowledge. It is pointed out that within a numerical framework, two numbers are needed to account for partial ignorance about events, because on top of truth and falsity, the state of total ignorance must be encoded independently of the number of underlying alternatives. The paper also points out that it is consistent to accept a Bayesian view of decision-making and a non-Bayesian view of knowledge representation because it is possible to map non-probabilistic degrees of belief to betting probabilities when needed. Conditioning rules in non-Bayesian settings are reviewed, ...
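The "two numbers" point in this abstract is exactly what possibility theory provides: belief about an event A is a pair (N(A), Pi(A)) with N(A) = 1 - Pi(not A), and total ignorance is (0, 1) for every nontrivial event, regardless of how many alternatives underlie it. A minimal sketch over a hypothetical three-element universe:

```python
def possibility(pi, event):
    """Pi(A): maximum elementary possibility over A (0 for the empty event)."""
    return max((pi[x] for x in event), default=0.0)

def necessity(pi, event, universe):
    """N(A) = 1 - Pi(complement of A): how certain A is."""
    return 1.0 - possibility(pi, [x for x in universe if x not in event])

universe = ["a", "b", "c"]

ignorance = {x: 1.0 for x in universe}   # total ignorance: everything fully possible
print(necessity(ignorance, ["a"], universe), possibility(ignorance, ["a"]))  # 0.0 1.0

informed = {"a": 1.0, "b": 0.25, "c": 0.0}  # partial knowledge favoring a
print(necessity(informed, ["a"], universe), possibility(informed, ["a"]))    # 0.75 1.0
```

A single probability cannot express the first case: a uniform distribution would assign P(a) = 1/3, a value that changes with the number of alternatives even though the agent's ignorance does not.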
Qualitative decision theory: from Savage’s axioms to nonmonotonic reasoning
Journal of the ACM, 2002
Abstract

Cited by 32 (0 self)
This paper investigates to what extent a purely symbolic approach to decision making under uncertainty is possible, in the scope of Artificial Intelligence. Contrary to classical approaches to decision theory, we try to rank acts without resorting to any numerical representation of utility or uncertainty, and without using any scale on which both uncertainty and preference could be mapped. Our approach is a variant of Savage's where the setting is finite and the strict preference on acts is a partial order. It is shown that although many axioms of Savage's theory are preserved, and despite the intuitive appeal of the ordinal method for constructing a preference over acts, the approach is inconsistent with a probabilistic representation of uncertainty. The latter leads to the kind of paradoxes encountered in the theory of voting. It is shown that the assumption of ordinal invariance enforces a qualitative decision procedure that presupposes a comparative possibility representation of uncertainty, originally due to Lewis and usual in nonmonotonic reasoning. Our axiomatic investigation thus provides decision-theoretic foundations to the preferential inference of Lehmann and colleagues. However, the obtained decision rules are sometimes either not very decisive or may lead to overconfident decisions, although their basic principles look sound. This paper points out some limitations of purely ordinal approaches to Savage-like decision making under uncertainty, in perfect analogy with similar difficulties in voting theory.
Probabilistic Logic under Coherence, ModelTheoretic Probabilistic Logic, and Default Reasoning
Journal of Applied Non-Classical Logics
Abstract

Cited by 24 (8 self)
We study probabilistic logic from the viewpoint of the coherence principle of de Finetti. In detail, we explore the relationship between coherence-based and model-theoretic probabilistic logic. Interestingly, we show that the notions of g-coherence and of g-coherent entailment can be expressed by combining notions in model-theoretic probabilistic logic with concepts from default reasoning. Crucially, we even show that probabilistic reasoning under coherence is a probabilistic generalization of default reasoning in system P. That is, we provide a new probabilistic semantics for system P, which is neither based on infinitesimal probabilities nor on atomic-bound (or also big-stepped) probabilities. These results also give new insight into default reasoning with conditional objects.
Weak nonmonotonic probabilistic logics
2004
Abstract

Cited by 24 (6 self)
Towards probabilistic formalisms for resolving local inconsistencies under model-theoretic probabilistic entailment, we present probabilistic generalizations of Pearl's entailment in System Z and Lehmann's lexicographic entailment. We then analyze the nonmonotonic and semantic properties of the new notions of entailment. In particular, we show that they satisfy the rationality postulates of System P and the property of Rational Monotonicity. Moreover, we show that model-theoretic probabilistic entailment is stronger than the new notion of lexicographic entailment, which in turn is stronger than the new notion of entailment in System Z. As an important feature of the new notions of entailment in System Z and lexicographic entailment, we show that they coincide with model-theoretic probabilistic entailment whenever there are no local inconsistencies. We also show that the new notions of entailment in System Z and lexicographic entailment are proper generalizations of their classical counterparts. Finally, we present algorithms for reasoning under the new formalisms, and we give a precise picture of their computational complexity.
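The classical System Z that this abstract generalizes is computed by a tolerance-based partition of the rule base; the brute-force sketch below (the rule encoding and the toy "penguin" base are mine, for illustration) enumerates worlds to find, at each step, the rules tolerated by the remaining ones:

```python
from itertools import product

def z_partition(rules, atoms):
    """Pearl's System Z: partition default rules (antecedent, consequent)
    into layers by tolerance; layer 0 holds the most normal rules.
    Antecedents/consequents are predicates over worlds (dicts atom -> bool)."""
    worlds = [dict(zip(atoms, bits))
              for bits in product([False, True], repeat=len(atoms))]

    def tolerated(rule, active):
        a, b = rule
        # Tolerated if some world verifies the rule (a and b both true) while
        # satisfying every active rule as a material implication.
        return any(a(w) and b(w)
                   and all((not a2(w)) or b2(w) for a2, b2 in active)
                   for w in worlds)

    remaining, layers = list(rules), []
    while remaining:
        layer = [r for r in remaining if tolerated(r, remaining)]
        if not layer:
            raise ValueError("rule set has no consistent Z-ordering")
        layers.append(layer)
        remaining = [r for r in remaining if r not in layer]
    return layers

# "birds fly", "penguins are birds", "penguins don't fly":
bird_fly = (lambda w: w["bird"], lambda w: w["fly"])
penguin_bird = (lambda w: w["penguin"], lambda w: w["bird"])
penguin_not_fly = (lambda w: w["penguin"], lambda w: not w["fly"])

layers = z_partition([bird_fly, penguin_bird, penguin_not_fly],
                     ["bird", "fly", "penguin"])
print([len(layer) for layer in layers])  # [1, 2]: penguin rules are more specific
```

The rank of a rule is the index of its layer; more specific (more exceptional) rules land in higher layers, which is what lets the penguin defaults override "birds fly".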
Focusing vs. belief revision: A fundamental distinction when dealing with generic knowledge
Proceedings 5th International Joint Conference on Qualitative and Quantitative Practical Reasoning, number 1244 in Lecture Notes in Artificial Intelligence, 1997
Abstract

Cited by 21 (5 self)
This paper advocates a basic distinction between two epistemic operations called focusing and revision, which can be defined in any representation framework, symbolic or numerical, that is rich enough to acknowledge the difference between factual evidence and generic knowledge. Revision amounts to modifying the generic knowledge when receiving new pieces of generic knowledge (or the factual evidence when obtaining more factual information), while focusing is just applying the generic knowledge to the reference class of situations which exactly corresponds to all the available evidence gathered on the case under consideration. Various settings are considered: upper and lower probabilities, belief functions, numerical possibility measures, ordinal possibility measures, conditional objects, nonmonotonic consequence relations.

1 Introduction

Some basic modes of belief change have been laid bare by Levi (1980): an expansion corresponds to adding the new piece of information withou...
Probabilistic Description Logics for the Semantic Web
2007
Abstract

Cited by 20 (2 self)
The work in this paper is directed towards sophisticated formalisms for reasoning under probabilistic uncertainty in ontologies in the Semantic Web. Ontologies play a central role in the development of the Semantic Web, since they provide a precise definition of shared terms in web resources. They are expressed in the standardized web ontology language OWL, which consists of the three increasingly expressive sublanguages OWL Lite, OWL DL, and OWL Full. The sublanguages OWL Lite and OWL DL have a formal semantics and a reasoning support through a mapping to the expressive description logics SHIF(D) and SHOIN(D), respectively. In this paper, we present the expressive probabilistic description logics P-SHIF(D) and P-SHOIN(D), which are probabilistic extensions of these description logics. They allow for expressing rich terminological probabilistic knowledge about concepts and roles as well as assertional probabilistic knowledge about instances of concepts and roles. They are semantically based on the notion of probabilistic lexicographic entailment from probabilistic default reasoning, which naturally interprets this terminological and assertional probabilistic knowledge as knowledge about random and concrete instances, respectively. As an important additional feature, they also allow for expressing terminological default knowledge, which is semantically interpreted as in Lehmann's lexicographic...