Results 1–10 of 1,024
Measuring Expectations
, 2004
Abstract

Cited by 249 (14 self)
This article discusses the history underlying the new literature, describes some of what has been learned thus far, and looks ahead towards making further progress.
Game Theory, Maximum Entropy, Minimum Discrepancy And Robust Bayesian Decision Theory
 Annals of Statistics
, 2004
Learning under Ambiguity
 Review of Economic Studies
, 2002
Abstract

Cited by 62 (7 self)
This paper considers learning when the distinction between risk and ambiguity matters. It first describes thought experiments, dynamic variants of those provided by Ellsberg, that highlight a sense in which the Bayesian learning model is extreme: it models agents who are implausibly ambitious about what they can learn in complicated environments. The paper then provides a generalization of the Bayesian model that accommodates the intuitive choices in the thought experiments. In particular, the model allows decision-makers' confidence about the environment to change, along with beliefs, as they learn. A portfolio choice application compares the effect of changes in confidence under ambiguity versus changes in estimation risk under Bayesian learning. The former is shown to induce a trend towards more stock market participation and investment even when the latter does not.
Possibility theory and statistical reasoning
 Computational Statistics & Data Analysis
, 2006
Abstract

Cited by 59 (4 self)
Numerical possibility distributions can encode special convex families of probability measures. The connection between possibility theory and probability theory is potentially fruitful in the scope of statistical reasoning when uncertainty due to variability of observations should be distinguished from uncertainty due to incomplete information. This paper proposes an overview of numerical possibility theory. Its aim is to show that some notions in statistics are naturally interpreted in the language of this theory. First, probabilistic inequalities (like Chebyshev's) offer a natural setting for devising possibility distributions from poor probabilistic information. Moreover, likelihood functions obey the laws of possibility theory when no prior probability is available. Possibility distributions also generalize the notion of confidence or prediction intervals, shedding some light on the role of the mode of asymmetric probability densities in the derivation of maximally informative interval substitutes of probabilistic information. Finally, the simulation of fuzzy sets comes down to selecting a probabilistic representation of a possibility distribution, which coincides with the Shapley value of the corresponding consonant capacity. This selection process is in agreement with Laplace's indifference principle and is closely connected with the mean interval of a fuzzy interval. It sheds light on the "defuzzification" process in fuzzy set theory and provides a natural definition of a subjective possibility distribution that sticks to the Bayesian framework of exchangeable bets. Potential applications to risk assessment are pointed out.
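The Chebyshev-based construction this abstract alludes to can be sketched concretely. Chebyshev's inequality, P(|X − μ| ≥ kσ) ≤ 1/k², bounds the probability mass outside each interval [μ − kσ, μ + kσ]; reading that bound as a degree of possibility for values at distance kσ from the mean yields a possibility distribution valid for any distribution with the given mean and standard deviation. This is a minimal sketch of that reading, not code from the paper; the function name is illustrative.

```python
def chebyshev_possibility(x, mu, sigma):
    """Possibility degree of value x, valid for ANY distribution with
    mean mu and standard deviation sigma, via Chebyshev's inequality:
    P(|X - mu| >= k * sigma) <= 1 / k**2, read as a possibility bound."""
    if sigma <= 0:
        raise ValueError("sigma must be positive")
    k = abs(x - mu) / sigma
    if k <= 1.0:
        return 1.0  # the inequality is vacuous within one standard deviation
    return 1.0 / k**2

# Values within one standard deviation are fully possible:
print(chebyshev_possibility(0.5, 0.0, 1.0))  # 1.0
# Two standard deviations out, possibility drops to 1/4:
print(chebyshev_possibility(2.0, 0.0, 1.0))  # 0.25
```

Because the bound holds for every distribution with those two moments, the resulting π dominates the cumulative tails of all of them, which is exactly the "poor probabilistic information" setting the abstract describes.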
Artificial Reasoning with Subjective Logic
, 1997
Abstract

Cited by 59 (14 self)
This paper defines a framework for artificial reasoning called Subjective Logic, which consists of a belief model called opinion and a set of operations for combining opinions. Subjective Logic is an extension of standard logic that uses continuous uncertainty and belief parameters instead of only discrete truth values. It can also be seen as an extension of classical probability calculus, using a second-order probability representation instead of the standard first-order representation. In addition to the standard logical operations, Subjective Logic contains some operations specific to belief theory, such as consensus and recommendation. In particular, we show that Dempster's consensus rule is inconsistent with Bayes' rule and therefore is wrong, and provide an alternative rule with a solid mathematical basis. Subjective Logic is directly compatible with traditional mathematical frameworks, but is also suitable for handling ignorance and uncertainty which is required in artificial...
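To make the opinion model concrete, here is a minimal sketch of a consensus operator in the (belief, disbelief, uncertainty) parameterization commonly associated with Subjective Logic, where the three components sum to 1. The exact operator and notation in the paper may differ; the function name and numbers are illustrative.

```python
def consensus(op1, op2):
    """Consensus of two opinions, each a (belief, disbelief, uncertainty)
    triple with b + d + u = 1. The less uncertain agent weighs more, and
    combining two opinions always reduces uncertainty (unless one u is 1)."""
    b1, d1, u1 = op1
    b2, d2, u2 = op2
    kappa = u1 + u2 - u1 * u2
    if kappa == 0:
        raise ValueError("both opinions are dogmatic (u = 0); consensus undefined")
    return ((b1 * u2 + b2 * u1) / kappa,
            (d1 * u2 + d2 * u1) / kappa,
            (u1 * u2) / kappa)

# Two moderately uncertain opinions about the same proposition:
b, d, u = consensus((0.6, 0.2, 0.2), (0.4, 0.3, 0.3))
# The result is still a valid opinion (components sum to 1),
# and its uncertainty is below both inputs' uncertainties.
```

Note how this differs from Dempster's rule: there is no renormalization by conflict, which is the point of contention the abstract raises.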
Towards a unified theory of imprecise probability
 Int. J. Approx. Reasoning
, 2000
Abstract

Cited by 56 (0 self)
Belief functions, possibility measures and Choquet capacities of order 2, which are special kinds of coherent upper or lower probability, are amongst the most popular mathematical models for uncertainty and partial ignorance. I give examples to show that these models are not sufficiently general to represent some common types of uncertainty. Coherent lower previsions and sets of probability measures are considerably more general but they may not be sufficiently informative for some purposes. I discuss two other models for uncertainty, involving sets of desirable gambles and partial preference orderings. These are more informative and more general than the previous models, and they may provide a suitable mathematical setting for a unified theory of imprecise probability.
Data Fusion in the Transferable Belief Model.
, 2000
Abstract

Cited by 52 (0 self)
When Shafer introduced his theory of evidence based on the use of belief functions, he proposed a rule to combine belief functions induced by distinct pieces of evidence. Since then, theoretical justifications of this so-called Dempster's rule of combination have been produced and the meaning of distinctness has been assessed. We will present practical applications where the fusion of uncertain data is well achieved by Dempster's rule of combination. It is essential that the meaning of the belief functions used to represent uncertainty be well fixed, as the adequacy of the rule depends strongly on a correct understanding of the context in which they are applied. Failing to distinguish between the upper and lower probabilities theory and the transferable belief model can lead to serious confusion, as Dempster's rule of combination is central in the transferable belief model whereas it hardly fits with the upper and lower probabilities theory. Keywords: belief function, transferable beli...
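Dempster's rule of combination, which this abstract centers on, can be sketched directly: masses from two independent sources are multiplied over intersecting focal elements, mass landing on the empty set is treated as conflict, and the rest is renormalized. This is a generic sketch; the frame of discernment and mass values below are illustrative, not taken from the paper.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions.
    Each argument maps frozenset focal elements to masses summing to 1."""
    combined = {}
    conflict = 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb  # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: the sources fully contradict each other")
    # Renormalize by the non-conflicting mass 1 - K
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

# Illustrative frame {'rain', 'sun'} with two partially conflicting sources:
m1 = {frozenset({'rain'}): 0.7, frozenset({'rain', 'sun'}): 0.3}
m2 = {frozenset({'sun'}): 0.6, frozenset({'rain', 'sun'}): 0.4}
result = dempster_combine(m1, m2)
# Conflict K = 0.7 * 0.6 = 0.42; remaining masses are scaled by 1 / 0.58.
```

The renormalization step is precisely what the abstract flags: it is natural in the transferable belief model but sits poorly with an upper/lower probabilities reading, where discarded conflict has a different meaning.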