Results 1–10 of 39
In Defense of Probability
 In Proceedings of the Ninth International Joint Conference on Artificial Intelligence
, 1985
Abstract

Cited by 88 (0 self)
In this paper, it is argued that probability theory, when used correctly, is sufficient for the task of reasoning under uncertainty. Since numerous authors have rejected probability as inadequate for various reasons, the bulk of the paper is aimed at refuting these claims and indicating the sources of error. In particular, the definition of probability as a measure of belief rather than as a frequency ratio is advocated, since a frequency interpretation of probability drastically restricts the domain of applicability. Other sources of error include the confusion between relative and absolute probability, and the failure to distinguish between a probability and the uncertainty of that probability. The interaction of logic and probability is also discussed, and it is argued that many extensions of logic, such as "default logic", are better understood in a probabilistic framework. The main claim of this paper is that the numerous schemes for representing and reasoning about uncertainty that have appeared in the AI literature are unnecessary: probability is all that is needed.
Can the Maximum Entropy Principle Be Explained as a Consistency Requirement?
, 1997
Abstract

Cited by 33 (1 self)
The principle of maximum entropy is a general method to assign values to probability distributions on the basis of partial information. This principle, introduced by Jaynes in 1957, forms an extension of the classical principle of insufficient reason. It has been further generalized, both in mathematical formulation and in intended scope, into the principle of maximum relative entropy or of minimum information. It has been claimed that these principles are singled out as unique methods of statistical inference that agree with certain compelling consistency requirements. This paper reviews these consistency arguments and the surrounding controversy. It is shown that the uniqueness proofs are flawed or rest on unreasonably strong assumptions. A more general class of inference rules, maximizing the so-called Rényi entropies, is exhibited which also fulfills the reasonable part of the consistency assumptions.
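As a concrete illustration of the principle this abstract describes (a sketch using Jaynes's well-known "Brandeis dice" example, not code from the paper): given only that a die's mean roll is 4.5, maximum entropy assigns the exponential-family distribution p_i ∝ exp(λ·i), with λ fixed by the mean constraint. The helper name `maxent_die` is ours.

```python
import math

def maxent_die(target_mean, tol=1e-12):
    """Maximum-entropy distribution on faces 1..6 with a given mean.

    The maxent solution has the exponential form p_i ∝ exp(lam * i);
    we find lam by bisection on the mean it implies.
    """
    faces = range(1, 7)

    def mean_for(lam):
        w = [math.exp(lam * i) for i in faces]
        z = sum(w)
        return sum(i * wi for i, wi in zip(faces, w)) / z

    lo, hi = -50.0, 50.0          # mean_for is increasing in lam
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_for(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(lam * i) for i in faces]
    z = sum(w)
    return [wi / z for wi in w]

p = maxent_die(4.5)  # Jaynes's Brandeis dice constraint: mean roll 4.5
```

With the mean pushed above the uniform value 3.5, the resulting probabilities increase monotonically from face 1 to face 6, recovering Jaynes's published values (p₆ ≈ 0.3475).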
The Promise of Bayesian Inference for Astrophysics
, 1992
Abstract

Cited by 21 (0 self)
The 'frequentist' approach to statistics, currently dominating statistical practice in astrophysics, is compared to the historically older Bayesian approach, which is now growing in popularity in other scientific disciplines, and which provides unique, optimal solutions to well-posed problems. The two approaches address the same questions with very different calculations, but in simple cases often give the same final results, confusing the issue of whether one is superior to the other. Here frequentist and Bayesian methods are applied to problems where such a mathematical coincidence does not occur, allowing assessment of their relative merits based on their performance rather than on philosophical argument. Emphasis is placed on a key distinction between the two approaches: Bayesian methods, based on comparisons among alternative hypotheses using the single observed data set, consider averages over hypotheses; frequentist methods, in contrast, average over hypothetical alternative…
Erratum: Billiards in a general domain with random reflections
, 2009
Abstract

Cited by 19 (5 self)
We study stochastic billiards on general tables: a particle moves according to its constant velocity inside some domain D ⊂ R^d until it hits the boundary and bounces randomly inside according to some reflection law. We assume that the boundary of the domain is locally Lipschitz and almost everywhere continuously differentiable. The angle of the outgoing velocity with the inner normal vector has…
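A minimal sketch of the model the abstract describes, under one common choice of reflection law (the cosine, or "Knudsen", law; the paper itself treats more general laws): in the unit disk, the outgoing angle φ from the inner normal is drawn with density cos(φ)/2 on (−π/2, π/2), and the particle travels the resulting chord to the next boundary hit. The function name `cosine_billiard` is ours.

```python
import math
import random

def cosine_billiard(n_steps, theta0=0.0, seed=0):
    """Stochastic billiard in the unit disk with the cosine reflection law.

    Returns the sequence of boundary hit points.  At each hit the
    outgoing direction makes angle phi with the inner normal, where
    phi = arcsin(2U - 1) for U ~ Uniform(0,1), i.e. density cos(phi)/2.
    """
    rng = random.Random(seed)
    theta = theta0
    points = []
    for _ in range(n_steps):
        x, y = math.cos(theta), math.sin(theta)
        nx, ny = -x, -y                             # inner normal at (x, y)
        phi = math.asin(2.0 * rng.random() - 1.0)   # cosine-law angle
        c, s = math.cos(phi), math.sin(phi)
        dx, dy = c * nx - s * ny, s * nx + c * ny   # rotate normal by phi
        # chord length: |x + t d| = 1 with |x| = 1 gives t = -2 (x . d)
        t = -2.0 * (x * dx + y * dy)
        x, y = x + t * dx, y + t * dy
        theta = math.atan2(y, x)
        points.append((x, y))
    return points
```

Every iterate lands back on the unit circle by construction; for the cosine law the hitting points are known to equidistribute over the boundary in the long run.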
Bias toward regular form in mental shape spaces
 Journal of Experimental Psychology: Human Perception & Performance
, 2000
Abstract

Cited by 17 (2 self)
The distribution of figural "goodness" in 2 mental shape spaces, the space of triangles and the space of quadrilaterals, was examined. In Experiment 1, participants were asked to rate the typicality of visually presented triangles and quadrilaterals (perceptual task). In Experiment 2, participants were asked to draw triangles and quadrilaterals by hand (production task). The rated typicality of a particular shape and the probability that the shape was generated by participants were each plotted as a function of shape parameters, yielding estimates of the subjective distribution of shape goodness in shape space. Compared with neutral distributions of random shapes in the same shape spaces, these distributions showed a marked bias toward regular forms (equilateral triangles and squares). Such psychologically modal shapes apparently represent ideal forms that maximize the perceptual preference for regularity and symmetry. Shape classification, like many classification tasks, can be regarded as a decision among somewhat fuzzy categories…
Ignorance and Indifference
, 2006
Abstract

Cited by 13 (8 self)
The epistemic state of complete ignorance is not a probability distribution. In it, we assign the same, unique ignorance degree of belief to any contingent outcome and each of its contingent, disjunctive parts. That this is the appropriate way to represent complete ignorance is established by two instruments, each individually strong enough to identify this state. They are the principle of indifference (“PI”) and the notion that ignorance is invariant under certain redescriptions of the outcome space, here developed into the “principle of invariance of ignorance” (“PII”). Both instruments are so innocuous as almost to be platitudes. Yet the literature in probabilistic epistemology has misdiagnosed them as paradoxical or defective, since they generate inconsistencies when conjoined with the assumption that an epistemic state must be a probability distribution. To underscore the need to drop this assumption, I express PII in its most defensible form as relating symmetric descriptions and show that paradoxes still arise if we assume the ignorance state to be a probability distribution.
Countable Additivity and Subjective Probability
 British Journal for the Philosophy of Science
, 1999
Abstract

Cited by 10 (5 self)
While there are several arguments on either side, it is far from clear whether countable additivity is an acceptable axiom of subjective probability. I focus here on de Finetti's central argument against countable additivity and provide a new Dutch book proof of the principle, to argue that if we accept the Dutch book foundations of subjective probability, countable additivity is an unavoidable constraint.
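To illustrate the kind of Dutch book at issue (the standard textbook construction for de Finetti's infinite lottery, not necessarily the paper's new proof): suppose an agent's beliefs are merely finitely additive, with

```latex
P(\text{ticket } n \text{ wins}) = 0 \quad \text{for every } n \in \mathbb{N},
\qquad
P\Big(\bigcup_{n=1}^{\infty} \{\text{ticket } n \text{ wins}\}\Big) = 1.
% A bookie buys from the agent, for each n, a bet paying 1 if ticket n
% wins, at the agent's fair price of 0.  Whichever ticket wins, the agent
% pays out 1 but has collected 0 in total: a guaranteed loss.
```

Countable additivity rules out exactly this combination of prices, which is the sense in which the Dutch book foundations are said to force it.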
A Lewis Carroll pillow problem: probability of an obtuse triangle
 Statist. Sci., 279–284; MR1293297 (95h:60003)
, 1994
Philosophies of probability: objective Bayesianism and its challenges
 Handbook of the philosophy of mathematics. Elsevier, Amsterdam. Handbook of the Philosophy of Science
, 2004
Abstract

Cited by 10 (5 self)
This chapter presents an overview of the major interpretations of probability, followed by an outline of the objective Bayesian interpretation and a discussion of the key challenges it faces.