Results 1–10 of 592
A rational analysis of the selection task as optimal data selection
 67 – 215535 Deliverable 4.1
, 1994
Cited by 247 (16 self)

Abstract:
Human reasoning in hypothesis-testing tasks like Wason's (1966, 1968) selection task has been depicted as prone to systematic biases. However, performance on this task has been assessed against a now outmoded falsificationist philosophy of science. Therefore, the experimental data are reassessed in the light of a Bayesian model of optimal data selection in inductive hypothesis testing. The model provides a rational analysis (Anderson, 1990) of the selection task that fits well with people's performance on both abstract and thematic versions of the task. The model suggests that reasoning in these tasks may be rational rather than subject to systematic bias.

Over the past 30 years, results in the psychology of reasoning have raised doubts about human rationality. The assumption of human rationality has a long history. Aristotle took the capacity for rational thought to be the defining characteristic of human beings, the capacity that separated us from the animals. Descartes regarded the ability to use language and to reason as the hallmarks of the mental that separated it from the merely physical. Many contemporary philosophers of mind also appeal to a basic principle of rationality in accounting for everyday, folk-psychological explanation whereby we explain each other's behavior in terms of our beliefs and desires (Cherniak, 1986; Cohen, 1981; Davidson, 1984; Dennett, 1987; but see Stich, 1990). These philosophers, both ancient and modern, share a common view of rationality: To be rational is to reason according to rules (Brown, 1989). Logic and mathematics provide the normative rules that tell us how we should reason. Rationality therefore seems to demand that the human cognitive system embodies the rules of logic and mathematics. However, results in the psychology of reasoning appear to show that people do not reason according to these rules. In both deductive (Evans, 1982, 1989;
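The optimal-data-selection idea can be sketched numerically. The toy model below is not Oaksford and Chater's exact formulation; the two hypotheses, the priors, and the marginal probabilities are illustrative assumptions. It scores each card in the selection task by its expected information gain (expected KL divergence from prior to posterior over hypotheses) and, with rare antecedent and consequent, reproduces the commonly observed preference ordering p, q, not-q, not-p.

```python
import math

def kl(post, prior):
    """KL divergence in bits between two distributions over hypotheses."""
    return sum(q * math.log2(q / p) for q, p in zip(post, prior) if q > 0)

def selection_task_eig(P_p=0.1, P_q=0.2, prior_dep=0.5):
    """Expected information gain of turning each card, comparing two hypotheses:
    'dep' -- the rule 'if p then q' holds, idealized as P(q|p) = 1
    'ind' -- antecedent and consequent are independent.
    Both hypotheses share the marginals P(p), P(q) (rarity: both small)."""
    # Under 'dep', choose P(q|not-p) so that the marginal P(q) is preserved.
    q_given_notp_dep = (P_q - P_p) / (1 - P_p)
    # cond[card][hyp] = probability that the hidden face is q (top rows)
    # or p (bottom rows), given the visible face and the hypothesis.
    cond = {
        'p':     {'dep': 1.0,              'ind': P_q},
        'not_p': {'dep': q_given_notp_dep, 'ind': P_q},
        'q':     {'dep': P_p / P_q,        'ind': P_p},  # Bayes: P(p|q) under 'dep'
        'not_q': {'dep': 0.0,              'ind': P_p},  # p is impossible given not-q under 'dep'
    }
    prior = (prior_dep, 1 - prior_dep)
    eig = {}
    for card, probs in cond.items():
        gain = 0.0
        for outcome in (1, 0):  # hidden face matches (1) or does not (0)
            like = [probs['dep'] if outcome else 1 - probs['dep'],
                    probs['ind'] if outcome else 1 - probs['ind']]
            p_outcome = prior[0] * like[0] + prior[1] * like[1]
            if p_outcome == 0:
                continue
            post = (prior[0] * like[0] / p_outcome,
                    prior[1] * like[1] / p_outcome)
            gain += p_outcome * kl(post, prior)
        eig[card] = gain
    return eig

eig = selection_task_eig()
# With rare p and q, the expected gain orders the cards p > q > not-q > not-p,
# matching the typical selection frequencies reported for the task.
```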
Weighing Risk and Uncertainty
, 1995
Cited by 185 (10 self)

Abstract:
Decision theory distinguishes between risky prospects, where the probabilities associated with the possible outcomes are assumed to be known, and uncertain prospects, where these probabilities are not assumed to be known. Studies of choice between risky prospects have suggested a nonlinear transformation of the probability scale that overweights low probabilities and underweights moderate and high probabilities. The present article extends this notion from risk to uncertainty by invoking the principle of bounded subadditivity: An event has greater impact when it turns impossibility into possibility, or possibility into certainty, than when it merely makes a possibility more or less likely. A series of studies provides support for this principle in decision under both risk and uncertainty and shows that people are less sensitive to uncertainty than to risk. Finally, the article discusses the relationship between probability judgments and decision weights and distinguishes relative sensitivity from ambiguity aversion.
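The nonlinear transformation described above is often modeled with an inverse-S weighting function. The one-parameter form below is the one used in cumulative prospect theory; the parameter value 0.69 is illustrative, not a claim about this paper's estimates. It overweights low probabilities, underweights moderate and high ones, and exhibits bounded subadditivity: steps away from impossibility (0) or toward certainty (1) carry more weight than equal-sized steps in the middle.

```python
def tk_weight(p, gamma=0.69):
    """Inverse-S probability weighting function w(p) = p^g / (p^g + (1-p)^g)^(1/g).
    gamma = 0.69 is an illustrative curvature parameter."""
    if p in (0.0, 1.0):
        return p  # endpoints are fixed: w(0) = 0, w(1) = 1
    num = p ** gamma
    return num / (num + (1 - p) ** gamma) ** (1 / gamma)

# Overweights low probabilities, underweights high ones:
# tk_weight(0.01) ≈ 0.04, tk_weight(0.9) ≈ 0.77.
# Bounded subadditivity: the jumps 0 -> 0.1 and 0.9 -> 1 receive more weight
# than the equal-sized mid-range jump 0.4 -> 0.5.
```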
Rationality For Economists?
 Journal of Risk and Uncertainty
, 1998
Cited by 125 (6 self)

Abstract:
Rationality is a complex behavioral theory that can be parsed into statements about preferences, perceptions, and process. This paper looks at the evidence on rationality that is provided by behavioral experiments, and argues that most cognitive anomalies operate through errors in perception that arise from the way information is stored, retrieved, and processed, or through errors in process that lead to formulation of choice problems as cognitive tasks that are inconsistent at least with rationality narrowly defined. The paper discusses how these cognitive anomalies influence economic behavior and measurement, and their implications for economic analysis.
Betting on Theories
, 1993
Cited by 104 (4 self)

Abstract:
Predictions about the future and unrestricted universal generalizations are never logically implied by our observational evidence, which is limited to particular facts in the present and past. Nevertheless, propositions of these and other kinds are often said to be confirmed by observational evidence. A natural place to begin the study of confirmation theory is to consider what it means to say that some evidence E confirms a hypothesis H.

Incremental and absolute confirmation. Let us say that E raises the probability of H if the probability of H given E is higher than the probability of H not given E. According to many confirmation theorists, “E confirms H” means that E raises the probability of H. This conception of confirmation will be called incremental confirmation. Let us say that H is probable given E if the probability of H given E is above some threshold. (This threshold remains to be specified but is assumed to be at least one half.) According to some confirmation theorists, “E confirms H” means that H is probable given E. This conception of confirmation will be called absolute confirmation.

Confirmation theorists have sometimes failed to distinguish these two concepts. For example, Carl Hempel in his classic “Studies in the Logic of Confirmation” endorsed the following principles: (1) A generalization of the form “All F are G” is confirmed by the evidence that there is an individual that is both F and G. (2) A generalization of that form is also confirmed by the evidence that there is an individual that is neither F nor G. (3) The hypotheses confirmed by a piece of evidence are consistent with one another. (4) If E confirms H then E confirms every logical consequence of H. Principles (1) and (2) are not true of absolute confirmation. Observation of a single thing that is F and G cannot in general make it probable that all F are G; likewise for an individual that is neither
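The distinction between the two conceptions can be made concrete with Bayes' theorem. In the toy numbers below (chosen for illustration), evidence E raises the probability of a rare hypothesis H roughly ninefold, so E confirms H incrementally, yet the posterior stays far below one half, so E does not confirm H absolutely.

```python
def posterior(p_h, p_e_given_h, p_e_given_not_h):
    """P(H | E) via Bayes' theorem, with P(E) computed by total probability."""
    p_e = p_h * p_e_given_h + (1 - p_h) * p_e_given_not_h
    return p_h * p_e_given_h / p_e

p_h = 0.01                                  # H is rare a priori
p_h_given_e = posterior(p_h, 0.9, 0.09)     # E is 10x likelier under H
# p_h_given_e ≈ 0.092: incremental confirmation (probability raised ~9-fold)
# without absolute confirmation (still well below the 1/2 threshold).
```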
The estimation of probabilities
, 1965
Cited by 93 (2 self)

Abstract:
By way of introduction, a classification of kinds of probability is given in the form of a tree which also forms an approximate hierarchy: psychological, subjective, logical, physical, and tautological. Various relationships between these kinds of probability are mentioned. Methods, all more or less Bayesian, for the estimation of physical probabilities are then described. Binomial and multinomial probabilities are estimated by means of a three-tiered hierarchical Bayesian method. The method can also be regarded, in some of its aspects, as Bayesian in the ordinary sense, wherein the initial distribution for the physical probabilities is a weighted sum of symmetrical Dirichlet distributions. It can be proved that this is equivalent to the use of a single symmetrical Dirichlet distribution whose parameter is selected after sampling. Thus, in this problem, an ordinary Bayesian method implies an empirical Bayesian method. Next the species-sampling or vocabulary-sampling problem is considered, wherein there is a multinomial population having a very large number of categories. The theory derives from a suggestion of Turing's, which in part anticipates the empirical Bayesian method. Among other things, it leads to estimates of population coverage for an enlarged sample. Finally the estimation of probabilities in multidimensional contingency tables is considered. The main method here is that of maximum entropy, but it is shown that this can be subsumed under a more general method of minimum discriminability for the formulation of hypotheses. Entropy is best regarded as a special case of the older and more obviously Bayesian concept of "expected weight of evidence".
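The suggestion of Turing's mentioned above underlies what is now called the Good–Turing estimator. A minimal sketch of its best-known consequence, the singleton rule for the species-sampling problem, is below; the function name is illustrative.

```python
from collections import Counter

def unseen_species_mass(sample):
    """Turing's estimate of the total probability of all species not yet observed:
    the fraction of the sample made up of singletons, n1 / N."""
    counts = Counter(sample)
    n1 = sum(1 for c in counts.values() if c == 1)
    return n1 / len(sample)

# In a sample of 6 observations where exactly two species occur once each,
# the estimated probability that the next observation is a new species is 2/6.
sample = ["a", "a", "b", "b", "c", "d"]
```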
Risk attitudes and decision weights
 Econometrica
, 1995
Cited by 88 (7 self)
Reconciling simplicity and likelihood principles in perceptual organization
 Psychological Review
, 1996
Cited by 87 (17 self)

Abstract:
Two principles of perceptual organization have been proposed. The likelihood principle, following H. L. F. von Helmholtz (1910/1962), proposes that perceptual organization is chosen to correspond to the most likely distal layout. The simplicity principle, following Gestalt psychology, suggests that perceptual organization is chosen to be as simple as possible. The debate between these two views has been a central topic in the study of perceptual organization. Drawing on mathematical results in A. N. Kolmogorov's (1965) complexity theory, the author argues that simplicity and likelihood are not in competition, but are identical. Various implications for the theory of perceptual organization and psychology more generally are outlined.

How does the perceptual system derive a complex and structured description of the perceptual world from patterns of activity at the sensory receptors? Two apparently competing theories of perceptual organization have been influential. The first, initiated by Helmholtz (1910/1962), advocates the likelihood principle: Sensory input will be organized into the most probable distal object or event consistent with that input. The second, initiated by Wertheimer and developed by other Gestalt psychologists, advocates what Pomerantz and Kubovy (1986) called the simplicity principle: The perceptual system is viewed as finding the simplest, rather than the most likely, perceptual organization consistent with the sensory input. There has been considerable theoretical and empirical controversy concerning whether likelihood or simplicity is the governing principle of perceptual organization (e.g., Hatfield, &
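The claimed identity of simplicity and likelihood can be stated compactly. Under the universal prior (an idealized, uncomputable construction), a hypothesis H receives probability roughly proportional to two raised to minus its Kolmogorov complexity K(H); the display below is a sketch that holds only up to additive constants, not a derivation from this paper:

```latex
P(H) \approx 2^{-K(H)}
\quad\Longrightarrow\quad
\arg\max_{H} P(H \mid D)
  \;=\; \arg\min_{H} \bigl[\, K(H) + K(D \mid H) \,\bigr]
```

so choosing the most probable organization of the data D (likelihood) and choosing the shortest two-part description of D (simplicity) amount to the same decision.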
Choice under uncertainty with the best and worst in mind: NEO-additive capacities, mimeo, Universität
, 2005
Cited by 72 (15 self)

Abstract:
We develop the simplest generalization of subjective expected utility that can accommodate both optimistic and pessimistic attitudes towards uncertainty: Choquet expected utility with non-extreme-outcome-additive (neo-additive) capacities. A neo-additive capacity can be expressed as the convex combination of a probability and a special capacity, which we refer to as a Hurwicz capacity, that only distinguishes between whether an event is impossible, possible, or certain. We show that neo-additive capacities can be readily applied in economic problems, and we provide an axiomatization in a framework of purely subjective uncertainty.
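Under one common convention (the parameterization and function name below are illustrative), the Choquet expected utility of an act with respect to a neo-additive capacity collapses to a three-way mixture of ordinary expected utility, the best case, and the worst case:

```python
def neo_additive_ceu(utilities, probs, delta, alpha):
    """Choquet expected utility for a neo-additive capacity of the form
    nu(A) = (1 - delta) * pi(A) + delta * alpha  (A neither impossible nor certain),
    which reduces to (1-delta)*EU + delta*(alpha*best + (1-alpha)*worst)."""
    eu = sum(u * p for u, p in zip(utilities, probs))
    return (1 - delta) * eu + delta * (alpha * max(utilities)
                                       + (1 - alpha) * min(utilities))

# delta = 0 recovers subjective expected utility; delta = 1 with alpha = 1 is
# pure optimism (best case); delta = 1 with alpha = 0 is pure pessimism.
```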
Beware of samples! A cognitive-ecological sampling approach to judgment biases
 Psychological Review
, 2000
Cited by 62 (2 self)

Abstract:
A cognitive-ecological approach to judgment biases is presented and substantiated by recent empirical evidence. Latent properties of the environment are not amenable to direct assessment but have to be inferred from empirical samples that provide the interface between cognition and the environment. The sampling process may draw on the external world or on internal memories. For systematic reasons (proximity, salience, and focus of attention), the resulting samples tend to be biased (selective, skewed, or conditional on information search strategies). Because people lack the metacognitive ability to understand and control for sampling constraints (predictor sampling, criterion sampling, selective-outcome sampling, etc.), the sampling biases carry over to subsequent judgments. Within this framework, alternative accounts are offered for a number of judgment biases, such as base-rate neglect, confirmation bias, illusory correlation, pseudocontingency, Simpson's paradox, outgroup devaluation, and pragmatic-confusion effects.

It is not unusual for psychological theories to draw on common physical or statistical metaphors (Gigerenzer, 1991; Roediger, 1980), such as Lewin's (1951) field theory, Brunswik's (1956) lens model, Thurstone's (1927) law of comparative judgment and
Towards a unified theory of imprecise probability
 Int. J. Approx. Reasoning
, 2000
Cited by 62 (0 self)

Abstract:
Belief functions, possibility measures and Choquet capacities of order 2, which are special kinds of coherent upper or lower probability, are amongst the most popular mathematical models for uncertainty and partial ignorance. I give examples to show that these models are not sufficiently general to represent some common types of uncertainty. Coherent lower previsions and sets of probability measures are considerably more general but they may not be sufficiently informative for some purposes. I discuss two other models for uncertainty, involving sets of desirable gambles and partial preference orderings. These are more informative and more general than the previous models, and they may provide a suitable mathematical setting for a unified theory of imprecise probability.
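The "sets of probability measures" model mentioned above can be sketched directly. The helper below (an illustrative name, and a simplification to a finite set of candidate distributions rather than a general convex credal set) computes the lower and upper probability of an event as the envelope over the set:

```python
def lower_upper(event, credal_set):
    """Lower and upper probability of an event (a set of outcomes) induced by a
    finite credal set: a list of probability mass functions over the same outcomes."""
    values = [sum(p[w] for w in event) for p in credal_set]
    return min(values), max(values)

# Two candidate distributions over {'a', 'b'} that disagree about 'a':
credal = [{"a": 0.2, "b": 0.8}, {"a": 0.6, "b": 0.4}]
# lower_upper({"a"}, credal) -> (0.2, 0.6): the event's probability is only
# known to lie in an interval, the hallmark of imprecise probability.
```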