Results 1 - 10 of 769
A distributed, developmental model of word recognition and naming
- Psychological Review
, 1989
"... A parallel distributed processing model of visual word recognition and pronunciation is described. The model consists of sets of orthographic and phonological units and an interlevel of hidden units. Weights on connections between units were modified during a training phase using the back-propagatio ..."
Abstract
-
Cited by 706 (49 self)
A parallel distributed processing model of visual word recognition and pronunciation is described. The model consists of sets of orthographic and phonological units and an interlevel of hidden units. Weights on connections between units were modified during a training phase using the back-propagation learning algorithm. The model simulates many aspects of human performance, including (a) differences between words in terms of processing difficulty, (b) pronunciation of novel items, (c) differences between readers in terms of word recognition skill, (d) transitions from beginning to skilled reading, and (e) differences in performance on lexical decision and naming tasks. The model's behavior early in the learning phase corresponds to that of children acquiring word recognition skills. Training with a smaller number of hidden units produces output characteristic of many dyslexic readers. Naming is simulated without pronunciation rules, and lexical decisions are simulated without accessing word-level representations. The performance of the model is largely determined by three factors: the nature of the input, a significant fragment of written English; the learning rule, which encodes the implicit structure of the orthography in the weights on connections; and the architecture of the system, which influences the scope of what can be learned. The recognition and pronunciation of words is one of the cen…
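The architecture the abstract describes reduces to a small illustration. Below is a minimal sketch in Python, not the authors' implementation: a tiny feedforward network maps orthographic input units through a hidden layer to phonological output units, trained by plain back-propagation. The positional letter coding, the four-word corpus, and all unit counts and learning parameters are invented for brevity; the actual model uses a much richer featural coding and a large sample of written English.

import numpy as np

rng = np.random.default_rng(0)

letters = "aeinst"

def encode(word):
    # One-hot code for each letter by position: a crude stand-in for the
    # model's distributed orthographic/phonological features.
    v = np.zeros(len(word) * len(letters))
    for i, ch in enumerate(word):
        v[i * len(letters) + letters.index(ch)] = 1.0
    return v

# Hypothetical mini-corpus: spelling -> phonological code (here simply
# another letter string, for brevity).
corpus = {"sit": "sit", "sat": "sat", "tin": "tin", "net": "net"}
pairs = [(encode(o), encode(p)) for o, p in corpus.items()]

n_in = n_out = 3 * len(letters)
n_hidden = 10          # fewer hidden units degrades output, as in the paper
W1 = rng.normal(0.0, 0.5, (n_hidden, n_in))
W2 = rng.normal(0.0, 0.5, (n_out, n_hidden))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

lr = 0.5
for epoch in range(2000):                  # training phase: plain backprop
    for x, t in pairs:
        h = sigmoid(W1 @ x)
        y = sigmoid(W2 @ h)
        dy = (y - t) * y * (1.0 - y)       # output delta, squared-error loss
        dh = (W2.T @ dy) * h * (1.0 - h)
        W2 -= lr * np.outer(dy, h)
        W1 -= lr * np.outer(dh, x)

# Summed squared error of the phonological output as a graded difficulty score.
for o, p in corpus.items():
    y = sigmoid(W2 @ sigmoid(W1 @ encode(o)))
    print(o, round(float(np.sum((y - encode(p)) ** 2)), 3))

The summed squared error then plays the role the paper assigns it: a graded index of processing difficulty that falls with training and rises when hidden units are scarce.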
Half a century of research on the Stroop effect: An integrative review
- Psychological Bulletin
, 1991
"... The literature on interference in the Stroop Color-Word Task, covering over 50 years and some 400 studies, is organized and reviewed. In so doing, a set ofl 8 reliable empirical findings is isolated that must be captured by any successful theory of the Stroop effect. Existing theoretical positions a ..."
Abstract
-
Cited by 666 (14 self)
The literature on interference in the Stroop Color-Word Task, covering over 50 years and some 400 studies, is organized and reviewed. In so doing, a set of 18 reliable empirical findings is isolated that must be captured by any successful theory of the Stroop effect. Existing theoretical positions are summarized and evaluated in view of this critical evidence, and the 2 major candidate theories (relative speed of processing and automaticity of reading) are found to be wanting. It is concluded that recent theories placing the explanatory weight on parallel processing of the irrelevant and the relevant dimensions are likely to be more successful than are earlier theories attempting to locate a single bottleneck in attention. In 1935, J. R. Stroop published his landmark article on attention and interference, an article more influential now than it was then. Why has the Stroop task continued to fascinate us? Perhaps the task is seen as tapping into the primitive operations of cognition, offering clues to the fundamental process of attention. Perhaps the robustness of the phenomenon provides a special challenge to decipher. Together these are powerful attractions.
Toward an instance theory of automatization
- Psychological Review
, 1988
"... This article presents a theory in which automatization is construed as the acquisition of a domain-specific knowledge base, formed of separate representations, instances, of each exposure to the task. Processing is considered automatic if it relies on retrieval of stored instances, which will occur ..."
Abstract
-
Cited by 647 (38 self)
This article presents a theory in which automatization is construed as the acquisition of a domain-specific knowledge base, formed of separate representations, instances, of each exposure to the task. Processing is considered automatic if it relies on retrieval of stored instances, which will occur only after practice in a consistent environment. Practice is important because it increases the amount retrieved and the speed of retrieval; consistency is important because it ensures that the retrieved instances will be useful. The theory accounts quantitatively for the power-function speed-up and predicts a power-function reduction in the standard deviation that is constrained to have the same exponent as the power function for the speed-up. The theory accounts for qualitative properties as well, explaining how some may disappear and others appear with practice. More generally, it provides an alternative to the modal view of automaticity, arguing that novice performance is limited by a lack of knowledge rather than a scarcity of resources. The focus on learning avoids many problems with the modal view that stem from its focus on resource limitations. Automaticity is an important phenomenon in everyday mental life. Most of us recognize that we perform routine activities quickly and effortlessly, with little thought and conscious awareness--in short, automatically (James, 1890). As a result, we often perform those activities on "automatic pilot" and turn our minds to other things. For example, we can drive to dinner while conversing in depth with a visiting scholar, or we can make coffee while planning dessert. However, these benefits may be offset by costs. The automatic pilot can lead us astray, causing errors and sometimes catastrophes (Reason & Mycielska, 1982). If the conversation is deep enough, we may find ourselves and the scholar arriving at the office rather than the restaurant, or we may discover that we aren't sure whether we put two or three scoops of coffee into the pot. Automaticity is also an important phenomenon in skill acquisition (e.g., Bryan & Harter, 1899). Skills are thought to consist largely of collections of automatic processes and procedures…
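The shared-exponent prediction follows from the statistics of a race: if every stored instance independently finishes at a random time and the response goes to the fastest finisher, minima over growing samples shrink lawfully. A short simulation sketch; the Weibull retrieval-time assumption is mine, chosen because minima of Weibull variables stay Weibull, whereas the theory itself is stated more generally.

import numpy as np

rng = np.random.default_rng(1)

# Response time = minimum over n stored instances racing in parallel.
# With Weibull(c) retrieval times, the minimum of n draws is Weibull
# again, so mean and SD both fall as n**(-1/c), with the same exponent.
c = 2.0                                    # Weibull shape parameter (assumed)
for n in [1, 2, 4, 8, 16, 32]:
    t = rng.weibull(c, size=(100_000, n)).min(axis=1)
    print(n, round(float(t.mean()), 4), round(float(t.std()), 4))

Both columns fall roughly as n**(-1/2) here, and their ratio stays constant, which is exactly the constraint the theory places on the speed-up and its standard deviation.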
On the time course of perceptual choice: the leaky competing accumulator model
- Psychological Review
, 2001
"... The time course of perceptual choice is discussed in a model based on gradual and stochastic accumulation of information in non-linear decision units with leakage (or decay of activation) and competition through lateral inhibition. In special cases, the model becomes equivalent to a classical diffus ..."
Abstract
-
Cited by 480 (19 self)
The time course of perceptual choice is discussed in a model based on gradual and stochastic accumulation of information in non-linear decision units with leakage (or decay of activation) and competition through lateral inhibition. In special cases, the model becomes equivalent to a classical diffusion process, but leakage and mutual inhibition work together to address several challenges to existing diffusion, random-walk, and accumulator models. The model provides a good account of data from choice tasks using both time-controlled (e.g., deadline or response signal) and standard reaction time paradigms, and its overall adequacy compares favorably with that of other approaches. An experimental paradigm that explicitly controls the timing of information supporting different choice alternatives provides further support. The model captures flexible choice behavior regardless of the number of alternatives, accounting for the linear slowing of reaction time as a function of the log of the number of alternatives (Hick's law), and it explains a complex pattern of visual and contextual priming effects in visual word identification. When an experience presents itself to the senses, the need often arises to determine its identity or to make some other judgment about it. In experimental paradigms, the time course of this judgment process is…
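A discretized single-trial simulation of the two-alternative case conveys the core dynamics: each accumulator gains evidence, leaks, and inhibits its rival, and the first to reach threshold fixes both the choice and its time. A minimal sketch; every parameter value below is illustrative rather than a fitted value from the paper.

import numpy as np

rng = np.random.default_rng(2)

# One trial of a two-alternative leaky competing accumulator, Euler-
# discretized: dx_i = (rho_i - k*x_i - beta*x_j) dt + noise, with
# activations clipped at zero (the model's non-linearity).
def lca_trial(rho=(1.1, 0.9), k=0.2, beta=0.2, sigma=0.3,
              dt=0.01, threshold=1.0, max_steps=10_000):
    x = np.zeros(2)
    for step in range(max_steps):
        inhib = beta * x[::-1]                    # lateral inhibition from rival
        dx = (np.array(rho) - k * x - inhib) * dt \
             + sigma * np.sqrt(dt) * rng.normal(size=2)
        x = np.maximum(x + dx, 0.0)               # activations stay non-negative
        if x.max() >= threshold:
            return int(x.argmax()), (step + 1) * dt   # choice, decision time
    return int(x.argmax()), max_steps * dt

choices, rts = zip(*(lca_trial() for _ in range(2000)))
print("P(correct) =", float(np.mean(np.array(choices) == 0)))
print("mean RT    =", round(float(np.mean(rts)), 3))

Setting the leak k or the inhibition beta to zero recovers simpler accumulator behavior, which is one way to see why the two terms jointly carry the model's explanatory burden.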
"Schema abstraction" in a multiple-trace memory model
- Psychological Review
, 1986
"... A simulation model of episodic memory, MINERVA 2, is applied to the learning of concepts, as represented bythe schema-abstraction task. The model assumes that each experience produces a separate memory trace and that knowledge of abstract oncepts i derived from the pool of episodic traces at the tim ..."
Abstract
-
Cited by 359 (2 self)
A simulation model of episodic memory, MINERVA 2, is applied to the learning of concepts, as represented by the schema-abstraction task. The model assumes that each experience produces a separate memory trace and that knowledge of abstract concepts is derived from the pool of episodic traces at the time of retrieval. A retrieval cue contacts all traces simultaneously, activating each according to its similarity to the cue, and the information retrieved from memory reflects the summed content of all activated traces responding in parallel. The MINERVA 2 model is able to retrieve an abstracted prototype of the category when cued with the category name and to retrieve and disambiguate a category name when cued with a category exemplar. The model successfully predicts basic findings from the schema-abstraction literature (e.g., differential forgetting of prototypes and old instances, typicality, and category size effects), including some that have been cited as evidence against exemplar theories of concepts. The model is compared to other classification models, and its implications regarding the abstraction problem are discussed. How is abstract knowledge related to specific experience? In present-day terms, this question concerns the relationship between episodic and generic memories. This article explores the…
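The retrieval cycle is compact enough to sketch directly. In the code below, traces and probes are vectors of +1/-1/0 features, each trace is activated by the cube of its similarity to the probe, and the echo is the activation-weighted sum of all traces. The category structure (5 name features, 20 content features, 20% distortion, 20 stored exemplars) is an invented toy example, not one of the paper's simulations.

import numpy as np

rng = np.random.default_rng(3)

# MINERVA 2 retrieval: each trace is activated by its cubed similarity
# to the probe; the echo is the activation-weighted sum of all traces.
def echo(probe, traces):
    sims = traces @ probe / probe.size        # normalized dot-product similarity
    act = sims ** 3                           # cubing keeps sign, sharpens
    return act @ traces, float(act.sum())     # echo content, echo intensity

n_feat, n_name = 25, 5                        # name slots + content slots
proto = rng.choice([-1.0, 1.0], n_feat)       # never-presented prototype

def exemplar(p_flip=0.2):                     # distorted copy of the prototype
    e = proto.copy()
    flips = rng.random(n_feat - n_name) < p_flip
    e[n_name:][flips] *= -1
    return e

traces = np.array([exemplar() for _ in range(20)])

# Cue with the category name only (content features zeroed out).
cue = np.concatenate([proto[:n_name], np.zeros(n_feat - n_name)])
content, intensity = echo(cue, traces)
print("prototype feature agreement:",
      float(np.mean(np.sign(content[n_name:]) == proto[n_name:])))

Cued with the name alone, the echo's content features agree with the never-presented prototype on nearly all dimensions, which is the schema-abstraction effect the abstract reports.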
Judgments of frequency and recognition memory in a multiple-trace memory model (Tech. Rep.)
- University of Oregon Cognitive Science Program
, 1986
"... The multiple-trace simulation model, MINERVA 2, was applied to a number of phenomena found in experiments on relative and absolute judgments of frequency, and forced-choice and yes-no recognition memory. How the basic model deals with effects of repetition, forgetting, list length, orientation task, ..."
Abstract
-
Cited by 300 (3 self)
The multiple-trace simulation model, MINERVA 2, was applied to a number of phenomena found in experiments on relative and absolute judgments of frequency, and forced-choice and yes-no recognition memory. It is shown how the basic model deals with effects of repetition, forgetting, list length, orientation task, selective retrieval, and similarity, and how a slightly modified version accounts for effects of contextual variability on frequency judgments. Two new experiments on similarity and recognition memory are presented, together with appropriate simulations; attempts to modify the model to deal with additional phenomena are also described. Questions related to the representation of frequency are addressed, and the model is evaluated and compared with related models of frequency judgments and recognition memory. Although memory for specific events (episodic memory) and memory for abstract concepts (generic memory) seem quite different intuitively, experimental evidence for different underlying systems is sparse (see McKoon, Ratcliff, & Dell, 1986; Ratcliff & McKoon, 1986; Tulving, 1986). One suggestion has been that the two systems are affected differently by repetition,…
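Frequency and recognition judgments use the other half of the same retrieval operation: not the echo's content but its intensity, the summed cubed similarity of all traces to the probe. A minimal sketch under assumptions of my own, namely an encoding probability L governing how faithfully each presentation is stored, and invented item frequencies.

import numpy as np

rng = np.random.default_rng(4)

# Echo intensity as a familiarity/frequency signal: items stored more
# often produce stronger summed (cubed) similarity to their probe.
def intensity(probe, traces):
    sims = traces @ probe / probe.size
    return float(np.sum(sims ** 3))

n_feat = 20
items = [rng.choice([-1.0, 1.0], n_feat) for _ in range(4)]
freqs = [0, 1, 2, 4]                     # presentations per item (invented)
L = 0.7                                  # P(feature encoded) per presentation
traces = np.array([item * (rng.random(n_feat) < L)
                   for item, f in zip(items, freqs) for _ in range(f)])

for item, f in zip(items, freqs):
    print("freq", f, "-> echo intensity", round(intensity(item, traces), 3))

Intensity grows with presentation frequency, so one criterion on the same signal can support yes-no recognition while a monotone mapping supports absolute frequency judgments.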
Methods for dealing with reaction time outliers
- Psychological Bulletin
, 1993
"... The effect of outliers on reaction time analyses is evaluated. The first section assesses the power of different methods of minimizing the effect of outliers on analysis of variance (ANOVA) and makes recommendations about the use of transformations and cutoffs. The second section examines the effect ..."
Abstract
-
Cited by 274 (6 self)
The effect of outliers on reaction time analyses is evaluated. The first section assesses the power of different methods of minimizing the effect of outliers on analysis of variance (ANOVA) and makes recommendations about the use of transformations and cutoffs. The second section examines the effect of outliers and cutoffs on different measures of location, spread, and shape and concludes, using quantitative examples, that robust measures are much less affected by outliers and cutoffs than measures based on moments. The third section examines fitting explicit distribution functions as a way of recovering means and standard deviations and concludes that unless fitting the distribution function is used as a model of distribution shape, the method is probably not worth routine use. Almost everyone who has analyzed reaction time data has been faced with the problem of what to do with outlier response times. Outliers are response times generated by processes that are not the ones being studied. The processes that generate outliers can be fast guesses, guesses that are based on the subject's estimate of the usual time to respond, multiple runs of the process that is actually under study, the subject's inattention, or…
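The practical contrast is easy to reproduce. The sketch below contaminates a skewed, plausible reaction time distribution with a few slow outliers and compares a raw mean, a fixed-cutoff mean, a median, and a mean computed through an inverse (1/RT) transformation; the distributions, cutoff value, and contamination rate are arbitrary choices for illustration.

import numpy as np

rng = np.random.default_rng(5)

rt = rng.lognormal(mean=6.3, sigma=0.3, size=1000)   # ~550 ms typical trials
lapses = rng.uniform(2000, 5000, size=30)            # slow attention lapses
data = np.concatenate([rt, lapses])

trimmed = data[data < 1500]                          # fixed cutoff at 1500 ms
print("raw mean      :", round(float(data.mean()), 1))
print("cutoff mean   :", round(float(trimmed.mean()), 1))
print("median        :", round(float(np.median(data)), 1))      # robust location
print("inverse-based :", round(float(1e3 / np.mean(1e3 / data)), 1))

The median and the inverse-transformed mean barely move when the lapses are added, while the raw mean shifts substantially, which is the moment-based versus robust contrast the second section quantifies.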
Orthographic processing in visual word recognition: A multiple read-out model
- Psychological Review, 518–565
, 1996
"... A model of orthographic processing is described that postulates read-out from different information dimensions, determined by variable response criteria set on these dimensions. Performance in a perceptual identification task is simulated as the percentage of trials on which a noisy criterion set on ..."
Abstract
-
Cited by 266 (34 self)
A model of orthographic processing is described that postulates read-out from different information dimensions, determined by variable response criteria set on these dimensions. Performance in a perceptual identification task is simulated as the percentage of trials on which a noisy criterion set on the dimension of single word detector activity is reached. Two additional criteria set on the dimensions of total lexical activity and time from stimulus onset are hypothesized to be operational in the lexical decision task. These additional criteria flexibly adjust to changes in stimulus material and task demands, thus accounting for strategic influences on performance in this task. The model unifies results obtained in response-limited and data-limited paradigms and helps resolve a number of inconsistencies in the experimental literature that cannot be accommodated by other current models of visual word recognition. When skilled readers move their gaze across lines of printed text in order to make sense of letter sequences and spaces, it is very likely that for each word an elementary set of operations is repeated in the brain. These operations compute a form representation of the physical signal, match it with abstract representations stored in long-term memory, and select a (best) candidate for identification. This basic process, generally referred to as word recognition (although the terms word identification and lexical access are popular synonyms), has been one of the major issues in cognitive psychology in the last two decades (for review, see Carr & Pollatsek, 1985; Jacobs &…
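The three criteria can be written as one small decision rule. The sketch below is a loose paraphrase rather than the published model: word-detector activation trajectories arrive precomputed, "word" is read out when either the strongest single detector reaches the M criterion or summed lexical activity reaches the Sigma criterion, and "nonword" is given at the temporal deadline T. All trajectories and criterion values are invented.

import numpy as np

rng = np.random.default_rng(6)

def lexical_decision(unit_acts, M=0.8, Sigma=5.0, T=50):
    # unit_acts: array of shape (time_steps, n_word_units)
    for t, acts in enumerate(unit_acts[:T]):
        if acts.max() >= M or acts.sum() >= Sigma:
            return "word", t          # M or Sigma criterion reached
    return "nonword", T               # temporal deadline reached

steps, n_units = 60, 10
# A word-like stimulus drives one detector strongly; a nonword-like one
# only weakly and diffusely activates several neighbors.
word = np.cumsum(rng.normal(0.03, 0.01, (steps, n_units)), axis=0)
word[:, 0] += np.linspace(0.0, 1.0, steps)
nonword = np.cumsum(rng.normal(0.005, 0.01, (steps, n_units)), axis=0)

print(lexical_decision(word))     # fast "word" response
print(lexical_decision(nonword))  # "nonword" at the deadline

Because the Sigma and T criteria are the ones that shift with stimulus material and task demands, the rule reproduces the strategic flexibility the abstract attributes to the lexical decision task.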
Decision field theory: A dynamic-cognitive approach to decision making (Tech. Rep.)
, 1989
"... Decision field theory provides for a mathematical foundation leading to a dynamic, stochastic theory of decision behavior in an uncertain environment. This theory is used to explain (a) viola-tions of stochastic dominance, (b) violations of strong stochastic transitivity, (c) violations of inde-pend ..."
Abstract
-
Cited by 264 (14 self)
Decision field theory provides for a mathematical foundation leading to a dynamic, stochastic theory of decision behavior in an uncertain environment. This theory is used to explain (a) violations of stochastic dominance, (b) violations of strong stochastic transitivity, (c) violations of independence between alternatives, (d) serial position effects on preference, (e) speed-accuracy trade-off effects in decision making, (f) the inverse relation between choice probability and decision time, (g) changes in the direction of preference under time pressure, (h) slower decision times for avoidance as compared with approach conflicts, and (i) preference reversals between choice and selling price measures of preference. The proposed theory is compared with 4 other theories of decision making under uncertainty. Beginning with von Neumann and Morgenstern's (1947) classic expected utility theory, steady progress has been made in the development of formal theories of decision making under risk and uncertainty. For rational theorists, the goal has been to formulate a logical foundation for representing the preferences of an ideal decision maker (e.g., Machina, 1982; Savage,…
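A stripped-down two-alternative version shows the theory's dynamic, stochastic character. In the sketch below, attention switches probabilistically between two payoff states, the momentary valence difference feeds a leaky preference state, and the first threshold crossing yields both a choice and a decision time. The payoff matrix, attention weights, and all parameters are invented for illustration.

import numpy as np

rng = np.random.default_rng(7)

payoffs = np.array([[4.0, 1.0],   # action A: payoff in state 1, state 2
                    [2.0, 3.0]])  # action B
w = np.array([0.6, 0.4])          # long-run attention weights over states
s = 0.05                          # growth-decay (leakage) rate
theta = 10.0                      # choice threshold on the preference state

def dft_trial(max_steps=5000):
    P = 0.0                                   # preference for A over B
    for step in range(max_steps):
        state = rng.choice(2, p=w)            # momentary focus of attention
        valence = payoffs[0, state] - payoffs[1, state]
        P = (1 - s) * P + valence + rng.normal(0.0, 1.0)
        if abs(P) >= theta:
            return ("A" if P > 0 else "B"), step + 1
    return ("A" if P > 0 else "B"), max_steps

choices, times = zip(*(dft_trial() for _ in range(2000)))
print("P(choose A) =", float(np.mean([c == "A" for c in choices])))
print("mean steps  =", round(float(np.mean(times)), 1))

Because preference accumulates over time, the same machinery that sets choice probability also sets decision time, which is how the theory reaches time-dependent phenomena such as (e) through (h) that static utility theories leave out.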