Results 1 - 10 of 700
An integrated theory of the mind
- Psychological Review, 2004
"... There has been a proliferation of proposed mental modules in an attempt to account for different cognitive functions but so far there has been no successful account of their integration. ACT-R (Anderson & Lebiere, 1998) has evolved into a theory that consists of multiple modules but also explain ..."
Cited by 780 (73 self)
Abstract:
There has been a proliferation of proposed mental modules in an attempt to account for different cognitive functions, but so far there has been no successful account of their integration. ACT-R (Anderson & Lebiere, 1998) has evolved into a theory that consists of multiple modules and also explains how they are integrated to produce coherent cognition. The perceptual-motor modules, the goal module, and the declarative memory module are presented as examples of specialized systems in ACT-R. These modules are associated with distinct cortical regions. They place chunks in buffers, where the chunks can be detected by a production system that responds to patterns of information in the buffers. At any point in time, a single production rule is selected to respond to the current pattern. Subsymbolic processes serve to guide the selection of rules to fire as well as the internal operations of some modules. Much of learning involves tuning of these subsymbolic processes. Empirical examples are presented that illustrate the predictions of ACT-R’s modules. In addition, two models of complex tasks are described to illustrate how these modules yield strong predictions when they are brought together. One of these models is concerned with complex patterns of behavioral data in a dynamic task and the other with fMRI data obtained in a study of symbol manipulation.
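A minimal sketch of the match-select-fire cycle the abstract describes: productions test patterns over buffer contents, exactly one rule fires per cycle, and a utility value stands in for ACT-R's subsymbolic conflict resolution. All names, values, and the toy counting task below are invented for illustration; this is not ACT-R's actual code.

# Toy production cycle (hypothetical names and task, not ACT-R itself).
buffers = {"goal": {"task": "count", "current": 3}, "retrieval": None}

productions = [
    {   # request the next number from declarative memory
        "name": "request-next",
        "match": lambda b: b["goal"]["task"] == "count" and b["retrieval"] is None,
        "fire": lambda b: b.update(retrieval={"number": b["goal"]["current"] + 1}),
        "utility": 2.0,   # subsymbolic quantity guiding rule selection
    },
    {   # harvest the retrieved chunk into the goal and clear the buffer
        "name": "harvest",
        "match": lambda b: b["retrieval"] is not None,
        "fire": lambda b: (b["goal"].update(current=b["retrieval"]["number"]),
                           b.update(retrieval=None)),
        "utility": 1.0,
    },
]

def cycle(b):
    # Find productions whose conditions match the current buffer pattern;
    # exactly one rule fires per cycle, the one with the highest utility.
    candidates = [p for p in productions if p["match"](b)]
    if not candidates:
        return False
    max(candidates, key=lambda p: p["utility"])["fire"](b)
    return True

while buffers["goal"]["current"] < 5 and cycle(buffers):
    pass
print(buffers["goal"])  # {'task': 'count', 'current': 5}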
The lexical nature of syntactic ambiguity resolution
- Psychological Review, 1994
"... Ambiguity resolution is a central problem in language comprehension. Lexical and syntactic ambiguities are standardly assumed to involve different types of knowledge representations and be resolved by different mechanisms. An alternative account is provided in which both types of ambiguity derive fr ..."
Cited by 557 (24 self)
Abstract:
Ambiguity resolution is a central problem in language comprehension. Lexical and syntactic ambiguities are standardly assumed to involve different types of knowledge representations and be resolved by different mechanisms. An alternative account is provided in which both types of ambiguity derive from aspects of lexical representation and are resolved by the same processing mechanisms. Reinterpreting syntactic ambiguity resolution as a form of lexical ambiguity resolution obviates the need for special parsing principles to account for syntactic interpretation preferences, reconciles a number of apparently conflicting results concerning the roles of lexical and contextual information in sentence processing, explains differences among ambiguities in terms of ease of resolution, and provides a more unified account of language comprehension than was previously available. One of the principal goals for a theory of language comprehension is to explain how the reader or listener copes with a pervasive ambiguity problem. Languages are structured at multiple levels simultaneously, including lexical, phonological, morphological, syntactic, and text or discourse levels. At any ... third section we consider processing issues: how information is processed within the mental lexicon and how contextual information can influence processing. The central processing mechanism we invoke is the constraint satisfaction process that has been realized in interactive-activation models (e.g., Elman & ...
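A toy sketch of the constraint-satisfaction process the abstract invokes, in the interactive-activation style: candidate interpretations of an ambiguity compete for a fixed pool of activation, each supported by weighted constraints, until one dominates. The readings, weights, and numbers are invented for illustration, not taken from the paper.

# Toy interactive-activation competition (all values invented).
w_freq, w_ctx = 0.8, 0.2                    # constraint weights
evidence = {
    "main-verb reading":        w_freq * 0.9 + w_ctx * 0.1,
    "reduced-relative reading": w_freq * 0.1 + w_ctx * 0.9,
}
activation = {reading: 0.5 for reading in evidence}

for _ in range(10):
    # Move each candidate toward its weighted evidence, then renormalize
    # so the alternatives compete: gain in one is loss in the other.
    for r in activation:
        activation[r] += 0.3 * (evidence[r] - activation[r])
    total = sum(activation.values())
    activation = {r: a / total for r, a in activation.items()}

print(activation)  # the better-supported reading ends up dominant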
Linguistic Complexity: Locality of Syntactic Dependencies
- Cognition, 1998
"... This paper proposes a new theory of the relationship between the sentence processing mechanism and the available computational resources. This theory -- the Syntactic Prediction Locality Theory (SPLT) -- has two components: an integration cost component and a component for the memory cost associa ..."
Cited by 504 (31 self)
Abstract:
This paper proposes a new theory of the relationship between the sentence processing mechanism and the available computational resources. This theory -- the Syntactic Prediction Locality Theory (SPLT) -- has two components: an integration cost component and a component for the memory cost associated with keeping track of obligatory syntactic requirements. Memory cost is ...
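The abstract is truncated above, but its two-component cost structure can be stated schematically. As a hedged sketch (the functional forms f and g are illustrative placeholders, not the paper's definitions), the processing cost at a word w sums an integration cost that depends on the distance d(w) back to the element being attached and a memory cost accumulated over the syntactic head predictions still pending:

\[
C(w) \;=\; \underbrace{f\big(d(w)\big)}_{\text{integration cost}} \;+\; \underbrace{\sum_{h \,\in\, \mathrm{pending}(w)} g(t_h)}_{\text{memory cost}}
\]

Here t_h is how long prediction h has had to be maintained. Center-embedded structures are expensive on both terms at once: attachments span long distances while several predictions must be held simultaneously.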
What memory is for
, 1997
"... Let’s start from scratch in thinking about what memory is for, and consequently, how it works. Suppose that memory and conceptualization work in the service of perception and action. In this case, conceptualization is the encoding of patterns of possible physical interaction with a three-dimensiona ..."
Cited by 396 (5 self)
Abstract:
Let’s start from scratch in thinking about what memory is for, and consequently, how it works. Suppose that memory and conceptualization work in the service of perception and action. In this case, conceptualization is the encoding of patterns of possible physical interaction with a three-dimensional world. These patterns are constrained by the structure of the environment, the structure of our bodies, and memory. Thus, how we perceive and conceive of the environment is determined by the types of bodies we have. Such a memory would not have associations. Instead, how concepts become related (and what it means to be related) is determined by how separate patterns of actions can be combined given the constraints of our bodies. I call this combination “mesh.” To avoid hallucination, conceptualization would normally be driven by the environment, and patterns of action from memory would play a supporting, but automatic, role. A significant human skill is learning to suppress the overriding contribution of the environment to conceptualization, thereby allowing memory to guide conceptualization. The effort used in suppressing input from the environment pays off by allowing prediction, recollective memory, and language comprehension. I review theoretical work in cognitive science and empirical work in memory and language comprehension that suggest that it may be possible to investigate connections between topics as disparate as infantile amnesia and mental-model theory.
Memory for goals: an activation-based model
, 2002
"... Goal-directed cognition is often discussed in terms of specialized memory structures like the "goal stack." The goal-activation model presented here analyzes goal-directed cognition in terms of the general memory constructs of activation and associative priming. The model embodies three pr ..."
Cited by 188 (37 self)
Abstract:
Goal-directed cognition is often discussed in terms of specialized memory structures like the "goal stack." The goal-activation model presented here analyzes goal-directed cognition in terms of the general memory constructs of activation and associative priming. The model embodies three predictive constraints: (1) the interference level, which arises from residual memory for old goals; (2) the strengthening constraint, which makes predictions about the time to encode a new goal; and (3) the priming constraint, which makes predictions about the role of cues in retrieving pending goals. These constraints are formulated algebraically and tested through simulation of latency and error data from the Tower of Hanoi, a means-ends puzzle that depends heavily on the suspension and resumption of goals. Implications of the model for understanding intention superiority, postcompletion error, and effects of task interruption are discussed.
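A hedged sketch of the decay-plus-priming account the abstract outlines, using the standard ACT-R base-level learning form for the history term. The parameter values, usage histories, and cue strengths below are invented for illustration, not the paper's fitted model.

import math

def base_level(ages, d=0.5):
    # History term: each past strengthening at age t contributes t**(-d),
    # so recently and frequently strengthened goals are more active.
    return math.log(sum(t ** -d for t in ages))

def activation(ages, cue_strengths):
    # Total activation = decaying history + associative priming from cues.
    return base_level(ages) + sum(cue_strengths)

# A long-suspended goal vs. a just-encoded one: the new goal must be
# strengthened above the interference level left by the old goal's residue,
# and a retrieval cue can prime a pending goal back above its rivals.
old_goal = activation(ages=[30.0, 60.0, 90.0], cue_strengths=[0.0])
new_goal = activation(ages=[1.0, 2.0], cue_strengths=[0.5])
print(old_goal, new_goal)  # the most active goal directs behavior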
Processing Capacity Defined by Relational Complexity: Implications for Comparative, Developmental, and Cognitive Psychology
, 1989
"... It is argued that working memory limitations are best defined in terms of the complexity of relations that can be processed in parallel. Relational complexity is related to processing loads in problem solving, and discriminates between higher animal species, as well as between children of differen ..."
Cited by 182 (14 self)
Abstract:
It is argued that working memory limitations are best defined in terms of the complexity of relations that can be processed in parallel. Relational complexity is related to processing loads in problem solving, and discriminates between higher animal species, as well as between children of different ages. Complexity is defined by the number of dimensions, or sources of variation, that are related. A unary relation has one argument and one source of variation, because its argument can be instantiated in only one way at a time. A binary relation has two arguments, and two sources of variation, because two argument instantiations are possible at once. Similarly, a ternary relation is three dimensional, a quaternary relation is four dimensional, and so on. Dimensionality is related to number of chunks, because both attributes on dimensions and chunks are independent units of information of arbitrary size. Empirical studies of working memory limitations indicate a soft limit which corresponds to processing one quaternary relation in parallel. More complex concepts are processed by segmentation or conceptual chunking. Segmentation entails breaking tasks into components which do not exceed processing capacity, and which are processed serially. Conceptual chunking entails "collapsing" representations to reduce their dimensionality and consequently their processing load, but at the cost of making some relational information inaccessible. Parallel distributed processing implementations of relational representations show that relations with more arguments entail a higher computational cost, which corresponds to empirical observations of higher processing loads in humans. Empirical evidence is presented that relational complexity discriminates between higher species...
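A toy rendering of the complexity metric just described: a relation's dimensionality is its number of argument slots, and conceptual chunking reduces dimensionality at the cost of hiding the chunked-away structure. The example relation is invented for illustration.

def dimensionality(args):
    # unary -> 1, binary -> 2, ternary -> 3, quaternary -> 4, ...
    return len(args)

# proportion(a, b, c, d), meaning a/b = c/d: four sources of variation must
# be considered jointly, so the relation is four-dimensional, at the
# abstract's proposed soft limit for parallel processing.
proportion = ("a", "b", "c", "d")
print(dimensionality(proportion))      # 4

# Conceptual chunking: collapse each ratio into a single chunk, leaving a
# binary equality relation. Processing load drops, but the relational
# information inside each chunk becomes inaccessible while chunked.
chunked = ("ratio(a,b)", "ratio(c,d)")
print(dimensionality(chunked))         # 2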
Localization of syntactic comprehension by positron emission tomography
- Brain and Language, 1996
"... Positron Emission Tomography (PET) was used to determine regional cerebral blood flow (rCBF) when eight normal right-handed males read and made acceptabil-ity judgments about sentences. rCBF was greater in Broca’s area (particularly in the pars opercularis) when subjects judged the semantic plausibi ..."
Cited by 178 (4 self)
Abstract:
Positron Emission Tomography (PET) was used to determine regional cerebral blood flow (rCBF) when eight normal right-handed males read and made acceptability judgments about sentences. rCBF was greater in Broca’s area (particularly in the pars opercularis) when subjects judged the semantic plausibility of syntactically more complex sentences as compared to syntactically less complex sentences. rCBF was greater in left perisylvian language areas when subjects had to decide whether sentences were semantically plausible than when subjects had to decide whether syntactically identical sentences contained a nonsense word. The results of this experiment suggest that overall sentence processing occurs in regions of the left perisylvian association cortex. The results also provide evidence that one particular aspect of sentence processing (the process that corresponds to the greater difficulty of comprehending center-embedded than right-branching relative clause sentences) is centered in the pars opercularis of Broca’s area. This process is likely to be related to the greater memory load associated with processing center-embedded sentences.
Toward a connectionist model of recursion in human linguistic performance
- Cognitive Science, 1999
"... Naturally occurring speech contains only a limited amount of complex recursive structure, and this is reflected in the empirically documented difficulties that people experience when processing such structures. We present a connectionist model of human performance in processing recursive language s ..."
Cited by 170 (21 self)
Abstract:
Naturally occurring speech contains only a limited amount of complex recursive structure, and this is reflected in the empirically documented difficulties that people experience when processing such structures. We present a connectionist model of human performance in processing recursive language structures. The model is trained on simple artificial languages. We find that the qualitative performance profile of the model matches human behavior, both on the relative difficulty of center-embedding and cross-dependency and on the contrast between the processing of these complex recursive structures and right-branching recursive constructions. We analyze how these differences in performance are reflected in the internal representations of the model by performing discriminant analyses on these representations both before and after training. Furthermore, we show how a network trained to process recursive structures can also generate such structures in a probabilistic fashion. This work suggests a novel explanation of people's limited recursive performance, without assuming the existence of a mentally represented competence grammar allowing unbounded recursion.
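A minimal sketch of the kind of simple recurrent network such connectionist models of recursion are built on: the hidden state is a graded, limited memory that must stand in for a stack when symbols are center-embedded. The toy vocabulary, dimensions, and initialization below are invented, and the network is untrained; this shows the architecture only, not the authors' model.

import numpy as np

rng = np.random.default_rng(0)
vocab = ["N1", "N2", "V2", "V1", "EOS"]   # toy symbols for an embedded string
V, H = len(vocab), 10                      # vocabulary and hidden-layer sizes

W_xh = rng.normal(0, 0.1, (H, V))          # input -> hidden
W_hh = rng.normal(0, 0.1, (H, H))          # previous hidden (context) -> hidden
W_hy = rng.normal(0, 0.1, (V, H))          # hidden -> next-symbol scores

def step(x, h_prev):
    # The recurrent hidden state carries the graded memory that replaces an
    # explicit stack when processing embedded dependencies.
    h = np.tanh(W_xh @ x + W_hh @ h_prev)
    scores = W_hy @ h
    e = np.exp(scores - scores.max())
    return h, e / e.sum()                  # distribution over next symbols

h = np.zeros(H)
for sym in ["N1", "N2", "V2", "V1"]:       # a center-embedded toy sentence
    x = np.eye(V)[vocab.index(sym)]
    h, probs = step(x, h)
print(dict(zip(vocab, probs.round(3))))    # sampling from this distribution is
                                           # how such a net generates structure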
An Activation-Based Model of Sentence Processing as Skilled Memory Retrieval
, 2005
"... We present a detailed process theory of the moment-by-moment working-memory retrievals and associated control structure that subserve sentence comprehension. The theory is derived from the application of independently motivated principles of memory and cognitive skill to the specialized task of sent ..."
Cited by 161 (36 self)
Abstract:
We present a detailed process theory of the moment-by-moment working-memory retrievals and associated control structure that subserve sentence comprehension. The theory is derived from the application of independently motivated principles of memory and cognitive skill to the specialized task of sentence parsing. The resulting theory construes sentence processing as a series of skilled associative memory retrievals modulated by similarity-based interference and fluctuating activation. The cognitive principles are formalized in computational form in the Adaptive Control of Thought–Rational (ACT–R) architecture, and our process model is realized in ACT–R. We present the results of 6 sets of simulations: 5 simulation sets provide quantitative accounts of the effects of length and structural interference on both unambiguous and garden-path structures. A final simulation set provides a graded taxonomy of double center embeddings ranging from relatively easy to extremely difficult. The explanation of center-embedding difficulty is a novel one that derives from the model’s complete reliance on discriminating retrieval cues in the absence of an explicit representation of serial order information. All fits were obtained with only 1 free scaling parameter fixed across the simulations; all other parameters were ACT–R defaults. The modeling results support the hypothesis that fluctuating activation and similarity-based interference are the key factors shaping working memory in sentence processing. We contrast the theory and empirical predictions with several related accounts of sentence-processing complexity.
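Stated schematically, the model rests on ACT-R's declarative retrieval equation, which combines a time-decaying base-level term (the fluctuating activation) with cue-weighted associative support whose dilution across similar chunks produces similarity-based interference (the paper's exact parameterization may differ):

\[
A_i \;=\; B_i \;+\; \sum_{j} W_j\, S_{ji} \;+\; \epsilon,
\qquad
B_i \;=\; \ln\!\Big(\sum_{k} t_k^{-d}\Big)
\]

Here B_i decays with the times t_k since chunk i was created or used (decay rate d), W_j is the attentional weight on retrieval cue j, S_{ji} is the associative strength from cue j to chunk i (weaker when the cue is associated with many similar chunks), and \epsilon is transient noise; the highest-activation chunk wins the retrieval, and its activation determines latency.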
Variability of worked examples and transfer of geometrical problem-solving skills: A cognitive-load approach
- Journal of Educational Psychology, 1994
"... Four computer-based training strategies for geometrical problem solving in the domain of computer numerically controlled machinery programming were studied with regard to their effects on training performance, transfer performance, and cognitive load. A low- and a high-variability conventional condi ..."
Cited by 149 (30 self)
Abstract:
Four computer-based training strategies for geometrical problem solving in the domain of computer numerically controlled machinery programming were studied with regard to their effects on training performance, transfer performance, and cognitive load. A low- and a high-variability conventional condition, in which conventional practice problems had to be solved (followed by worked examples), were compared with a low- and a high-variability worked condition, in which worked examples had to be studied. Results showed that students who studied worked examples gained most from high-variability examples, invested less time and mental effort in practice, and attained better and less effort-demanding transfer performance than students who first attempted to solve conventional problems and then studied worked examples. In complex cognitive domains such as mathematics, physics, or computer programming, problem solutions can often be characterized by a hierarchical goal structure. The goal of these solutions can be attained only by successfully attaining all subgoals. Learning and performance of complex cognitive tasks are typically constrained by limited processing capacity ...