Results 1 - 10 of 108
A rose by any other name: Long term memory structure and sentence processing
- Journal of Memory and Language
, 1999
"... The effects of sentential context and semantic memory structure during on-line sentence processing were examined by recording event-related brain potentials as individuals read pairs of sentences for comprehension. The first sentence established an expectation for a particular exemplar of a semantic ..."
Cited by 97 (14 self)
The effects of sentential context and semantic memory structure during on-line sentence processing were examined by recording event-related brain potentials as individuals read pairs of sentences for comprehension. The first sentence established an expectation for a particular exemplar of a semantic category, while the second ended with (1) that expected exemplar, (2) an unexpected exemplar from the same (expected) category, or (3) an unexpected item from a different (unexpected) category. Expected endings elicited a positivity between 250 and 550 ms while all unexpected endings elicited an N400, which was significantly smaller to items from the expected category. This N400 reduction varied with the strength of the contextually induced expectation: unexpected, categorically related endings elicited smaller N400s in more constraining contexts, despite their poorer fit to context (lower plausibility). This pattern of effects is best explained as reflecting the impact of context-independent long-term memory structure on sentence processing. The results thus suggest that physical and functional similarities that hold between objects in the world—i.e., category structure—influence neural organization and, in turn, routine language comprehension processes. © 1999 Academic Press Key Words: sentence processing; categorization; event-related potentials; N400.
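For readers who want to relate the 250-550 ms window reported above to a concrete analysis step, the sketch below shows one conventional way such an effect could be quantified with MNE-Python: epoch the EEG around the sentence-final word, average per condition, and take the mean amplitude over centro-parietal channels in the N400 window. This is a minimal illustration, not the authors' pipeline; the file name, event codes, and channel picks are assumptions for the example only.

```python
import mne

raw = mne.io.read_raw_fif("sentence_reading_raw.fif", preload=True)  # hypothetical recording
events = mne.find_events(raw)

# Hypothetical event codes for the three sentence-ending conditions described above.
event_id = {"expected": 1, "same_category": 2, "different_category": 3}

epochs = mne.Epochs(raw, events, event_id, tmin=-0.1, tmax=0.9,
                    baseline=(None, 0), preload=True)

picks = ["Cz", "CPz", "Pz"]  # centro-parietal sites where the N400 is typically largest

def mean_amplitude(evoked, tmin=0.25, tmax=0.55):
    """Mean amplitude (microvolts) over the 250-550 ms window and selected channels."""
    data = evoked.copy().pick(picks).crop(tmin, tmax).data  # (n_channels, n_times), in volts
    return data.mean() * 1e6

for condition in event_id:
    print(f"{condition}: {mean_amplitude(epochs[condition].average()):.2f} uV (250-550 ms)")
```

A larger (less negative) value for expected endings and a graded reduction for same-category endings would correspond to the pattern the abstract describes.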
N400-like Magnetoencephalography Responses Modulated by Semantic Context, Word Frequency, and Lexical Class in Sentences
, 2002
"... Words have been found to elicit a negative potential at the scalp peaking at �400 ms that is strongly modulated by semantic context. The current study used whole-head magnetoencephalography (MEG) as male subjects read sentences ending with semantically congruous or incongruous words. Compared with c ..."
Cited by 59 (4 self)
Words have been found to elicit a negative potential at the scalp peaking at ~400 ms that is strongly modulated by semantic context. The current study used whole-head magnetoencephalography (MEG) as male subjects read sentences ending with semantically congruous or incongruous words. Compared with congruous words, sentence-terminal incongruous words consistently evoked a large magnetic field over the left hemisphere, peaking at ~450 ms. Source modeling at this latency with conventional equivalent current dipoles (ECDs) placed the N400m generator in or near the left superior temporal sulcus. A distributed solution constrained to the cortical surface suggested a sequence of differential activation, beginning in Wernicke’s area at ~250 ms, spreading to anterior temporal sites at ~270 ms, to Broca’s area by ...
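The equivalent-current-dipole (ECD) modeling mentioned in this abstract is available in MNE-Python; the heavily hedged sketch below outlines the general approach of fitting a single dipole to the incongruous-minus-congruous difference field around the N400m peak. The evoked files, noise covariance, BEM solution, and coregistration transform are hypothetical placeholders, and a real fit requires subject-specific MRI anatomy.

```python
import mne

# Hypothetical averaged MEG responses for the two sentence-ending conditions.
evoked_incong = mne.read_evokeds("incongruous-ave.fif", condition=0)
evoked_cong = mne.read_evokeds("congruous-ave.fif", condition=0)

# Incongruous-minus-congruous difference field (the N400m effect).
diff = mne.combine_evoked([evoked_incong, evoked_cong], weights=[1, -1])

# Ingredients for source modeling (all file names are placeholders).
noise_cov = mne.read_cov("noise-cov.fif")
bem = mne.read_bem_solution("subject-bem-sol.fif")
trans = "subject-trans.fif"  # MEG/head-to-MRI coregistration

# Fit one equivalent current dipole per time point in a window around the ~450 ms peak.
dip, residual = mne.fit_dipole(diff.copy().crop(0.40, 0.50), noise_cov, bem, trans)
best = dip.gof.argmax()
print(dip.pos[best], dip.gof[best])  # best-fitting position (head coords) and goodness of fit
```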
The Brain Basis of Syntactic Processes: Functional Imaging and Lesion Studies
- NeuroImage
"... Language comprehension can be subdivided into three processing steps: initial structure building, semantic integration, and late syntactic integration. The two syntactic processing phases are correlated with two distinct components in the event-related brain potential, namely an early left anterior ..."
Cited by 51 (1 self)
Language comprehension can be subdivided into three processing steps: initial structure building, semantic integration, and late syntactic integration. The two syntactic processing phases are correlated with two distinct components in the event-related brain potential, namely an early left anterior negativity (ELAN) and a late centroparietal positivity (P600). Moreover, ERP findings from healthy adults suggest that early structure-building processes as reflected by the ELAN are independent of semantic processes. fMRI results have revealed that semantic and syntactic processes are supported by separable temporofrontal networks, with the syntactic processes involving the left superior temporal gyrus (STG), the left frontal operculum, and the basal ganglia (BG) in particular. MEG data from healthy adults have indicated that the left anterior temporal region and the left inferior frontal region subserve the early structure building processes. ERP data from patients with lesions in the left anterior temporal region and from patients with lesions in the left inferior frontal gyrus support this view, as these patients do not demonstrate an ELAN, although they do demonstrate a P600. Further results from patients with BG dysfunction suggest that parts of this subcortical structure are involved in late syntactic integrational processes. The data from the different experiments lead to the notion of separable brain systems responsible for early and late syntactic processes, with the former being subserved by the inferior frontal gyrus and the anterior STG and the latter being supported by the BG and more posterior portions of the STG. © 2003 Elsevier Inc. All rights reserved.
Electrophysiological evidence for early contextual influences during spoken-word recognition: N200 versus N400 effects
- Journal of Cognitive Neuroscience
, 2001
"... & An event-related brain potential experiment was carried out to investigate the time course of contextual influences on spoken-word recognition. Subjects were presented with spoken sentences that ended with a word that was either (a) congruent, (b) semantically anomalous, but beginning with the ..."
Cited by 48 (6 self)
& An event-related brain potential experiment was carried out to investigate the time course of contextual influences on spoken-word recognition. Subjects were presented with spoken sentences that ended with a word that was either (a) congruent, (b) semantically anomalous, but beginning with the same initial phonemes as the congruent completion, or (c) semantically anomalous beginning with phonemes that differed from the congruent completion. In addition to finding an N400 effect in the two semantically anomalous conditions, we obtained an early negative effect in the semantically anomalous condition where word onset differed from that of the congruent completions. It was concluded that the N200 effect is related to the lexical selection process, where word-form information resulting from an initial phonological analysis and content information derived from the context interact. &
Meaning and modality: Influences of context, semantic memory organization, and perceptual predictability on picture processing
- Journal of Experimental Psychology: Learning, Memory, and Cognition
, 2001
"... Using event-related potentials (ERPs), the authors investigated the influences of sentence context, semantic memory organization, and perceptual predictability on picture processing. Participants read pairs of highly or weakly constraining sentences that ended with (a) the expected item, (b) an unex ..."
Cited by 45 (4 self)
Using event-related potentials (ERPs), the authors investigated the influences of sentence context, semantic memory organization, and perceptual predictability on picture processing. Participants read pairs of highly or weakly constraining sentences that ended with (a) the expected item, (b) an unexpected item from the expected semantic category, or (c) an unexpected item from an unexpected category. Pictures were unfamiliar in Experiment 1 but preexposed in Experiment 2. ERPs to pictures reflected both contextual fit and memory organization, as do ERPs to words in the same contexts (K. D. Federmeier & M. Kutas, 1999). However, different response patterns were observed to pictures than to words. Some of these arose from perceptual predictability differences, whereas others seem to reflect true modality-based differences in semantic feature activation. Although words and pictures may share semantic memory, the authors' results show that semantic processing is not amodal.
Who said what? An event-related potential investigation of source and item memory
- Journal of Experimental Psychology: Learning, Memory, and Cognition
, 1998
"... Event-related potentials (ERPs) were recorded uring recognition tasks for spoken words alone (items) or for both words and the voice of the speaker (sources). Neither performance nor ERP measures suggested that voice information was retrieved automatically during the item-recognition task. In both t ..."
Cited by 45 (4 self)
Event-related potentials (ERPs) were recorded during recognition tasks for spoken words alone (items) or for both words and the voice of the speaker (sources). Neither performance nor ERP measures suggested that voice information was retrieved automatically during the item-recognition task. In both tasks, correctly recognized old words elicited more positive ERPs than new words, beginning around 400 ms poststimulus onset. In the source task only, old words also elicited a focal prefrontal positivity beginning about 700 ms. The prefrontal task effect did not distinguish trials with accurate and inaccurate voice judgments and is interpreted as reflecting the search for voice information in memory. More posterior recording sites were sensitive to the successful recovery of voice or source information. The results indicate that word and voice information were retrieved hierarchically and distinguish retrieval attempt from retrieval success.
When Language Meets Action: The Neural Integration of Gesture and Speech
- Cerebral Cortex
, 2006
"... Although generally studied in isolation, language and action often co-occur in everyday life. Here we investigated one particular form of simultaneous language and action, namely speech and gestures that speakers use in everyday communication. In a functional magnetic resonance imaging study, we ide ..."
Cited by 43 (7 self)
Although generally studied in isolation, language and action often co-occur in everyday life. Here we investigated one particular form of simultaneous language and action, namely speech and gestures that speakers use in everyday communication. In a functional magnetic resonance imaging study, we identified the neural networks involved in the integration of semantic information from speech and gestures. Verbal and/or gestural content could be integrated easily or less easily with the content of the preceding part of speech. Premotor areas involved in action observation (Brodmann area [BA] 6) were found to be specifically modulated by action information "mismatching" to a language context. Importantly, an increase in integration load of both verbal and gestural information into prior speech context activated Broca’s area and adjacent cortex (BA 45/47). A classical language area, Broca’s area, is not only recruited for language-internal processing but also when action observation is integrated with speech. These findings provide direct evidence that action and language processing share a high-level neural integration system.
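As a rough illustration of the kind of contrast that can reveal such integration-load effects, the sketch below fits a first-level GLM with nilearn and contrasts a high-integration-load ("mismatch") condition against an easy ("match") one. The file name, condition labels, TR, event timings, and threshold are assumptions for the example, not the authors' design or pipeline.

```python
import pandas as pd
from nilearn.glm.first_level import FirstLevelModel
from nilearn import plotting

# Hypothetical event table: onsets/durations in seconds, one row per trial.
events = pd.DataFrame({
    "onset":      [12.0, 30.0, 48.0, 66.0],
    "duration":   [6.0, 6.0, 6.0, 6.0],
    "trial_type": ["match", "mismatch", "match", "mismatch"],
})

# Fit a standard first-level model to one (hypothetical) preprocessed run.
model = FirstLevelModel(t_r=2.0, hrf_model="spm", smoothing_fwhm=5)
model = model.fit("sub-01_task-gesture_bold.nii.gz", events=events)

# Higher integration load versus easy integration.
z_map = model.compute_contrast("mismatch - match", output_type="z_score")
plotting.plot_stat_map(z_map, threshold=3.1, title="mismatch > match (illustrative)")
```

In a design like the one described above, clusters in left inferior frontal cortex (BA 45/47) surviving such a contrast would correspond to the reported integration-load effect.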
An ERP study of the processing of subject and object relative clauses in Japanese
- Language and Cognitive Processes
, 2008
"... ow nl oa de d B ..."
Phonotactic knowledge and lexical–semantic processing in one-year-olds: Brain responses to words and nonsense words in picture contexts
- Journal of Cognitive Neuroscience
, 2005
"... & During their first year of life, infants not only acquire probabilistic knowledge about the phonetic, prosodic, and phonotactic organization of their native language, but also begin to establish first lexical–semantic representations. The present study investigated the sensitivity to phonotact ..."
Cited by 23 (1 self)
& During their first year of life, infants not only acquire probabilistic knowledge about the phonetic, prosodic, and phonotactic organization of their native language, but also begin to establish first lexical–semantic representations. The present study investigated the sensitivity to phonotactic regularities and its impact on semantic processing in 1-year-olds. We applied the method of event-related brain potentials to 12- and 19-month-old children and to an adult control group. While looking at pictures of known objects, subjects listened to spoken nonsense words that were phonotactically legal (pseudowords) or had phonotactically illegal word onsets (nonwords), or to real words that were either congruous or incongruous to the picture contents. In 19-month-olds and in adults, incongruous words and pseudowords, but not non-words, elicited an N400 known to ref lect mechanisms of semantic integration. For congruous words, the N400 was attenuated by semantic priming. In contrast, 12-month-olds did not show an N400 difference, neither between pseudo- and nonwords nor between incongruous and congruous words. Both 1-year-old groups and adults additionally displayed a lexical priming effect for congruous words, that is, a negativity starting around 100 msec after words onset. One-year-olds, moreover, displayed a phonotactic familiarity effect, that is, a widely distributed negativity starting around 250 msec in 19-month-olds but occurring later in 12-month-olds. The results imply that both lexical priming and phonotactic familiarity already affect the processing of acoustic stimuli in children at 12 months of age. In 19-month-olds, adult-like mechanisms of semantic integration are present in response to phonotactically legal, but not to phonotactically illegal, non-sense words, indicating that children at this age treat pseudo-words, but not nonwords, as potential word candidates. &