Peter Hagoort

Publications

  • Adank, P., Hagoort, P., & Bekkering, H. (2010). Imitation improves language comprehension. Psychological Science, 21, 1903-1909. doi:10.1177/0956797610389192.

    Abstract

    Humans imitate each other during social interaction. This imitative behavior streamlines social interaction and aids in learning to replicate actions. However, the effect of imitation on action comprehension is unclear. This study investigated whether vocal imitation of an unfamiliar accent improved spoken-language comprehension. Following a pretraining accent comprehension test, participants were assigned to one of six groups. The baseline group received no training, but participants in the other five groups listened to accented sentences, listened to and repeated accented sentences in their own accent, listened to and transcribed accented sentences, listened to and imitated accented sentences, or listened to and imitated accented sentences without being able to hear their own vocalizations. Posttraining measures showed that accent comprehension was most improved for participants who imitated the speaker’s accent. These results show that imitation may aid in streamlining interaction by improving spoken-language comprehension under adverse listening conditions.
  • Baggio, G., Choma, T., Van Lambalgen, M., & Hagoort, P. (2010). Coercion and compositionality. Journal of Cognitive Neuroscience, 22, 2131-2140. doi:10.1162/jocn.2009.21303.

    Abstract

    Research in psycholinguistics and in the cognitive neuroscience of language has suggested that semantic and syntactic integration are associated with different neurophysiologic correlates, such as the N400 and the P600 in the ERPs. However, only a handful of studies have investigated the neural basis of the syntax–semantics interface, and even fewer experiments have dealt with the cases in which semantic composition can proceed independently of the syntax. Here we looked into one such case—complement coercion—using ERPs. We compared sentences such as, “The journalist wrote the article” with “The journalist began the article.” The second sentence seems to involve a silent semantic element, which is expressed in the first sentence by the head of the VP “wrote the article.” The second type of construction may therefore require the reader to infer or recover from memory a richer event sense of the VP “began the article,” such as began writing the article, and to integrate that into a semantic representation of the sentence. This operation is referred to as “complement coercion.” Consistently with earlier reading time, eye tracking, and MEG studies, we found traces of such additional computations in the ERPs: Coercion gives rise to a long-lasting negative shift, which differs at least in duration from a standard N400 effect. Issues regarding the nature of the computation involved are discussed in the light of a neurocognitive model of language processing and a formal semantic analysis of coercion.
  • Bastiaansen, M. C. M., Magyari, L., & Hagoort, P. (2010). Syntactic unification operations are reflected in oscillatory dynamics during on-line sentence comprehension. Journal of Cognitive Neuroscience, 22, 1333-1347. doi:10.1162/jocn.2009.21283.

    Abstract

    There is growing evidence suggesting that synchronization changes in the oscillatory neuronal dynamics in the EEG or MEG reflect the transient coupling and uncoupling of functional networks related to different aspects of language comprehension. In this work, we examine how sentence-level syntactic unification operations are reflected in the oscillatory dynamics of the MEG. Participants read sentences that were either correct, contained a word category violation, or consisted of random word sequences devoid of syntactic structure. A time-frequency analysis of MEG power changes revealed three types of effects. The first type of effect was related to the detection of a (word category) violation in a syntactically structured sentence, and was found in the alpha and gamma frequency bands. A second type of effect was maximally sensitive to the syntactic manipulations: A linear increase in beta power across the sentence was present for correct sentences, was disrupted upon the occurrence of a word category violation, and was absent in syntactically unstructured random word sequences. We therefore relate this effect to syntactic unification operations. Thirdly, we observed a linear increase in theta power across the sentence for all syntactically structured sentences. The effects are tentatively related to the building of a working memory trace of the linguistic input. In conclusion, the data seem to suggest that syntactic unification is reflected by neuronal synchronization in the lower-beta frequency band.
  • Fournier, R., Gussenhoven, C., Jensen, O., & Hagoort, P. (2010). Lateralization of tonal and intonational pitch processing: An MEG study. Brain Research, 1328, 79-88. doi:10.1016/j.brainres.2010.02.053.

    Abstract

    An MEG experiment was carried out in order to compare the processing of lexical-tonal and intonational contrasts, based on the tonal dialect of Roermond (the Netherlands). A set of words with identical phoneme sequences but distinct pitch contours, which represented different lexical meanings or discourse meanings (statement vs. question), were presented to native speakers as well as to a control group of speakers of Standard Dutch, a non-tone language. The stimuli were arranged in a mismatch paradigm, under three experimental conditions: in the first condition (lexical), the pitch contour differences between standard and deviant stimuli reflected differences between lexical meanings; in the second condition (intonational), the stimuli differed in their discourse meaning; in the third condition (combined), they differed both in their lexical and discourse meaning. In all three conditions, native as well as non-native responses showed a clear MMNm (magnetic mismatch negativity) in a time window from 150 to 250 ms after the divergence point of standard and deviant pitch contours. In the lexical condition, a stronger response was found over the left temporal cortex of native as well as non-native speakers. In the intonational condition, the same activation pattern was observed in the control group, but not in the group of native speakers, who showed a right-hemisphere dominance instead. Finally, in the combined (lexical and intonational) condition, brain reactions appeared to represent the summation of the patterns found in the other two conditions. In sum, the lateralization of pitch processing is condition-dependent in the native group only, which suggests that language experience determines how processes should be distributed over both temporal cortices, according to the functions available in the grammar.
  • Groen, W. B., Tesink, C. M. J. Y., Petersson, K. M., Van Berkum, J. J. A., Van der Gaag, R. J., Hagoort, P., & Buitelaar, J. K. (2010). Semantic, factual, and social language comprehension in adolescents with autism: An fMRI study. Cerebral Cortex, 20(8), 1937-1945. doi:10.1093/cercor/bhp264.

    Abstract

    Language in high-functioning autism is characterized by pragmatic and semantic deficits, and people with autism have a reduced tendency to integrate information. Because the left and right inferior frontal (LIF and RIF) regions are implicated with integration of speaker information, world knowledge, and semantic knowledge, we hypothesized that abnormal functioning of the LIF and RIF regions might contribute to pragmatic and semantic language deficits in autism. Brain activation of sixteen 12- to 18-year-old, high-functioning autistic participants was measured with functional magnetic resonance imaging during sentence comprehension and compared with that of twenty-six matched controls. The content of the pragmatic sentence was congruent or incongruent with respect to the speaker characteristics (male/female, child/adult, and upper class/lower class). The semantic- and world-knowledge sentences were congruent or incongruent with respect to semantic expectancies and factual expectancies about the world, respectively. In the semantic-knowledge and world-knowledge condition, activation of the LIF region did not differ between groups. In sentences that required integration of speaker information, the autism group showed abnormally reduced activation of the LIF region. The results suggest that people with autism may recruit the LIF region in a different manner in tasks that demand integration of social information.
  • Junge, C., Hagoort, P., Kooijman, V., & Cutler, A. (2010). Brain potentials for word segmentation at seven months predict later language development. In K. Franich, K. M. Iserman, & L. L. Keil (Eds.), Proceedings of the 34th Annual Boston University Conference on Language Development. Volume 1 (pp. 209-220). Somerville, MA: Cascadilla Press.
  • Junge, C., Cutler, A., & Hagoort, P. (2010). Ability to segment words from speech as a precursor of later language development: Insights from electrophysiological responses in the infant brain. In M. Burgess, J. Davey, C. Don, & T. McMinn (Eds.), Proceedings of 20th International Congress on Acoustics, ICA 2010. Incorporating Proceedings of the 2010 annual conference of the Australian Acoustical Society (pp. 3727-3732). Australian Acoustical Society, NSW Division.
  • Kos, M., Vosse, T. G., Van den Brink, D., & Hagoort, P. (2010). About edible restaurants: Conflicts between syntax and semantics as revealed by ERPs. Frontiers in Psychology, 1, E222. doi:10.3389/fpsyg.2010.00222.

    Abstract

    In order to investigate conflicts between semantics and syntax, we recorded ERPs, while participants read Dutch sentences. Sentences containing conflicts between syntax and semantics (Fred eats in a sandwich…/ Fred eats a restaurant…) elicited an N400. These results show that conflicts between syntax and semantics do not necessarily lead to P600 effects and are in line with the processing competition account. According to this parallel account the syntactic and semantic processing streams are fully interactive and information from one level can influence the processing at another level. The relative strength of the cues of the processing streams determines which level is affected most strongly by the conflict. The processing competition account maintains the distinction between the N400 as index for semantic processing and the P600 as index for structural processing.
  • Noordzij, M. L., Newman-Norlund, S. E., De Ruiter, J. P., Hagoort, P., Levinson, S. C., & Toni, I. (2010). Neural correlates of intentional communication. Frontiers in Neuroscience, 4, E188. doi:10.3389/fnins.2010.00188.

    Abstract

    We know a great deal about the neurophysiological mechanisms supporting instrumental actions, i.e. actions designed to alter the physical state of the environment. In contrast, little is known about our ability to select communicative actions, i.e. actions directly designed to modify the mental state of another agent. We have recently provided novel empirical evidence for a mechanism in which a communicator selects his actions on the basis of a prediction of the communicative intentions that an addressee is most likely to attribute to those actions. The main novelty of those findings was that this prediction of intention recognition is cerebrally implemented within the intention recognition system of the communicator, and is modulated by the ambiguity in meaning of the communicative acts, not by their sensorimotor complexity. The characteristics of this predictive mechanism support the notion that human communicative abilities are distinct from both sensorimotor and linguistic processes.
  • Pijnacker, J., Geurts, B., Van Lambalgen, M., Buitelaar, J., & Hagoort, P. (2010). Exceptions and anomalies: An ERP study on context sensitivity in autism. Neuropsychologia, 48, 2940-2951. doi:10.1016/j.neuropsychologia.2010.06.003.

    Abstract

    Several studies have demonstrated that people with ASD and intact language skills still have problems processing linguistic information in context. Given this evidence for reduced sensitivity to linguistic context, the question arises how contextual information is actually processed by people with ASD. In this study, we used event-related brain potentials (ERPs) to examine context sensitivity in high-functioning adults with autistic disorder (HFA) and Asperger syndrome at two levels: at the level of sentence processing and at the level of solving reasoning problems. We found that sentence context as well as reasoning context had an immediate ERP effect in adults with Asperger syndrome, as in matched controls. Both groups showed a typical N400 effect and a late positive component for the sentence conditions, and a sustained negativity for the reasoning conditions. In contrast, the HFA group demonstrated neither an N400 effect nor a sustained negativity. However, the HFA group showed a late positive component which was larger for semantically anomalous sentences than congruent sentences. Because sentence context had a modulating effect in a later phase, semantic integration is perhaps less automatic in HFA, and presumably more elaborate processes are needed to arrive at a sentence interpretation.
  • De Ruiter, J. P., Noordzij, M. L., Newman-Norlund, S., Hagoort, P., Levinson, S. C., & Toni, I. (2010). Exploring the cognitive infrastructure of communication. Interaction studies, 11, 51-77. doi:10.1075/is.11.1.05rui.

    Abstract

    Human communication is often thought about in terms of transmitted messages in a conventional code like a language. But communication requires a specialized interactive intelligence. Senders have to be able to perform recipient design, while receivers need to be able to do intention recognition, knowing that recipient design has taken place. To study this interactive intelligence in the lab, we developed a new task that taps directly into the underlying abilities to communicate in the absence of a conventional code. We show that subjects are remarkably successful communicators under these conditions, especially when senders get feedback from receivers. Signaling is accomplished by the manner in which an instrumental action is performed, such that instrumentally dysfunctional components of an action are used to convey communicative intentions. The findings have important implications for the nature of the human communicative infrastructure, and the task opens up a line of experimentation on human communication.
  • Simanova, I., Van Gerven, M., Oostenveld, R., & Hagoort, P. (2010). Identifying object categories from event-related EEG: Toward decoding of conceptual representations. PLoS One, 5(12), E14465. doi:10.1371/journal.pone.0014465.

    Abstract

    Multivariate pattern analysis is a technique that allows the decoding of conceptual information such as the semantic category of a perceived object from neuroimaging data. Impressive single-trial classification results have been reported in studies that used fMRI. Here, we investigate the possibility to identify conceptual representations from event-related EEG based on the presentation of an object in different modalities: its spoken name, its visual representation and its written name. We used Bayesian logistic regression with a multivariate Laplace prior for classification. Marked differences in classification performance were observed for the tested modalities. Highest accuracies (89% correctly classified trials) were attained when classifying object drawings. In auditory and orthographical modalities, results were lower though still significant for some subjects. The employed classification method allowed for a precise temporal localization of the features that contributed to the performance of the classifier for three modalities. These findings could help to further understand the mechanisms underlying conceptual representations. The study also provides a first step towards the use of concept decoding in the context of real-time brain-computer interface applications.
  • Snijders, T. M., Petersson, K. M., & Hagoort, P. (2010). Effective connectivity of cortical and subcortical regions during unification of sentence structure. NeuroImage, 52, 1633-1644. doi:10.1016/j.neuroimage.2010.05.035.

    Abstract

    In a recent fMRI study we showed that left posterior middle temporal gyrus (LpMTG) subserves the retrieval of a word's lexical-syntactic properties from the mental lexicon (long-term memory), while left posterior inferior frontal gyrus (LpIFG) is involved in unifying (on-line integration of) this information into a sentence structure (Snijders et al., 2009). In addition, the right IFG, right MTG, and the right striatum were involved in the unification process. Here we report results from a psychophysiological interaction (PPI) analysis in which we investigated the effective connectivity between LpIFG and LpMTG during unification, and how the right hemisphere areas and the striatum are functionally connected to the unification network. LpIFG and LpMTG both showed enhanced connectivity during the unification process with a region slightly superior to our previously reported LpMTG. Right IFG better predicted right temporal activity when unification processes were more strongly engaged, just as LpIFG better predicted left temporal activity. Furthermore, the striatum showed enhanced coupling to LpIFG and LpMTG during unification. We conclude that bilateral inferior frontal and posterior temporal regions are functionally connected during sentence-level unification. Cortico-subcortical connectivity patterns suggest cooperation between inferior frontal and striatal regions in performing unification operations on lexical-syntactic representations retrieved from LpMTG.
  • Van Leeuwen, T. M., Petersson, K. M., & Hagoort, P. (2010). Synaesthetic colour in the brain: Beyond colour areas. A functional magnetic resonance imaging study of synaesthetes and matched controls. PLoS One, 5(8), E12074. doi:10.1371/journal.pone.0012074.

    Abstract

    Background: In synaesthesia, sensations in a particular modality cause additional experiences in a second, unstimulated modality (e.g., letters elicit colour). Understanding how synaesthesia is mediated in the brain can help to understand normal processes of perceptual awareness and multisensory integration. In several neuroimaging studies, enhanced brain activity for grapheme-colour synaesthesia has been found in ventral-occipital areas that are also involved in real colour processing. Our question was whether the neural correlates of synaesthetically induced colour and real colour experience are truly shared. Methodology/Principal Findings: First, in a free viewing functional magnetic resonance imaging (fMRI) experiment, we located main effects of synaesthesia in left superior parietal lobule and in colour related areas. In the left superior parietal lobe, individual differences between synaesthetes (projector-associator distinction) also influenced brain activity, confirming the importance of the left superior parietal lobe for synaesthesia. Next, we applied a repetition suppression paradigm in fMRI, in which a decrease in the BOLD (blood-oxygenated-level-dependent) response is generally observed for repeated stimuli. We hypothesized that synaesthetically induced colours would lead to a reduction in BOLD response for subsequently presented real colours, if the neural correlates were overlapping. We did find BOLD suppression effects induced by synaesthesia, but not within the colour areas. Conclusions/Significance: Because synaesthetically induced colours were not able to suppress BOLD effects for real colour, we conclude that the neural correlates of synaesthetic colour experience and real colour experience are not fully shared. We propose that synaesthetic colour experiences are mediated by higher-order visual pathways that lie beyond the scope of classical, ventral-occipital visual areas. Feedback from these areas, in which the left parietal cortex is likely to play an important role, may induce V4 activation and the percept of synaesthetic colour.
  • Willems, R. M., Hagoort, P., & Casasanto, D. (2010). Body-specific representations of action verbs: Neural evidence from right- and left-handers. Psychological Science, 21, 67-74. doi:10.1177/0956797609354072.

    Abstract

    According to theories of embodied cognition, understanding a verb like throw involves unconsciously simulating the action of throwing, using areas of the brain that support motor planning. If understanding action words involves mentally simulating one’s own actions, then the neurocognitive representation of word meanings should differ for people with different kinds of bodies, who perform actions in systematically different ways. In a test of the body-specificity hypothesis, we used functional magnetic resonance imaging to compare premotor activity correlated with action verb understanding in right- and left-handers. Right-handers preferentially activated the left premotor cortex during lexical decisions on manual-action verbs (compared with nonmanual-action verbs), whereas left-handers preferentially activated right premotor areas. This finding helps refine theories of embodied semantics, suggesting that implicit mental simulation during language processing is body specific: Right- and left-handers, who perform actions differently, use correspondingly different areas of the brain for representing action verb meanings.
  • Willems, R. M., Peelen, M. V., & Hagoort, P. (2010). Cerebral lateralization of face-selective and body-selective visual areas depends on handedness. Cerebral Cortex, 20, 1719-1725. doi:10.1093/cercor/bhp234.

    Abstract

    The left-hemisphere dominance for language is a core example of the functional specialization of the cerebral hemispheres. The degree of left-hemisphere dominance for language depends on hand preference: Whereas the majority of right-handers show left-hemispheric language lateralization, this number is reduced in left-handers. Here, we assessed whether handedness analogously has an influence upon lateralization in the visual system. Using functional magnetic resonance imaging, we localized 4 more or less specialized extrastriate areas in left- and right-handers, namely fusiform face area (FFA), extrastriate body area (EBA), fusiform body area (FBA), and human motion area (human middle temporal [hMT]). We found that lateralization of FFA and EBA depends on handedness: These areas were right lateralized in right-handers but not in left-handers. A similar tendency was observed in FBA but not in hMT. We conclude that the relationship between handedness and hemispheric lateralization extends to functionally lateralized parts of visual cortex, indicating a general coupling between cerebral lateralization and handedness. Our findings indicate that hemispheric specialization is not fixed but can vary considerably across individuals even in areas engaged relatively early in the visual system.
  • Willems, R. M., De Boer, M., De Ruiter, J. P., Noordzij, M. L., Hagoort, P., & Toni, I. (2010). A dissociation between linguistic and communicative abilities in the human brain. Psychological Science, 21, 8-14. doi:10.1177/0956797609355563.

    Abstract

    Although language is an effective vehicle for communication, it is unclear how linguistic and communicative abilities relate to each other. Some researchers have argued that communicative message generation involves perspective taking (mentalizing), and—crucially—that mentalizing depends on language. We employed a verbal communication paradigm to directly test whether the generation of a communicative action relies on mentalizing and whether the cerebral bases of communicative message generation are distinct from parts of cortex sensitive to linguistic variables. We found that dorsomedial prefrontal cortex, a brain area consistently associated with mentalizing, was sensitive to the communicative intent of utterances, irrespective of linguistic difficulty. In contrast, left inferior frontal cortex, an area known to be involved in language, was sensitive to the linguistic demands of utterances, but not to communicative intent. These findings show that communicative and linguistic abilities rely on cerebrally (and computationally) distinct mechanisms.
  • Willems, R. M., & Hagoort, P. (2010). Cortical motor contributions to language understanding. In L. Hermer (Ed.), Reciprocal interactions among early sensory and motor areas and higher cognitive networks (pp. 51-72). Kerala, India: Research Signpost Press.

    Abstract

    Here we review evidence from cognitive neuroscience for a tight relation between language and action in the brain. We focus on two types of relation between language and action. First, we investigate whether the perception of speech and speech sounds leads to activation of parts of the cortical motor system also involved in speech production. Second, we evaluate whether understanding action-related language involves the activation of parts of the motor system. We conclude that whereas there is considerable evidence that understanding language can involve parts of our motor cortex, this relation is best thought of as inherently flexible. As we explain, the exact nature of the input as well as the intention with which language is perceived influences whether and how motor cortex plays a role in language processing.
  • Willems, R. M., Toni, I., Hagoort, P., & Casasanto, D. (2010). Neural dissociations between action verb understanding and motor imagery. Journal of Cognitive Neuroscience, 22(10), 2387-2400. doi:10.1162/jocn.2009.21386.

    Abstract

    According to embodied theories of language, people understand a verb like throw, at least in part, by mentally simulating throwing. This implicit simulation is often assumed to be similar or identical to motor imagery. Here we used fMRI to test whether implicit simulations of actions during language understanding involve the same cortical motor regions as explicit motor imagery. Healthy participants were presented with verbs related to hand actions (e.g., to throw) and nonmanual actions (e.g., to kneel). They either read these verbs (lexical decision task) or actively imagined performing the actions named by the verbs (imagery task). Primary motor cortex showed effector-specific activation during imagery, but not during lexical decision. Parts of premotor cortex distinguished manual from nonmanual actions during both lexical decision and imagery, but there was no overlap or correlation between regions activated during the two tasks. These dissociations suggest that implicit simulation and explicit imagery cued by action verbs may involve different types of motor representations and that the construct of “mental simulation” should be distinguished from “mental imagery” in embodied theories of language.
  • Xiang, H.-D., Fonteijn, H. M., Norris, D. G., & Hagoort, P. (2010). Topographical functional connectivity pattern in the perisylvian language networks. Cerebral Cortex, 20, 549-560. doi:10.1093/cercor/bhp119.

    Abstract

    We performed a resting-state functional connectivity study to investigate directly the functional correlations within the perisylvian language networks by seeding from 3 subregions of Broca's complex (pars opercularis, pars triangularis, and pars orbitalis) and their right hemisphere homologues. A clear topographical functional connectivity pattern in the left middle frontal, parietal, and temporal areas was revealed for the 3 left seeds. This is the first demonstration that a functional connectivity topology can be observed in the perisylvian language networks. The results support the assumption of the functional division for phonology, syntax, and semantics of Broca's complex as proposed by the memory, unification, and control (MUC) model, and indicate a topographical functional organization in the perisylvian language networks, which suggests a possible division of labor for phonological, syntactic, and semantic function in the left frontal, parietal, and temporal areas.
  • Brown, C. M., & Hagoort, P. (1993). The processing nature of the N400: Evidence from masked priming. Journal of Cognitive Neuroscience, 5, 34-44. doi:10.1162/jocn.1993.5.1.34.

    Abstract

    The N400 is an endogenous event-related brain potential (ERP) that is sensitive to semantic processes during language comprehension. The general question we address in this paper is which aspects of the comprehension process are manifest in the N400. The focus is on the sensitivity of the N400 to the automatic process of lexical access, or to the controlled process of lexical integration. The former process is the reflex-like and effortless behavior of computing a form representation of the linguistic signal, and of mapping this representation onto corresponding entries in the mental lexicon. The latter process concerns the integration of a spoken or written word into a higher-order meaning representation of the context within which it occurs. ERPs and reaction times (RTs) were acquired to target words preceded by semantically related and unrelated prime words. The semantic relationship between a prime and its target has been shown to modulate the amplitude of the N400 to the target. This modulation can arise from lexical access processes, reflecting the automatic spread of activation between words related in meaning in the mental lexicon. Alternatively, the N400 effect can arise from lexical integration processes, reflecting the relative ease of meaning integration between the prime and the target. To assess the impact of automatic lexical access processes on the N400, we compared the effect of masked and unmasked presentations of a prime on the N400 to a following target. Masking prevents perceptual identification, and as such it is claimed to rule out effects from controlled processes. It therefore enables a stringent test of the possible impact of automatic lexical access processes on the N400. The RT study showed a significant semantic priming effect under both unmasked and masked presentations of the prime. The result for masked priming reflects the effect of automatic spreading of activation during the lexical access process. The ERP study showed a significant N400 effect for the unmasked presentation condition, but no such effect for the masked presentation condition. This indicates that the N400 is not a manifestation of lexical access processes, but reflects aspects of semantic integration processes.
  • Hagoort, P. (1993). [Review of the book Language: Structure, processing and disorders, by David Caplan]. Trends in Neurosciences, 16, 124. doi:10.1016/0166-2236(93)90138-C.
  • Hagoort, P. (1993). Impairments of lexical-semantic processing in aphasia: evidence from the processing of lexical ambiguities. Brain and Language, 45, 189-232. doi:10.1006/brln.1993.1043.

    Abstract

    Broca's and Wernicke's aphasics performed speeded lexical decisions on the third member of auditorily presented triplets consisting of two word primes followed by either a word or a nonword. In three of the four priming conditions, the second prime was a homonym with two unrelated meanings. The relation of the first prime and the target with the two meanings of the homonym was manipulated in the different priming conditions. The two readings of the ambiguous words either shared their grammatical form class (noun-noun ambiguities) or not (noun-verb ambiguities). The silent intervals between the members of the triplets were varied between 100, 500, and 1250 msec. Priming at the shortest interval is mainly attributed to automatic lexical processing, and priming at the longest interval is mainly due to forms of controlled lexical processing. For both Broca's and Wernicke's aphasics overall priming effects were obtained at ISIs of 100 and 500 msec, but not at an ISI of 1250 msec. This pattern of results is consistent with the view that both types of aphasics can automatically access the semantic lexicon, but might be impaired in integrating lexical-semantic information into the context. Broca's aphasics showed a specific impairment in selecting the contextually appropriate reading of noun-verb ambiguities, which is suggested to result from a failure either in the on-line morphological parsing of complex word forms into a stem and an inflection or in the on-line exploitation of the syntactic implications of the inflectional suffix. In a final experiment patients were asked to explicitly judge the semantic relations between a subset of the primes that were used in the lexical decision study. Wernicke's aphasics performed worse than both Broca's aphasics and normal controls, indicating a specific impairment for these patients in consciously operating on automatically accessed lexical-semantic information.
  • Hagoort, P., & Brown, C. M. (1993). Hersenpotentialen als maat voor het menselijk taalvermogen. Stem, Spraak- en Taalpathologie, 2, 213-235.
  • Hagoort, P., Brown, C. M., & Groothusen, J. (1993). The syntactic positive shift (SPS) as an ERP measure of syntactic processing. Language and Cognitive Processes, 8, 439-483. doi:10.1080/01690969308407585.

    Abstract

    This paper presents event-related brain potential (ERP) data from an experiment on syntactic processing. Subjects read individual sentences containing one of three different kinds of violations of the syntactic constraints of Dutch. The ERP results provide evidence for an electrophysiological response to syntactic processing that is qualitatively different from established ERP responses to semantic processing. We refer to this electrophysiological manifestation of parsing as the Syntactic Positive Shift (SPS). The SPS was observed in an experiment in which no task demands, other than to read the input, were imposed on the subjects. The pattern of responses to the different kinds of syntactic violations suggests that the SPS indicates the impossibility for the parser to assign the preferred structure to an incoming string of words, irrespective of the specific syntactic nature of this preferred structure. The implications of these findings for further research on parsing are discussed.
