Peter Hagoort

Publications

  • Coopmans, C. W., De Hoop, H., Tezcan, F., Hagoort, P., & Martin, A. E. (2025). Language-specific neural dynamics extend syntax into the time domain. PLOS Biology, 23: e3002968. doi:10.1371/journal.pbio.3002968.

    Abstract

    Studies of perception have long shown that the brain adds information to its sensory analysis of the physical environment. A touchstone example for humans is language use: to comprehend a physical signal like speech, the brain must add linguistic knowledge, including syntax. Yet, syntactic rules and representations are widely assumed to be atemporal (i.e., abstract and not bound by time), so they must be translated into time-varying signals for speech comprehension and production. Here, we test 3 different models of the temporal spell-out of syntactic structure against brain activity of people listening to Dutch stories: an integratory bottom-up parser, a predictive top-down parser, and a mildly predictive left-corner parser. These models build exactly the same structure but differ in when syntactic information is added by the brain—this difference is captured in the (temporal distribution of the) complexity metric “incremental node count.” Using temporal response function models with both acoustic and information-theoretic control predictors, node counts were regressed against source-reconstructed delta-band activity acquired with magnetoencephalography. Neural dynamics in left frontal and temporal regions most strongly reflect node counts derived by the top-down method, which postulates syntax early in time, suggesting that predictive structure building is an important component of Dutch sentence comprehension. The absence of strong effects of the left-corner model further suggests that its mildly predictive strategy does not represent Dutch language comprehension well, in contrast to what has been found for English. Understanding when the brain projects its knowledge of syntax onto speech, and whether this is done in language-specific ways, will inform and constrain the development of mechanistic models of syntactic structure building in the brain.
  • Ferrari, A., & Hagoort, P. (2025). Beat gestures and prosodic prominence interactively influence language comprehension. Cognition, 256: 106049. doi:10.1016/j.cognition.2024.106049.

    Abstract

    Face-to-face communication is not only about ‘what’ is said but also ‘how’ it is said, both in speech and bodily signals. Beat gestures are rhythmic hand movements that typically accompany prosodic prominence in conversation. Yet, it is still unclear how beat gestures influence language comprehension. On the one hand, beat gestures may share the same functional role of focus markers as prosodic prominence. Accordingly, they would drive attention towards the concurrent speech and highlight its content. On the other hand, beat gestures may trigger inferences of high speaker confidence, generate the expectation that the sentence content is correct and thereby elicit the commitment to the truth of the statement. This study directly disentangled the two hypotheses by evaluating additive and interactive effects of prosodic prominence and beat gestures on language comprehension. Participants watched videos of a speaker uttering sentences and judged whether each sentence was true or false. Sentences sometimes contained a world knowledge violation that may go unnoticed (‘semantic illusion’). Combining beat gestures with prosodic prominence led to a higher degree of semantic illusion, making more world knowledge violations go unnoticed during language comprehension. These results challenge current theories proposing that beat gestures are visual focus markers. To the contrary, they suggest that beat gestures automatically trigger inferences of high speaker confidence and thereby elicit the commitment to the truth of the statement, in line with Grice’s cooperative principle in conversation. More broadly, our findings also highlight the influence of metacognition on language comprehension in face-to-face communication.
  • Mishra, C., Skantze, G., Hagoort, P., & Verdonschot, R. G. (2025). Perception of emotions in human and robot faces: Is the eye region enough? In O. Palinko, L. Bodenhagen, J.-J. Cabibihan, K. Fischer, S. Šabanović, K. Winkle, L. Behera, S. S. Ge, D. Chrysostomou, W. Jiang, & H. He (Eds.), Social Robotics: 16th International Conference, ICSR + AI 2024, Odense, Denmark, October 23–26, 2024, Proceedings (pp. 290-303). Singapore: Springer.

    Abstract

    The increased interest in developing next-gen social robots has raised questions about the factors affecting the perception of robot emotions. This study investigates the impact of robot appearances (human-like, mechanical) and face regions (full-face, eye-region) on human perception of robot emotions. A between-subjects user study (N = 305) was conducted where participants were asked to identify the emotions being displayed in videos of robot faces, as well as a human baseline. Our findings reveal three important insights for effective social robot face design in Human-Robot Interaction (HRI): Firstly, robots equipped with a back-projected, fully animated face – regardless of whether they are more human-like or more mechanical-looking – demonstrate a capacity for emotional expression comparable to that of humans. Secondly, the recognition accuracy of emotional expressions in both humans and robots declines when only the eye region is visible. Lastly, within the constraint of only the eye region being visible, robots with more human-like features significantly enhance emotion recognition.
  • Slivac, K., Hagoort, P., & Flecken, M. (2025). Cognitive and neural mechanisms of linguistic influence on perception. Psychological Review. Advance online publication. doi:10.1037/rev0000546.

    Abstract

    To date, research has reliably shown that language can engage and modify perceptual processes in a top-down manner. However, our understanding of the cognitive and neural mechanisms underlying such top-down influences is still under debate. In this review, we provide an overview of findings from literature investigating the organization of semantic networks in the brain (spontaneous engagement of the visual system while processing linguistic information), and linguistic cueing studies (looking at the immediate effects of language on the perception of a visual target), in an effort to isolate such mechanisms. Additionally, we connect the findings from linguistic cueing studies to those reported in (nonlinguistic) literature on priors in perception, in order to find commonalities in neural processes allowing for top-down influences on perception. In doing so, we discuss the effects of language on perception in the context of broader, general cognitive and neural principles. Finally, we propose a way forward in the study of linguistic influences on perception.
  • Zora, H., Kabak, B., & Hagoort, P. (2025). Relevance of prosodic focus and lexical stress for discourse comprehension in Turkish: Evidence from psychometric and electrophysiological data. Journal of Cognitive Neuroscience, 37(3), 693-736. doi:10.1162/jocn_a_02262.

    Abstract

    Prosody underpins various linguistic domains ranging from semantics and syntax to discourse. For instance, prosodic information in the form of lexical stress modifies meanings and, as such, syntactic contexts of words as in Turkish kaz-má "pickaxe" (noun) versus káz-ma "do not dig" (imperative). Likewise, prosody indicates the focused constituent of an utterance as the noun phrase filling the wh-spot in a dialogue like What did you eat? I ate ____. In the present study, we investigated the relevance of such prosodic variations for discourse comprehension in Turkish. We aimed at answering how lexical stress and prosodic focus mismatches on critical noun phrases (resulting in grammatical anomalies involving both semantics and syntax, and discourse-level anomalies, respectively) affect the perceived correctness of an answer to a question in a given context. To that end, 80 native speakers of Turkish, 40 participating in a psychometric experiment and 40 participating in an EEG experiment, were asked to judge the acceptability of prosodic mismatches that occur either separately or concurrently. Psychometric results indicated that lexical stress mismatch led to a lower correctness score than prosodic focus mismatch, and combined mismatch received the lowest score. Consistent with the psychometric data, EEG results revealed an N400 effect to combined mismatch, and this effect was followed by a P600 response to lexical stress mismatch. Conjointly, these results suggest that every source of prosodic information is immediately available and codetermines the interpretation of an utterance; however, semantically and syntactically relevant lexical stress information is assigned more significance by the language comprehension system compared with prosodic focus information.
  • Bastiaansen, M. C. M., & Hagoort, P. (2006). Oscillatory neuronal dynamics during language comprehension. In C. Neuper, & W. Klimesch (Eds.), Event-related dynamics of brain oscillations (pp. 179-196). Amsterdam: Elsevier.

    Abstract

    Language comprehension involves two basic operations: the retrieval of lexical information (such as phonologic, syntactic, and semantic information) from long-term memory, and the unification of this information into a coherent representation of the overall utterance. Neuroimaging studies using hemodynamic measures such as PET and fMRI have provided detailed information on which areas of the brain are involved in these language-related memory and unification operations. However, much less is known about the dynamics of the brain's language network. This chapter presents a literature review of the oscillatory neuronal dynamics of EEG and MEG data that can be observed during language comprehension tasks. From a detailed review of this (rapidly growing) literature the following picture emerges: memory retrieval operations are mostly accompanied by increased neuronal synchronization in the theta frequency range (4-7 Hz). Unification operations, in contrast, induce high-frequency neuronal synchronization in the beta (12-30 Hz) and gamma (above 30 Hz) frequency bands. A desynchronization in the (upper) alpha frequency band is found for those studies that use secondary tasks, and seems to correspond with attentional processes, and with the behavioral consequences of the language comprehension process. We conclude that it is possible to capture the dynamics of the brain's language network by a careful analysis of the event-related changes in power and coherence of EEG and MEG data in a wide range of frequencies, in combination with subtle experimental manipulations in a range of language comprehension tasks. It appears then that neuronal synchrony is a mechanism by which the brain integrates the different types of information about language (such as phonological, orthographic, semantic, and syntactic information) represented in different brain areas.
  • Forkstam, C., Hagoort, P., Fernandez, G., Ingvar, M., & Petersson, K. M. (2006). Neural correlates of artificial syntactic structure classification. NeuroImage, 32(2), 956-967. doi:10.1016/j.neuroimage.2006.03.057.

    Abstract

    The human brain supports acquisition mechanisms that extract structural regularities implicitly from experience without the induction of an explicit model. It has been argued that the capacity to generalize to new input is based on the acquisition of abstract representations, which reflect underlying structural regularities in the input ensemble. In this study, we explored the outcome of this acquisition mechanism, and to this end, we investigated the neural correlates of artificial syntactic classification using event-related functional magnetic resonance imaging. The participants engaged once a day during an 8-day period in a short-term memory acquisition task in which consonant-strings generated from an artificial grammar were presented in a sequential fashion without performance feedback. They performed reliably above chance on the grammaticality classification tasks on days 1 and 8 which correlated with a corticostriatal processing network, including frontal, cingulate, inferior parietal, and middle occipital/occipitotemporal regions as well as the caudate nucleus. Part of the left inferior frontal region (BA 45) was specifically related to syntactic violations and showed no sensitivity to local substring familiarity. In addition, the head of the caudate nucleus correlated positively with syntactic correctness on day 8 but not day 1, suggesting that this region contributes to an increase in cognitive processing fluency.
  • Hagoort, P. (2006). On Broca, brain and binding. In Y. Grodzinsky, & K. Amunts (Eds.), Broca's region (pp. 240-251). Oxford: Oxford University Press.
  • Hagoort, P. (2006). What we cannot learn from neuroanatomy about language learning and language processing [Commentary on Uylings]. Language Learning, 56(suppl. 1), 91-97. doi:10.1111/j.1467-9922.2006.00356.x.
  • Hagoort, P. (2006). Het zwarte gat tussen brein en bewustzijn. In J. Janssen, & J. Van Vugt (Eds.), Brein en bewustzijn: Gedachtensprongen tussen hersenen en mensbeeld (pp. 9-24). Nijmegen: Damon.
  • Hagoort, P. (2006). Event-related potentials from the user's perspective [Review of the book An introduction to the event-related potential technique by Steven J. Luck]. Nature Neuroscience, 9(4), 463. doi:10.1038/nn0406-463.
  • Hald, L. A., Bastiaansen, M. C. M., & Hagoort, P. (2006). EEG theta and gamma responses to semantic violations in online sentence processing. Brain and Language, 96(1), 90-105. doi:10.1016/j.bandl.2005.06.007.

    Abstract

    We explore the nature of the oscillatory dynamics in the EEG of subjects reading sentences that contain a semantic violation. More specifically, we examine whether increases in theta (≈3–7 Hz) and gamma (around 40 Hz) band power occur in response to sentences that were either semantically correct or contained a semantically incongruent word (semantic violation). ERP results indicated a classical N400 effect. A wavelet-based time-frequency analysis revealed a theta band power increase during an interval of 300–800 ms after critical word onset, at temporal electrodes bilaterally for both sentence conditions, and over midfrontal areas for the semantic violations only. In the gamma frequency band, a predominantly frontal power increase was observed during the processing of correct sentences. This effect was absent following semantic violations. These results provide a characterization of the oscillatory brain dynamics, and notably of both theta and gamma oscillations, that occur during language comprehension.
  • Hoeks, J. C. J., Hendriks, P., Vonk, W., Brown, C. M., & Hagoort, P. (2006). Processing the noun phrase versus sentence coordination ambiguity: Thematic information does not completely eliminate processing difficulty. Quarterly Journal of Experimental Psychology, 59, 1581-1599. doi:10.1080/17470210500268982.

    Abstract

    When faced with the noun phrase (NP) versus sentence (S) coordination ambiguity as in, for example, The thief shot the jeweller and the cop …, readers prefer the reading with NP-coordination (e.g., "The thief shot the jeweller and the cop yesterday") over one with two conjoined sentences (e.g., "The thief shot the jeweller and the cop panicked"). A corpus study is presented showing that NP-coordinations are produced far more often than S-coordinations, which in frequency-based accounts of parsing might be taken to explain the NP-coordination preference. In addition, we describe an eye-tracking experiment investigating S-coordinated sentences such as Jasper sanded the board and the carpenter laughed, where the poor thematic fit between carpenter and sanded argues against NP-coordination. Our results indicate that information regarding poor thematic fit was used rapidly, but not without leaving some residual processing difficulty. This is compatible with claims that thematic information can reduce but not completely eliminate garden-path effects.
  • Krott, A., Baayen, R. H., & Hagoort, P. (2006). The nature of anterior negativities caused by misapplications of morphological rules. Journal of Cognitive Neuroscience, 18(10), 1616-1630. doi:10.1162/jocn.2006.18.10.1616.

    Abstract

    This study investigates functional interpretations of left anterior negativities (LANs), a language-related electroencephalogram effect that has been found for syntactic and morphological violations. We focus on three possible interpretations of LANs caused by the replacement of irregular affixes with regular affixes: misapplication of morphological rules, mismatch of the presented form with analogy-based expectations, and mismatch of the presented form with stored representations. Event-related brain potentials were recorded during the visual presentation of existing and novel Dutch compounds. Existing compounds contained correct or replaced interfixes (dame + s + salons > damessalons vs. *dame + n + salons > *damensalons "women's hairdresser salons"), whereas novel Dutch compounds contained interfixes that were either supported or not supported by analogy to similar existing compounds (kruidenkelken vs. ?kruidskelken "herb chalices"); earlier studies had shown that interfixes are selected by analogy instead of rules. All compounds were presented with correct or incorrect regular plural suffixes (damessalons vs. *damessalonnen). Replacing suffixes or interfixes in existing compounds both led to increased (L)ANs between 400 and 700 msec without any evidence for different scalp distributions for interfixes and suffixes. There was no evidence for a negativity when manipulating the analogical support for interfixes in novel compounds. Together with earlier studies, these results suggest that LANs had been caused by the mismatch of the presented forms with stored forms. We discuss these findings with respect to the single/dual-route debate of morphology and LANs found for the misapplication of syntactic rules.
  • Müller, O., & Hagoort, P. (2006). Access to lexical information in language comprehension: Semantics before syntax. Journal of Cognitive Neuroscience, 18(1), 84-96. doi:10.1162/089892906775249997.

    Abstract

    The recognition of a word makes available its semantic and syntactic properties. Using electrophysiological recordings, we investigated whether one set of these properties is available earlier than the other set. Dutch participants saw nouns on a computer screen and performed push-button responses: In one task, grammatical gender determined response hand (left/right) and semantic category determined response execution (go/no-go). In the other task, response hand depended on semantic category, whereas response execution depended on gender. During the latter task, response preparation occurred on no-go trials, as measured by the lateralized readiness potential: Semantic information was used for response preparation before gender information inhibited this process. Furthermore, an inhibition-related N2 effect occurred earlier for inhibition by semantics than for inhibition by gender. In summary, electrophysiological measures of both response preparation and inhibition indicated that the semantic word property was available earlier than the syntactic word property when participants read single words.
  • Van den Brink, D., Brown, C. M., & Hagoort, P. (2006). The cascaded nature of lexical selection and integration in auditory sentence processing. Journal of Experimental Psychology: Learning, Memory, and Cognition, 32(3), 364-372. doi:10.1037/0278-7393.32.3.364.

    Abstract

    An event-related brain potential experiment was carried out to investigate the temporal relationship between lexical selection and semantic integration in auditory sentence processing. Participants were presented with spoken sentences that ended with a word that was either semantically congruent or anomalous. Information about the moment at which a sentence-final word could uniquely be identified, its isolation point (IP), was compared with the onset of the elicited N400 congruity effect, reflecting semantic integration processing. The results revealed that the onset of the N400 effect occurred prior to the IP of the sentence-final words. Moreover, the factor early or late IP did not affect the onset of the N400. These findings indicate that lexical selection and semantic integration are cascading processes, in that semantic integration processing can start before the acoustic information allows the selection of a unique candidate, and seems to be attempted in parallel for multiple candidates that are still compatible with the bottom-up acoustic input.
  • Hagoort, P., & Indefrey, P. (1997). De neurale architectuur van het menselijk taalvermogen. In H. Peters (Ed.), Handboek stem-, spraak-, en taalpathologie (pp. 1-36). Houten: Bohn Stafleu Van Loghum.
  • Hagoort, P. (1997). De rappe prater als gewoontedier [Review of the book Smooth talkers: The linguistic performance of auctioneers and sportscasters, by Koenraad Kuiper]. Psychologie, 16, 22-23.
  • Hagoort, P., & Van Turennout, M. (1997). The electrophysiology of speaking: Possibilities of event-related potential research for speech production. In W. Hulstijn, H. Peters, & P. Van Lieshout (Eds.), Speech motor production and fluency disorders: Brain research in speech production (pp. 351-361). Amsterdam: Elsevier.
  • Hagoort, P. (1997). Semantic priming in Broca's aphasics at a short SOA: No support for an automatic access deficit. Brain and Language, 56, 287-300. doi:10.1006/brln.1997.1849.

    Abstract

    This study tests the recent claim that Broca’s aphasics are impaired in automatic lexical access, including the retrieval of word meaning. Subjects are required to perform a lexical decision on visually presented prime target pairs. Half of the word targets are preceded by a related word, half by an unrelated word. Primes and targets are presented with a long stimulus-onset-asynchrony (SOA) of 1400 msec and with a short SOA of 300 msec. Normal priming effects are observed in Broca’s aphasics for both SOAs. This result is discussed in the context of the claim that Broca’s aphasics suffer from an impairment in the automatic access of lexical–semantic information. It is argued that none of the current priming studies provides evidence supporting this claim, since with short SOAs priming effects have been reliably obtained in Broca’s aphasics. The results are more compatible with the claim that in many Broca’s aphasics the functional locus of their comprehension deficit is at the level of postlexical integration processes.
  • Hagoort, P., & Wassenaar, M. (1997). Taalstoornissen: Van theorie tot therapie. In B. Deelman, P. Eling, E. De Haan, A. Jennekens, & A. Van Zomeren (Eds.), Klinische Neuropsychologie (pp. 232-248). Meppel: Boom.
  • Hagoort, P. (1997). Zonder fosfor geen gedachten: Gagarin, geest en brein. In Brain & Mind (pp. 6-14). Utrecht: Reünistenvereniging Veritas.
  • Hagoort, P. (1997). Valt er nog te lachen zonder de rechter hersenhelft? Psychologie, 16, 52-55.
  • Indefrey, P., Kleinschmidt, A., Merboldt, K.-D., Krüger, G., Brown, C. M., Hagoort, P., & Frahm, J. (1997). Equivalent responses to lexical and nonlexical visual stimuli in occipital cortex: A functional magnetic resonance imaging study. NeuroImage, 5, 78-81. doi:10.1006/nimg.1996.0232.

    Abstract

    Stimulus-related changes in cerebral blood oxygenation were measured using high-resolution functional magnetic resonance imaging sequentially covering visual occipital areas in contiguous sections. During dynamic imaging, healthy subjects silently viewed pseudowords, single false fonts, or length-matched strings of the same false fonts. The paradigm consisted of a sixfold alternation of an activation and a control task. With pseudowords as activation vs single false fonts as control, responses were seen mainly in medial occipital cortex. These responses disappeared when pseudowords were alternated with false font strings as the control and reappeared when false font strings instead of pseudowords served as activation and were alternated with single false fonts. The string-length contrast alone, therefore, is sufficient to account for the activation pattern observed in medial visual cortex when word-like stimuli are contrasted with single characters.
  • Swaab, T. Y., Brown, C. M., & Hagoort, P. (1997). Spoken sentence comprehension in aphasia: Event-related potential evidence for a lexical integration deficit. Journal of Cognitive Neuroscience, 9(1), 39-66.

    Abstract

    In this study the N400 component of the event-related potential was used to investigate spoken sentence understanding in Broca's and Wernicke's aphasics. The aim of the study was to determine whether spoken sentence comprehension problems in these patients might result from a deficit in the on-line integration of lexical information. Subjects listened to sentences spoken at a normal rate. In half of these sentences, the meaning of the final word of the sentence matched the semantic specifications of the preceding sentence context. In the other half of the sentences, the sentence-final word was anomalous with respect to the preceding sentence context. The N400 was measured to the sentence-final words in both conditions. The results for the aphasic patients (n = 14) were analyzed according to the severity of their comprehension deficit and compared to a group of 12 neurologically unimpaired age-matched controls, as well as a group of 6 nonaphasic patients with a lesion in the right hemisphere. The nonaphasic brain damaged patients and the aphasic patients with a light comprehension deficit (high comprehenders, n = 7) showed an N400 effect that was comparable to that of the neurologically unimpaired subjects. In the aphasic patients with a moderate to severe comprehension deficit (low comprehenders, n = 7), a reduction and delay of the N400 effect was obtained. In addition, the P300 component was measured in a classical oddball paradigm, in which subjects were asked to count infrequent low tones in a random series of high and low tones. No correlation was found between the occurrence of N400 and P300 effects, indicating that changes in the N400 results were related to the patients' language deficit. Overall, the pattern of results was compatible with the idea that aphasic patients with moderate to severe comprehension problems are impaired in the integration of lexical information into a higher order representation of the preceding sentence context.
  • Van Turennout, M., Hagoort, P., & Brown, C. M. (1997). Electrophysiological evidence on the time course of semantic and phonological processes in speech production. Journal of Experimental Psychology: Learning, Memory, and Cognition, 23(4), 787-806.

    Abstract

    The temporal properties of semantic and phonological processes in speech production were investigated in a new experimental paradigm using movement-related brain potentials. The main experimental task was picture naming. In addition, a 2-choice reaction go/no-go procedure was included, involving a semantic and a phonological categorization of the picture name. Lateralized readiness potentials (LRPs) were derived to test whether semantic and phonological information activated motor processes at separate moments in time. An LRP was only observed on no-go trials when the semantic (not the phonological) decision determined the response hand. Varying the position of the critical phoneme in the picture name did not affect the onset of the LRP but rather influenced when the LRP began to differ on go and no-go trials and allowed the duration of phonological encoding of a word to be estimated. These results provide electrophysiological evidence for early semantic activation and later phonological encoding.
  • Wassenaar, M., Hagoort, P., & Brown, C. M. (1997). Syntactic ERP effects in Broca's aphasics with agrammatic comprehension. Brain and Language, 60, 61-64. doi:10.1006/brln.1997.1911.
