Publications
Peeters, D., Chu, M., Holler, J., Hagoort, P., & Ozyurek, A. (2015). Electrophysiological and kinematic correlates of communicative intent in the planning and production of pointing gestures and speech. Journal of Cognitive Neuroscience, 27(12), 2352-2368. doi:10.1162/jocn_a_00865.
Abstract
In everyday human communication, we often express our communicative intentions by manually pointing out referents in the material world around us to an addressee, often in tight synchronization with referential speech. This study investigated whether and how the kinematic form of index finger pointing gestures is shaped by the gesturer's communicative intentions and how this is modulated by the presence of concurrently produced speech. Furthermore, we explored the neural mechanisms underpinning the planning of communicative pointing gestures and speech. Two experiments were carried out in which participants pointed at referents for an addressee while the informativeness of their gestures and speech was varied. Kinematic and electrophysiological data were recorded online. It was found that participants prolonged the duration of the stroke and poststroke hold phase of their gesture to be more communicative, in particular when the gesture was carrying the main informational burden in their multimodal utterance. Frontal and P300 effects in the ERPs suggested the importance of intentional and modality-independent attentional mechanisms during the planning phase of informative pointing gestures. These findings contribute to a better understanding of the complex interplay between action, attention, intention, and language in the production of pointing gestures, a communicative act core to human interaction. -
Peeters, D., Hagoort, P., & Ozyurek, A. (2015). Electrophysiological evidence for the role of shared space in online comprehension of spatial demonstratives. Cognition, 136, 64-84. doi:10.1016/j.cognition.2014.10.010.
Abstract
A fundamental property of language is that it can be used to refer to entities in the extra-linguistic physical context of a conversation in order to establish a joint focus of attention on a referent. Typological and psycholinguistic work across a wide range of languages has put forward at least two different theoretical views on demonstrative reference. Here we contrasted and tested these two accounts by investigating the electrophysiological brain activity underlying the construction of indexical meaning in comprehension. In two EEG experiments, participants watched pictures of a speaker who referred to one of two objects using speech and an index-finger pointing gesture. In contrast with separately collected native speakers’ linguistic intuitions, N400 effects showed a preference for a proximal demonstrative when speaker and addressee were in a face-to-face orientation and all possible referents were located in the shared space between them, irrespective of the physical proximity of the referent to the speaker. These findings reject egocentric proximity-based accounts of demonstrative reference, support a sociocentric approach to deixis, suggest that interlocutors construe a shared space during conversation, and imply that the psychological proximity of a referent may be more important than its physical proximity. -
Peeters, D., Snijders, T. M., Hagoort, P., & Ozyurek, A. (2015). The role of left inferior frontal gyrus in the integration of pointing gestures and speech. In G. Ferré & M. Tutton (Eds.), Proceedings of the 4th GESPIN - Gesture & Speech in Interaction Conference. Nantes: Université de Nantes.
Abstract
Comprehension of pointing gestures is fundamental to human communication. However, the neural mechanisms that subserve the integration of pointing gestures and speech in visual contexts in comprehension are unclear. Here we present the results of an fMRI study in which participants watched images of an actor pointing at an object while they listened to her referential speech. The use of a mismatch paradigm revealed that the semantic unification of pointing gesture and speech in a triadic context recruits left inferior frontal gyrus. Complementing previous findings, this suggests that left inferior frontal gyrus semantically integrates information across modalities and semiotic domains. -
Samur, D., Lai, V. T., Hagoort, P., & Willems, R. M. (2015). Emotional context modulates embodied metaphor comprehension. Neuropsychologia, 78, 108-114. doi:10.1016/j.neuropsychologia.2015.10.003.
Abstract
Emotions are often expressed metaphorically, and both emotion and metaphor are ways through which abstract meaning can be grounded in language. Here we investigate specifically whether motion-related verbs when used metaphorically are differentially sensitive to a preceding emotional context, as compared to when they are used in a literal manner. Participants read stories that ended with ambiguous action/motion sentences (e.g., he got it), in which the action/motion could be interpreted metaphorically (he understood the idea) or literally (he caught the ball) depending on the preceding story. Orthogonal to the metaphorical manipulation, the stories were high or low in emotional content. The results showed that emotional context modulated the neural response in visual motion areas to the metaphorical interpretation of the sentences, but not to their literal interpretations. In addition, literal interpretations of the target sentences led to stronger activation in the visual motion areas as compared to metaphorical readings of the sentences. We interpret our results as suggesting that emotional context specifically modulates mental simulation during metaphor processing. -
Simanova, I., Van Gerven, M. A., Oostenveld, R., & Hagoort, P. (2015). Predicting the semantic category of internally generated words from neuromagnetic recordings. Journal of Cognitive Neuroscience, 27(1), 35-45. doi:10.1162/jocn_a_00690.
Abstract
In this study, we explore the possibility of predicting the semantic category of words from brain signals in a free word generation task. Participants produced single words from different semantic categories in a modified semantic fluency task. A Bayesian logistic regression classifier was trained to predict the semantic category of words from single-trial MEG data. Significant classification accuracies were achieved using sensor-level MEG time series at the time interval of conceptual preparation. Semantic category prediction was also possible using source-reconstructed time series, based on minimum norm estimates of cortical activity. Brain regions that contributed most to classification on the source level were identified. These were the left inferior frontal gyrus, left middle frontal gyrus, and left posterior middle temporal gyrus. Additionally, the temporal dynamics of brain activity underlying semantic preparation during word generation were explored. These results provide important insights into central aspects of language production. -
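As an illustration of the general decoding approach described in this abstract (not the authors' actual pipeline), the sketch below cross-validates a logistic regression classifier on flattened single-trial sensor-by-time features. The data dimensions, preprocessing, and the use of scikit-learn's L2-regularized LogisticRegression in place of the paper's Bayesian formulation are all assumptions made for illustration only.

```python
# Minimal sketch: decoding semantic category from single-trial MEG sensor data.
# The paper uses a Bayesian logistic regression classifier; an L2-regularized
# scikit-learn LogisticRegression is used here as a stand-in (equivalent to MAP
# estimation under a Gaussian weight prior). Shapes and labels are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_trials, n_sensors, n_times = 200, 275, 50            # assumed dimensions
X = rng.standard_normal((n_trials, n_sensors, n_times))  # single-trial MEG data
y = rng.integers(0, 2, size=n_trials)                     # semantic category labels

# Flatten sensors x time into one feature vector per trial and cross-validate.
clf = make_pipeline(StandardScaler(), LogisticRegression(C=1.0, max_iter=1000))
scores = cross_val_score(clf, X.reshape(n_trials, -1), y, cv=5)
print(f"mean decoding accuracy: {scores.mean():.2f}")
```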
Todorovic, A., Schoffelen, J.-M., van Ede, F., Maris, E., & de Lange, F. P. (2015). Temporal expectation and attention jointly modulate auditory oscillatory activity in the beta band. PLoS One, 10(3): e0120288. doi:10.1371/journal.pone.0120288.
Abstract
The neural response to a stimulus is influenced by endogenous factors such as expectation and attention. Current research suggests that expectation and attention exert their effects in opposite directions, where expectation decreases neural activity in sensory areas, while attention increases it. However, expectation and attention are usually studied either in isolation or confounded with each other. A recent study suggests that expectation and attention may act jointly on sensory processing, by increasing the neural response to expected events when they are attended, but decreasing it when they are unattended. Here we test this hypothesis in an auditory temporal cueing paradigm using magnetoencephalography in humans. In our study participants attended to, or away from, tones that could arrive at expected or unexpected moments. We found a decrease in auditory beta band synchrony to expected (versus unexpected) tones if they were unattended, but no difference if they were attended. Modulations in beta power were already evident prior to the expected onset times of the tones. These findings suggest that expectation and attention jointly modulate sensory processing. -
Udden, J., & Schoffelen, J.-M. (2015). Mother of all Unification Studies (MOUS). In A. E. Konopka (Ed.), Research Report 2013 | 2014 (pp. 21-22). Nijmegen: Max Planck Institute for Psycholinguistics. doi:10.17617/2.2236748. -
Van den Bos, E., & Poletiek, F. H. (2015). Learning simple and complex artificial grammars in the presence of a semantic reference field: Effects on performance and awareness. Frontiers in Psychology, 6: 158. doi:10.3389/fpsyg.2015.00158.
Abstract
This study investigated whether the negative effect of complexity on artificial grammar learning could be compensated by adding semantics. Participants were exposed to exemplars from a simple or a complex finite state grammar presented with or without a semantic reference field. As expected, performance on a grammaticality judgment test was higher for the simple grammar than for the complex grammar. For the simple grammar, the results also showed that participants presented with a reference field and instructed to decode the meaning of each exemplar (decoding condition) did better than participants who memorized the exemplars without semantic referents (memorize condition). Contrary to expectations, however, there was no significant difference between the decoding condition and the memorize condition for the complex grammar. These findings indicated that the negative effect of complexity remained, despite the addition of semantics. To clarify how the presence of a reference field influenced the learning process, its effects on the acquisition of two types of knowledge (first- and second-order dependencies) and on participants’ awareness of their knowledge were examined. The results tentatively suggested that the reference field enhanced the learning of second-order dependencies. In addition, participants in the decoding condition realized when they had knowledge relevant to making a grammaticality judgment, whereas participants in the memorize condition demonstrated some knowledge of which they were unaware. These results are in line with the view that the reference field enhanced structure learning by making certain dependencies more salient. Moreover, our findings stress the influence of complexity on artificial grammar learning. -
Veenstra, A., Meyer, A. S., & Acheson, D. J. (2015). Effects of parallel planning on agreement production. Acta Psychologica, 162, 29-39. doi:10.1016/j.actpsy.2015.09.011.
Abstract
An important issue in current psycholinguistics is how the time course of utterance planning affects the generation of grammatical structures. The current study investigated the influence of parallel activation of the components of complex noun phrases on the generation of subject-verb agreement. Specifically, the lexical interference account (Gillespie, M. and Pearlmutter, N. J., 2011b and Solomon, E. S. and Pearlmutter, N. J., 2004) predicts more agreement errors (i.e., attraction) for subject phrases in which the head and local noun mismatch in number (e.g., the apple next to the pears) when nouns are planned in parallel than when they are planned in sequence. We used a speeded picture description task that yielded sentences such as the apple next to the pears is red. The objects mentioned in the noun phrase were either semantically related or unrelated. To induce agreement errors, pictures sometimes mismatched in number. In order to manipulate the likelihood of parallel processing of the objects and to test the hypothesized relationship between parallel processing and the rate of agreement errors, the pictures were either placed close together or far apart. Analyses of the participants' eye movements and speech onset latencies indicated slower processing of the first object and stronger interference from the related (compared to the unrelated) second object in the close than in the far condition. Analyses of the agreement errors yielded an attraction effect, with more errors in mismatching than in matching conditions. However, the magnitude of the attraction effect did not differ across the close and far conditions. Thus, spatial proximity encouraged parallel processing of the pictures, which led to interference of the associated conceptual and/or lexical representation, but, contrary to the prediction, it did not lead to more attraction errors. -
Wang, L., Bastiaansen, M. C. M., & Yang, Y. (2015). ERP responses to person names as a measure of trait inference in person perception. Social Neuroscience, 10, 89-99. doi:10.1080/17470919.2014.944995.
Abstract
Using event-related potentials (ERPs), this study examines how trait information inferred from behaviors is associated with person names. In linguistic discourses, person names were associated with descriptions of either positive or negative behaviors. In a subsequent explicit evaluation task, the previously described person names were presented in isolation, and the participants were asked to judge the emotional valence of these names. We found that the names associated with positive descriptions elicited a larger positivity in the ERP than the names associated with negative descriptions. The results indicate that the emotional valence of person names attached to person perception can be dynamically influenced by short descriptions of the target person, probably due to trait inference based on the provided behavioral descriptions. -
Willems, R. M. (Ed.). (2015). Cognitive neuroscience of natural language use. Cambridge: Cambridge University Press. -
Willems, R. M. (2015). Cognitive neuroscience of natural language use: Introduction. In Cognitive neuroscience of natural language use (pp. 1-7). Cambridge: Cambridge University Press.
-
Xiang, H., Van Leeuwen, T. M., Dediu, D., Roberts, L., Norris, D. G., & Hagoort, P. (2015). L2-proficiency-dependent laterality shift in structural connectivity of brain language pathways. Brain Connectivity, 5(6), 349-361. doi:10.1089/brain.2013.0199.
Abstract
Diffusion tensor imaging (DTI) and a longitudinal language learning approach were applied to investigate the relationship between the achieved second language (L2) proficiency during L2 learning and the reorganization of structural connectivity between core language areas. Language proficiency tests and DTI scans were obtained from German students before and after they completed an intensive 6-week course of the Dutch language. In the initial learning stage, with increasing L2 proficiency, the hemispheric dominance of the BA6-temporal pathway (mainly along the arcuate fasciculus) shifted from the left to the right hemisphere. With further increased proficiency, however, lateralization dominance was again found in the left BA6-temporal pathway. This result is consistent with reports in the literature that imply a stronger involvement of the right hemisphere in L2 processing, especially for less proficient L2 speakers. This is the first time that an L2-proficiency-dependent laterality shift in structural connectivity of language pathways during L2 acquisition has been observed, moving from left to right and back to left hemisphere dominance with increasing L2 proficiency. We additionally find that changes in fractional anisotropy values after the course are related to the time elapsed between the two scans. The results suggest that structural connectivity in (at least part of) the perisylvian language network may be subject to fast dynamic changes following language learning.
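The abstract does not state how hemispheric dominance of the pathway was quantified; a conventional way to express such a laterality shift, assumed here purely for illustration, is a laterality index (LI) computed over a left/right tract metric such as fractional anisotropy (FA):

```latex
% Illustrative convention, not necessarily the measure used in the paper:
% laterality index over a left/right tract metric such as FA.
\[
  \mathrm{LI} \;=\; \frac{\mathrm{FA}_{\mathrm{left}} - \mathrm{FA}_{\mathrm{right}}}
                         {\mathrm{FA}_{\mathrm{left}} + \mathrm{FA}_{\mathrm{right}}},
  \qquad \mathrm{LI} \in [-1, 1],
\]
% where LI > 0 indicates left-hemisphere and LI < 0 right-hemisphere dominance
% of the pathway; the reported shift would then correspond to LI moving from
% positive to negative and back to positive values with increasing proficiency.
```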