The unification of language and action: An embrained perspective
In my presentation I will summarize the results of a number of ERP and fMRI studies on the processing and integration of co-occurring speech and gestures/pantomimes. The ERP results indicate that the time course of integrating language and action (i.e., gesture) is very similar to that of integrating linguistic meaning into a sentence or discourse representation. Moreover, in both cases the Left Inferior Frontal cortex plays a central role in orchestrating the multimodal unification of language and action. This orchestration is partly done by modulating temporal areas that store representations activated by the input. I will discuss the parameters of this modulation.
Publication type
Talk
Publication date
2009