Multimodal Language Department
Languages can be expressed and perceived not only through speech or written text but also through visible bodily expressions (hands, body, and face). All spoken languages use gestures along with speech, and in deaf communities all aspects of language can be expressed through the visible body in sign language. However, the unique contribution of such visible expressions to our understanding of the human language faculty is less well understood. The Multimodal Language Department aims to understand how visual features of language, whether accompanying speech or in sign languages, constitute a fundamental aspect of the human language capacity, contributing to its uniquely flexible and adaptive nature. The ambition of the department is to establish the view of language and linguistics as multimodal phenomena.
To this end, we conduct fieldwork on how gestures are used in spoken languages with different linguistic structures, such as word order or prosody, as well as in different sign languages, to understand universal and diverse patterns. The Multimodal Language Department also aims to understand how neural, cognitive, and linguistic processing mechanisms, the demands of language use in interaction, and language transmission (for instance, learning constraints) shape the multimodal structures of language. The general aim, therefore, is to unravel the cognitive and social foundations of the human ability for language by considering its multimodal and crosslinguistic diversity as a fundamental design feature.
Our researchers combine multiple methods, such as corpus and computational linguistics, experimental methods, machine learning, AI, and virtual reality, to investigate multimodal language structure, use, processing and transmission. We work with a variety of language users of different signed and spoken languages around the world, as well as with individuals who have different access to sensory experience, such as deaf and blind language users, people in different age groups, and people with autism spectrum disorder.
Asli Ozyurek
Vacancies
-
MACHINE LEARNING STUDENT ASSISTANT
Student Assistant | Multimodal Language Department | part-time. Join Our Cutting-Edge Research Team as a Student Assistant in the Multimodal Language Department! We are offering a part-time...
-
18 November 2024
MLD Co-Partners with MULTIDATA Consortium
The Multimodal Language Department is delighted to announce its collaboration with MULTIDATA, a European project (E+ KA220) aimed at revolutionizing multimodal data analysis in higher education.
-
01 November 2024
ISGS 2025 Submission Now Open!
We are thrilled to announce that ISGS 2025 is now accepting submissions for talks, posters, and symposia!