Multimodal Language in Aphasia
Some co-speech gestures (those that imagistically evoke characteristics of their referents) are clearly related to spoken language. Understanding this relationship may be key to answering basic questions about the nature of the meaning-based representations and processes engaged in communication.
Speech and mouth movements are also clearly related. Mouth movements necessarily accompany speech and provide sensory information (in the form of visemes) that supports speech processing. We have long known that seeing a speaker aids auditory speech perception and lexical recognition.
We study how speakers, including aphasic and apraxic patients, integrate information from speech, gesture, and mouth movements, with the aim of understanding whether and how these cues can be used in the neurorehabilitation of aphasia. Work with brain-damaged individuals is carried out in collaboration with Drs Laurel Buxbaum and Myrna Schwartz at the Moss Rehabilitation Research Institute in Philadelphia.
Vigliocco, G., Krason, A., Stoll, H., Monti, A., & Buxbaum, L. (2020). Multimodal comprehension in left hemisphere stroke patients. Cortex.