
Experimental Psychology seminar

Identifying linguistic and neural levels of interaction between gesture and speech during comprehension using EEG, fMRI and TMS


Author: Dr Henning Holle, Department of Psychology, University of Hull, h.holle@hull.ac.uk

Conversational gestures are hand movements that co-occur with speech but do not appear to be consciously produced by the speaker. The role that these gestures play in communication is disputed, with some arguing that gesture adds little information beyond what is already conveyed by speech alone. My own EEG work has provided strong evidence for the alternative view, suggesting that gestures contribute substantial information to the comprehension process via highly interactive multimodal mechanisms involving semantic and syntactic aspects of language. I will present fMRI and TMS data suggesting that the left inferior frontal gyrus (IFG) and left posterior temporal lobe are crucial components of the multimodal brain network for co-speech gesture comprehension. These findings are consistent with the idea that these areas play a joint role in gesture-speech integration, with the IFG regulating strategic semantic access via top-down signals acting upon temporal storage areas.