Abstract
Visual observation of mouth movements can aid or alter the perception of the acoustic speech signal: Speech comprehension is significantly improved during face-to-face conversation when oral-facial gestures are visible. Recent primate work has suggested that mirror neurons in the ventral premotor area discharge both during the execution of specific hand or mouth movements and during observation of those same movements. This suggests a mechanism that could account for the improved speech perception, whereby activity in mirror neurons is integrated with activity in auditory (and perhaps other) cortical areas to facilitate speech comprehension. In the present work, we tested the prediction that such activation depends on direct visualization of the oral-facial gestures that accompany speech during face-to-face conversation. Ten right-handed subjects were imaged with fMRI while listening to interesting stories (audio only), listening to stories while seeing the storyteller (audiovisual), or just seeing the storyteller (visual). Comparison across conditions revealed significant (t = 3.916; df = 18; p < .003 corrected) activation in the inferior frontal cortex (BA 44/45) in the audiovisual condition that was not present in either of the other conditions. This result suggests that neural activity in the imitation-matching motor system, thought to localize to the inferior frontal cortex in humans, plays a role in speech perception during natural face-to-face conversation, when oral-facial movements are directly observed, but not when this visual information is absent. The emergence of context-specific patterns of cortical activity during face-to-face conversation may have implications for understanding the integration of motor information with perceptual information.