UCL Psychology and Language Sciences

Re-visioning language and the brain: Auditory language comprehension is dynamically supported by visual cortex

Abstract

Most models of the organization of language and the brain maintain that auditory language comprehension statically relies on regions surrounding primary auditory cortex. An alternative is that the neural architecture underlying comprehension is more dynamic and can rely on areas outside of auditory cortex. For example, comprehension could be supported by the visual system when listeners’ prior experience with the heard speech occurred through reading. This hypothesis was tested with high-density electroencephalography. In Phase I, participants listened to unfamiliar music clips paired with sung or written lyrics (e.g., “We sailed the wet seas by boat”). The clips were heard again in Phase II, but without accompanying lyrics. A spoken sentence followed each clip, and the final word of each sentence was semantically related to the lyrics that had originally accompanied the clip (e.g., “She thought about the water”). In other trials, unfamiliar clips preceded the sentences. When the lyrics accompanying a clip had been heard in Phase I, auditory cortex was less active (i.e., neurally primed) during the final word than during final words of sentences preceded by unfamiliar clips (Figure 1, bottom row). When the lyrics accompanying a clip had been read, visual cortex (including the fusiform gyrus) was less active during the final word (Figure 1, top row). These results demonstrate that instrumental music can reinstate the visual word forms that previously accompanied it, and that these reinstated forms activate related words and support auditory language comprehension. The results require revising existing models of language comprehension and the brain toward a more active, dynamic, and distributed account that incorporates the effects of recent (visual) experience.