UCL Psychology and Language Sciences

A situation at hand: Sign language learning dynamically re-organizes spoken language comprehension and the brain

Abstract

We have proposed that the organization of language and the brain is supported by multiple dynamically organizing networks. The weighting of these networks varies as a function of the contextual information available to listeners and their prior experience with that information. On this model, experience with a manual language should more strongly weight networks that extract information from co-speech gestures. We tested this hypothesis with students enrolled in an introductory American Sign Language (ASL) class, who watched an actress telling a spoken English story while undergoing four-dimensional electroencephalography (EEG) neuroimaging. We predicted that the ASL students would show greater sensorimotor network activity when viewing gestures than matched controls learning a spoken language. As predicted, ASL learners showed significantly greater activity in parietal cortex during both imagistic and less imagistic gestures, and in frontal "motor" and anterior temporal "language" regions during imagistic gestures. A direct contrast confirmed that the ASL group had increased activity in these regions for imagistic gestures. Furthermore, the parietal, motor, and language regions formed a network in the ASL group, with motor activity preceding language-region activity, followed by a reduction of activity in early auditory areas. In conclusion, the results suggest that learning ASL more strongly weights "co-speech gesture networks," and does so because the information in more informative (imagistic) gestures can more readily be used in a predictive manner to reduce auditory processing demands. More generally, the results are consistent with the hypothesis that the networks underlying the organization of language and the brain are dynamically organized, in part, as a function of prior experience.