Does the brain process signed and spoken language in the same way?
Remarkably, the answer to this is 'mostly, yes'! Deaf signers who have suffered brain damage show language problems very similar to those of hearing speakers with damage to the same brain regions. For example, damage to the front of the brain on the left side can lead to problems with both sign and speech production. We can also use newer methods, such as functional magnetic resonance imaging (fMRI), to look at healthy brains and find out which parts are active when people process signed and spoken language. These studies also suggest that very similar brain networks are used for the two languages, even though they are delivered in very different ways.
However, the networks used are not identical. Sign language processing engages the visual parts of the brain more than spoken language does and, not surprisingly, spoken language engages the auditory parts of the brain more than sign language does.
In addition to these sensory differences, there are other subtle ways in which brain processing of signed and spoken language differs. For example, there is growing evidence that a part of the brain called the parietal lobe (at the back of the brain, towards the top), which is involved in processing spatial relationships, is more involved in sign language processing than spoken language processing (see FAQ 10 above).
It is likely that behavioural comparisons of signed and spoken language processing, using measures such as accuracy or reaction times, will also highlight differences between the two languages. Further research into the brain should be able to explore this in the future.