Hearing where it's at: How humans and gerbils learn to locate sound

5 August 2004

Humans behave like small mammals when tracing the source of a low-pitched sound, according to a Medical Research Council-funded study at University College London.

UCL researchers have devised a new model for how the human brain tracks sound, which could eventually help engineers develop technology for tracking sound sources in noisy environments like crowded bars and restaurants.

In the study published in this week's Nature, Dr David McAlpine and Nicol Harper asked volunteers to wander the streets of London wearing microphones in their ears. The microphones measured the time difference between sound arriving at each ear for a range of noises that people typically encounter in the city.
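
By way of illustration only, and not as a description of the researchers' own analysis, the measurement can be sketched in a few lines of code: the time difference between the two ear signals is the lag at which their cross-correlation peaks. Every name and number below is invented for the example.

```python
import numpy as np

def estimate_itd(left, right, sample_rate, max_itd_s=0.0007):
    """Estimate the interaural time difference (ITD) between two ear signals.

    Sketch only: returns the lag, within the roughly +/-0.7 ms range a human
    head can produce, at which the cross-correlation of the signals peaks.
    A positive result means the sound reached the left ear first.
    """
    max_lag = int(max_itd_s * sample_rate)             # largest lag to consider, in samples
    corr = np.correlate(right, left, mode="full")      # cross-correlate the two ear signals
    zero = len(left) - 1                               # index corresponding to zero lag
    window = corr[zero - max_lag: zero + max_lag + 1]  # keep only physically plausible lags
    best = int(np.argmax(window)) - max_lag            # lag (in samples) of the correlation peak
    return best / sample_rate                          # convert to seconds

# Toy usage: a 500 Hz tone that reaches the left ear 0.3 ms before the right.
fs = 44100
t = np.arange(0, 0.1, 1 / fs)
left = np.sin(2 * np.pi * 500 * t)
right = np.sin(2 * np.pi * 500 * (t - 0.0003))
print(estimate_itd(left, right, fs))                   # roughly 0.0003 seconds
```

Restricting the search to lags of under about 0.7 milliseconds reflects the fact that a human head is only wide enough to produce delays of a fraction of a millisecond between the ears.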

While it was already known that animals and humans use small differences in the arrival time of sound at each ear to locate its source, the UCL study found that the human brain adopts a strategy similar to a barn owl's for sound pitches above middle C, and to a gerbil's for pitches below it.

David McAlpine says: "For animals and humans, locating the source of a sound can mean the difference between life and death, such as escaping a pursuer or crossing a busy street. Our study suggests that the brain adopts an efficient strategy for doing this, adapting to different frequencies, or pitches, of sound.

"Knowing how the brain creates a sense of sound space is the first step to recreating spatial hearing in the deaf. Recent advances in cochlear implants allow people to have implants in both ears, with the potential to restore spatial hearing."

For over 50 years, a single model has been used to explain how brain cells represent the time difference between the ears. The 'classic' model assumes that specific brain cells are allocated to specific time differences, with the relevant cells firing according to the direction a sound comes from.
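
As a rough sketch of that idea, not a reproduction of any published model, one can picture a bank of cells, each tuned to its own preferred time difference, with the most active cell signalling the sound's direction. The tuning curves and numbers below are invented for illustration.

```python
import numpy as np

# Illustrative sketch of the 'classic' place-code idea: a bank of cells,
# each tuned to a preferred interaural time difference (ITD), with the most
# active cell indicating the direction of the sound.

preferred_itds = np.linspace(-0.0007, 0.0007, 15)  # preferred ITDs spanning the human range (s)
tuning_width = 0.0002                              # width of each cell's tuning curve (s)

def cell_responses(itd):
    """Firing of each cell to a sound with the given ITD (Gaussian tuning)."""
    return np.exp(-((itd - preferred_itds) ** 2) / (2 * tuning_width ** 2))

def decode_place_code(itd):
    """Read out the ITD as the preferred ITD of the most active cell."""
    return preferred_itds[np.argmax(cell_responses(itd))]

print(decode_place_code(0.0003))  # returns the preferred ITD nearest 0.3 ms
```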

Because different animals need to detect sounds relevant to their own environment, their brain cells shift their tuning until they code most accurately for sounds the animal is likely to encounter.

Recordings from the brains of barn owls - a species that hunts at night using only sound - appear to confirm this. However, the classic model could not account for recent evidence that the brain cells of small mammals appear to respond most to time differences that the animal is never likely to hear.

The alternative model, developed by Nicol Harper in Dr McAlpine's lab, explains this anomaly. Small mammals such as gerbils or guinea pigs can follow low-pitched sounds. Surprisingly, to enhance this ability at low frequencies, the brain cells organise to respond most to time differences outside the range the animal naturally encounters.

This strategy does not suit higher frequencies, i.e. higher-pitched sounds. Thus, barn owls' brains follow the classic model, with brain cells firing most for time differences within the animal's range. Human brains appear to 'pick and choose' between the two strategies, depending on the frequency of the sound.
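
One way to picture the alternative strategy, again as an illustrative sketch with invented tuning values rather than the paper's own figures, is as two broadly tuned channels, one favouring each side of the head, whose response peaks sit just outside the naturally occurring range of time differences. Their steep flanks, where firing changes fastest, then cover the range the listener actually hears, and direction can be read from the balance of activity between the two channels.

```python
import numpy as np

# Illustrative sketch of a 'slope' strategy for low-pitched sounds: two
# broadly tuned channels peak outside the natural ITD range, so the steep
# parts of their tuning curves fall inside it. All values are invented.

natural_range = 0.0007  # roughly the largest ITD a human head produces (s)
peak_itd = 0.0011       # each channel peaks outside that range (s)
width = 0.0008          # broad tuning, as forced by low frequencies (s)

def channel(itd, peak):
    """Broad Gaussian tuning curve for one channel."""
    return np.exp(-((itd - peak) ** 2) / (2 * width ** 2))

def decode_slope_code(itd):
    """Read direction from the balance of the two oppositely tuned channels.

    Returns roughly -1 for sounds far to one side, +1 for the other side,
    and 0 for sounds straight ahead.
    """
    right = channel(itd, +peak_itd)
    left = channel(itd, -peak_itd)
    return (right - left) / (right + left)

# The decoded value changes steeply and monotonically across the natural
# range, even though neither channel peaks inside it.
for itd in np.linspace(-natural_range, natural_range, 5):
    print(f"ITD {itd * 1e3:+.2f} ms -> {decode_slope_code(itd):+.2f}")
```

In this toy version, a small shift in the sound's direction produces a large change in the balance of activity precisely because neither channel peaks within the natural range, which is the counter-intuitive arrangement the new model describes.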

Dr McAlpine hopes his findings will help engineers to develop sound-tracking technology that performs as well as the human brain. Current sound-tracking devices work well in quiet places, but struggle in the sort of noisy environments in which humans have little trouble following a conversation.

Notes for Editors

For more information or to set up an interview please contact Jenny Gimpel on +44 (0)20 7679 9739.

Briefing notes, 'Ten facts on hearing and the ear', can be obtained by e-mailing j.gimpel@ucl.ac.uk.