Egocentric hearing: Study clarifies how we can tell where a sound is coming from
16 June 2017
A new UCL and University of Nottingham study has found that most neurons in the brain's auditory cortex detect where a sound is coming from relative to the head, but some are tuned to a sound source's actual position in the world.
The study, published in PLOS Biology, looked at whether head movements change the responses of neurons that track sound location.
"Our brains can represent sound location in either an egocentric manner - for example, when I can tell that a phone is ringing to my left - or in an allocentric manner - hearing that the phone is on the table. If I move my head, neurons with an egocentric focus will respond differently, as the phone's position relative to my ears has changed, while the allocentric neurons will maintain their response," said the study's first author, Dr Stephen Town (UCL Ear Institute).
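The difference between the two reference frames comes down to a simple coordinate transform: an egocentric (head-relative) angle changes as the head turns, while an allocentric (world) angle does not. A minimal sketch, using hypothetical angles not taken from the study:

```python
import math  # imported for completeness; the sketch uses only arithmetic


def wrap_deg(angle):
    """Wrap an angle to the range (-180, 180] degrees."""
    return (angle + 180.0) % 360.0 - 180.0


def egocentric_azimuth(source_world_deg, head_yaw_deg):
    """Head-relative (egocentric) azimuth of a sound source.

    source_world_deg: source direction in world (allocentric) coordinates
    head_yaw_deg: current head orientation in the same world frame
    """
    return wrap_deg(source_world_deg - head_yaw_deg)


# A source fixed at 90 degrees in the world, like a phone on a table.
source = 90.0

# As the head turns toward the source, the egocentric angle shrinks,
# while the allocentric angle (90 degrees) never changes.
for yaw in (0.0, 45.0, 90.0):
    print(yaw, egocentric_azimuth(source, yaw))
# → 0.0 90.0
# → 45.0 45.0
# → 90.0 0.0
```

An egocentric neuron's response would follow the changing head-relative angle; an allocentric neuron's response would stay tied to the fixed world angle.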
The researchers monitored ferrets while they moved around a small arena surrounded by speakers that emitted clicking sounds. Electrodes monitored the firing rates of neurons in the ferrets' auditory cortex, while LEDs were used to track the animals' movement.
Among the spatially tuned neurons under investigation, the study showed that most displayed egocentric tuning, tracking where a sound source was relative to the animal's head. Approximately 20%, however, instead tracked a sound source's actual location in the world, independent of the ferret's head movements.
The researchers also found that neurons were more sensitive to sound location when the ferret's head was moving quickly.
"Most previous research into how we determine where a sound is coming from used participants with fixed head positions, which failed to differentiate between egocentric and allocentric tuning. Here we found that both types coexist in the auditory cortex," said the study's senior author, Dr Jennifer Bizley (UCL Ear Institute).
The researchers say their findings could be helpful in the design of technologies involving augmented or virtual reality.
"We often hear sounds presented through earphones as being inside our heads, but our findings suggest sound sources could be created to appear externally, in the world, if designers incorporate information about body and head movements," Dr Town said.
The study was funded by the Medical Research Council, the Human Frontier Science Program, Wellcome and the Biotechnology and Biological Sciences Research Council.
- Research paper in PLOS Biology
- Dr Stephen Town's academic profile
- Dr Jennifer Bizley's academic profile
- UCL Ear Institute
- Image: Ferret. Source: Selbe Lynn on Flickr
Tel: +44 (0)20 7679 9222
Email: chris.lane [at] ucl.ac.uk