2016 begins with a visit to our collaborators at NTT Human Communication Labs, Atsugi, Japan. Here Hsin-I (NTT) and Sijia (UCL) are piloting an eye-tracking experiment to investigate auditory perceptual salience.
An early Christmas present from our collaborators in Eriksholm - Oticon: an in-ear EEG system!
Instead of measuring brain activity using electrodes placed on the scalp (e.g. see photo at the top of this page), we can now measure brain responses using these small hearing-aid-like devices inserted into the ear (3 electrodes in each ear canal).
We are very excited to try this out on our various paradigms.
If successful, this technology is set to revolutionize hearing aids, by enabling them to modulate their output, in real time, based on signals measured from the user's brain.
Team Chait-Bizley are working hard at the Ear Institute Christmas quiz.
...and they won (a very respectable) second place.
Read it here:
A demo of a simple (work-in-progress) auditory brain-computer interface created by Daniel Wong (ENS, Paris) and Jens Hjortkjær (DTU, Denmark) as part of our COCOHA project. The short (pilot) clip was filmed during a recent work visit to DTU. Soren is wearing an EEG cap that records his brain responses while he listens to a speech by President Obama embedded in background audience noise. The device uses signals from Soren's brain to suppress the background noise: the more intently Soren listens to Barack Obama's voice, the clearer it becomes. In the video, the background noise is initially quite loud, but as Soren concentrates on the speech, President Obama's voice becomes clearer (e.g. around 0:08). It then becomes noisy again as Soren withdraws his attention. A better-quality video will be posted soon.
COCOHA aims to create a new generation of hearing aids that can be controlled by the listener's brain signals. It is a collaboration between our lab and partners at DTU, ENS, UZH and Eriksholm.
(They get the location of auditory cortex a bit wrong, though...)
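The core idea behind the demo, decoding which talker a listener is attending to from their EEG, can be sketched in a few lines. The following is a toy simulation, not the COCOHA implementation: the envelopes, mixing weights, noise levels, and the simple least-squares "backward model" are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_channels = 4000, 8

def smooth_envelope(n):
    # Low-pass filtered noise as a stand-in for a slow speech amplitude envelope.
    return np.convolve(rng.standard_normal(n), np.hanning(64), mode="same")

attended = smooth_envelope(n_samples)  # e.g. the speech the listener attends to
ignored = smooth_envelope(n_samples)   # e.g. the background audience noise

# Simulated EEG: every channel tracks the attended envelope more strongly
# than the ignored one (the attention effect), plus sensor noise.
eeg = (np.outer(attended, rng.standard_normal(n_channels))
       + 0.2 * np.outer(ignored, rng.standard_normal(n_channels))
       + 2.0 * rng.standard_normal((n_samples, n_channels)))

# Fit a least-squares "backward model" (EEG -> envelope) on the first half
# of the data, then decode attention on the held-out second half.
half = n_samples // 2
w, *_ = np.linalg.lstsq(eeg[:half], attended[:half], rcond=None)
reconstructed = eeg[half:] @ w

def corr(a, b):
    return np.corrcoef(a, b)[0, 1]

# Whichever candidate envelope correlates best with the reconstruction
# is flagged as the attended stream; a device could then boost it.
scores = {"attended": corr(reconstructed, attended[half:]),
          "ignored": corr(reconstructed, ignored[half:])}
decoded = max(scores, key=scores.get)
print(decoded)
```

In a real system the decoder would be trained on separate calibration data and applied to short running windows, so the gain applied to each stream can track the listener's attention in (near) real time.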
Selected media coverage of our recent Journal of Neuroscience paper on 'inattentional deafness'
(VICE magazine asked us to comment. Read the article here)
We are on the screening panel for this year's Flame Challenge: "What is sound?"
The Flame Challenge is an international competition where scientists answer the question in a way that is most appropriate for 11-year-olds. Entries are judged by thousands of 5th and 6th grade schoolchildren around the world.
Lefkothea's PhD thesis is titled: "Sensitivity to temporal structure in sound supports auditory scene analysis - a psychophysics and MEG investigation". She was examined by Prof. Sonja Kotz, Maastricht University, and Prof. Stuart Rosen (aka Heisenberg!), UCL.
Congratulations to Sijia for winning the MMN conference travel award.
in2science students Oyinmiebi and Laura, who have been interning at the Ear Institute these last two weeks under the supervision of Daniel Bates from our lab and Katie Smith (Jagger Lab), are demonstrating their newly acquired, and very impressive, EEG prep skills.
Makoto Yoneya, an engineer from NTT, Japan, is visiting the lab this summer. His visit is part of an ongoing collaboration with NTT, also supported by the BBSRC. Makoto's experiments use eye tracking to determine whether micro-saccades (very small eye movements) can be used to understand auditory attention.
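A standard first step in this kind of work is detecting microsaccades in the raw gaze trace. Here is a minimal, self-contained sketch of velocity-threshold detection (in the spirit of the widely used Engbert & Kliegl approach); the gaze trace, sampling rate, and threshold multiplier are all made-up illustrative values, not Makoto's actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 500  # Hz, assumed eye-tracker sampling rate

# Toy 2-second horizontal gaze trace (degrees): slow random drift
# plus two injected microsaccade-sized rapid shifts.
x = np.cumsum(0.002 * rng.standard_normal(2 * fs))
for onset in (300, 700):
    x[onset:] += 0.3  # ~0.3 deg step, typical microsaccade amplitude

# Velocity-threshold detection: flag samples whose speed exceeds a
# multiple of a median-based (outlier-robust) estimate of velocity noise.
vel = np.gradient(x) * fs  # deg/s
sigma = np.sqrt(np.median(vel**2) - np.median(vel)**2)
candidates = np.flatnonzero(np.abs(vel) > 6 * sigma)

print(candidates)  # sample indices near the two injected events
```

The median-based noise estimate is the key trick: unlike the standard deviation, it is barely inflated by the microsaccades themselves, so the threshold adapts to drift noise without being dragged up by the events it is trying to detect.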
The lab is out on the town, visiting the 'Decision' exhibit at the Hayward Gallery.
The giant (3 story!) slide.
Apparently, the world's most complicated clock.
Walking through a virtual forest:
upside down London:
The Lab Project is an experimental month-long exhibition and events program that explores the interactions between art and science. Our work is featured in the 'step 1' symposium, which brings together scientists and creative practitioners to discuss possible 'entanglements' between science and art: "How can we use sound to affect one's experience of their surroundings?"
We organized a two-day workshop on "Sensory Systems in Complex Environments" which brought together researchers (PIs and students) from UCL and Paris.
More details are here:
A few photos:
Jonathan Simon visits
Read about the seventh sense "Inaudible sounds" project:
Read the interview with Sijia on the BBC site:
24/01/2015: The COCOHA project - "Cognitive Steering of a Hearing Aid" is officially launched in Paris. Exciting 4 years ahead!
We were just awarded a PhD studentship by AoHL. The project is titled: "Evaluating hearing impaired listeners' sensitivity to changes in dynamic, complex acoustic scenes - assessing auditory impairment and the benefit attained from a hearing aid". The position (with a starting date of September 2015) will be
Chait lab (-flu victims, +friends) at Christmas dinner 2014. Alas, after we had eaten all the food!
- Our work on sensitivity to patterns in sound featured in 5Hz Labs at the Arnolfini in Bristol (16/11/2014)
- UCL Faculty of Brain Sciences 'Meet the Researcher' project video
- We have just been awarded a Royal Society International Exchange grant to support a new project with Juanita Todd in Newcastle, Australia. (01/07/2014).
- Lucie Aman wins the MSc Student prize at the 15th Queen Square Neuroscience Symposium
- Two lab members: Lucie Aman and Anissa Bellahcen, participating in the UCL opera production of Rimsky-Korsakov's "The Snowmaiden"
- Our work featured on BBC Radio 4's "All in the mind" (24.12.2013)
- Maria Chait to speak in the BNA 2013 Christmas Symposium
- Brain picks out salient sounds from background noise by tracking frequency and time
- Hearing brains are 'deaf' to the disappearance of sounds
- UCL researchers use unique machine to deepen understanding of how brain processes sound
- Maria Chait talks about the brain (presentation targeted towards middle school students as part of the "My hearing my future" campaign).
Page last modified on 27 Jan 2016, 14:23