Royal Institution Christmas Lectures feature DE-ENIGMA robot
5 January 2018
A robot designed to help teach children on the autistic spectrum to recognise facial emotions has starred in the iconic Royal Institution’s Christmas Lectures.
Zeno the robot featured in the second of the three lectures in the series, entitled “Silent Messages: The Language of Life – Exploring the world of silent communication, why smells and body language can say so much”, which aired at 8pm on BBC Four on Wednesday 27 December.
The 2017 lectures were presented by UCL’s Professor Sophie Scott and explored how humans and other animals use sound, behaviour and language to communicate.
Zeno, part of the DE-ENIGMA Horizon 2020 project, is designed to read and mimic facial expressions and is being used to help children on the autism spectrum understand their own and others’ facial expressions and emotions.
This large project brings together UCL and partner institutions across Europe. Dr Alyssa M. Alcorn, from UCL Institute of Education’s (IOE) Centre for Research in Autism and Education (CRAE), appeared with Zeno in the lecture. Dr Jie Shen, Research Associate in Computing from the DE-ENIGMA team at Imperial College London, was busy behind the scenes managing the technical aspects of the demonstration.
Within the lecture, Dr Alcorn revealed that the robot uses a camera to detect and track 49 points or “landmarks” on a person’s face. By applying a machine learning process, the DE-ENIGMA system estimates what type of emotion the person is showing, and can send instructions to the robot so that it will make the same facial expression.
Zeno performed this with a volunteer from the audience in a live demonstration, tracking his face and imitating his expression in real time.
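The pipeline described above, in which tracked facial landmarks are fed to a learned classifier whose output drives the robot's expression, can be sketched roughly as follows. This is a minimal illustration only: the landmark input, the nearest-centroid classifier, the emotion label set, and the `set_expression` robot command are all hypothetical stand-ins, not the actual DE-ENIGMA implementation.

```python
import numpy as np

N_LANDMARKS = 49  # the DE-ENIGMA system tracks 49 facial "landmarks"
EMOTIONS = ["happy", "sad", "surprised", "angry"]  # illustrative label set

# Hypothetical learned model: one centroid per emotion in flattened
# (x, y)-landmark space, standing in for the real machine-learning model.
rng = np.random.default_rng(0)
centroids = rng.normal(size=(len(EMOTIONS), N_LANDMARKS * 2))

def classify_expression(landmarks):
    """Estimate the shown emotion from 49 tracked facial landmarks.

    A nearest-centroid rule stands in for the trained estimator.
    """
    features = np.asarray(landmarks, dtype=float).reshape(-1)
    distances = np.linalg.norm(centroids - features, axis=1)
    return EMOTIONS[int(np.argmin(distances))]

def mirror_expression(landmarks, send_command=print):
    """Classify the person's expression and instruct the robot to copy it."""
    emotion = classify_expression(landmarks)
    send_command(f"set_expression:{emotion}")  # placeholder robot command
    return emotion
```

In the live demonstration this loop runs continuously on camera frames, so the robot imitates the volunteer's expression in real time; here a single call per landmark frame illustrates the idea.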
Dr Alyssa M. Alcorn said: "The Royal Institution Christmas Lectures are such a longstanding scientific tradition, and Zeno and I are now part of that. I stepped out in front of the audience focused on doing what I do every day in our school studies: encouraging kids to engage with the robot and his facial expressions. I think that was really successful, because almost every hand went up when Professor Scott asked for a volunteer!”
Watch the lecture (Dr Alcorn and Zeno appear from 38:30)
The DE-ENIGMA project is investigating whether robot-assisted interventions could present lower, less complex social demands than human-led interventions. It is exploring whether this is a feasible solution for children whose social, daily life, or language skills may present barriers to participation in “traditional” interventions.
The first stage of the DE-ENIGMA project (2016-2017) tested the feasibility of an emotion-recognition training programme with a large sample of autistic children in the UK and Serbia, as part of the larger goal of developing intelligent robot-assisted interventions.
Sixty-six school-aged autistic children from three London special schools have participated so far, with an additional 66 children participating in Belgrade. Half were assigned to an emotion-recognition teaching programme led by an adult, and half to the same programme assisted by Zeno the robot.
Initial results at this stage suggest that many aspects of the robot-assisted programme were highly successful in both the UK and Serbia, particularly in terms of fostering children’s interest and engagement in the emotion activities.
The DE-ENIGMA project has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No 688835.
The robot body is manufactured by RoboKind.
- Centre for Research in Autism and Education (CRAE)
- View Dr Alyssa M. Alcorn's research profile
- Royal Institution Christmas Lecture