Dr Nadia Berthouze
Reader in Affective Interaction and Computing
Location: Room 8.24
University College London
Telephone: +44 20 7679 0690 (x 30690)
Email: n.berthouze at ucl.ac.uk
The premise of my research is that affect, emotion, and subjective experience should be factored into the design of interactive technology. Indeed, for technology to be truly effective in our social network, it should be able to adapt to the affective needs of each user group or even each individual. The aim of my research is to create systems and software that can sense the affective state of their users and use that information to tailor the interaction process. Body movement appears to be a promising medium for this goal: it supports cognitive processes, regulates emotions, and mediates affective and social communication. I am currently pursuing three lines of research that look at body movement as a medium to induce, recognize and measure the quality of experience of humans, and in particular of humans interacting and engaging through and with technology. I am trying to identify the various factors that affect the recognition process, including cross-cultural differences and task context. Finally, I am looking into the existence of dialects in affective body movement communication, including avatar-specific dialects. I was awarded a two-year International Marie Curie Reintegration Grant, starting in 2006, to investigate these issues in the clinical context and in the gaming industry. More information on the project AffectME, supported by this grant, can be found here.
My funded work has all been broadly in the area of designing and evaluating technology that is aware of its users' affective experience and can support, regulate or amplify it.
- Emo&Pain: Pain rehabilitation: E/Motion-based automated coaching. EPSRC collaborative grant (EP/H016988/1, 2010-2014)
- Digital SENSORIA. EPSRC Collaborative Grant (EP/H007083/2, 2009-2011)
- Healthy interactive systems: Resilient, usable and appropriate systems in healthcare. EPSRC Platform Grant (EP/G004560/1, 2009-2014)
- ILHAIRE. Incorporating Laughter into Human-Avatar Interactions: Research and Evaluation (EU FP7 - FET, 2011-2014)
- Social Robots for Elderly people, funded by UCL-Crucible with P. Bentley and A. Bowling (2010-2011)
The UCLIC Database of Affective Postures and Body Movements:
We are creating a database of affective postures and affective body movements. If you are interested in using it for academic purposes, please contact us at firstname.lastname@example.org or email@example.com
- Acted emotions: angry, fearful, happy, sad. The data have been collected using a VICON motion capture system.
- Non-acted affective states in a computer-game setting: frustration, concentration, triumphant, defeated. The data have been collected using a Gypsy5 (Animazoo UK Ltd.) motion capture system.
- Non-acted affective states in a clinical setting (not yet available for distribution).
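As a purely illustrative sketch of how such labelled data can be used for recognition (none of the feature names, values, or the classifier below come from the database itself; they are invented for illustration), a simple nearest-centroid rule could map a posture feature vector to one of the four acted-emotion labels:

```python
# Hypothetical sketch only: the database defines its own formats and features.
# Here each posture is reduced to an invented 3-value feature vector
# (head tilt, arm openness, forward lean), and a label is assigned by
# finding the closest per-label centroid in Euclidean distance.
import math

LABELS = ["angry", "fearful", "happy", "sad"]

# Invented per-label centroid feature vectors (arbitrary units).
CENTROIDS = {
    "angry":   (0.8, 0.9, 0.7),
    "fearful": (0.2, 0.1, 0.3),
    "happy":   (0.9, 0.8, 0.2),
    "sad":     (0.1, 0.2, 0.1),
}

def classify(features):
    """Return the emotion label whose centroid is closest to `features`."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(LABELS, key=lambda lab: dist(features, CENTROIDS[lab]))

sample = (0.85, 0.85, 0.6)  # an invented posture feature vector
print(classify(sample))
```

In practice, work on this database uses richer motion-capture features and learned classifiers; the sketch only conveys the labelled-sample-to-emotion mapping.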
- AffectME: Affective Multimodal Engagement
- KDIME: Kansei-based Image Retrieval Environment
Affective Computing and HRI
Design Experience 2
- MSc in HCI-E, UCLIC
- The aims of the module are to get students to develop their HCI design skills and explicitly reflect on that development. The module provides opportunities for students to develop and demonstrate:
- knowledge and skills from all course modules
- knowledge and skills of user-centred design processes
- the ability to conduct user research
- the ability to make effective use of design tools and techniques
- the ability to work as part of a team
- the ability to prepare and present a poster
and, overall, to develop the ability to justify the use of user-centred design processes and critically evaluate their contribution to the overall product. The module brings together skills and knowledge acquired in the separate modules of the rest of the course. It employs a problem-based learning approach, whereby students must draw on relevant theory and methods to develop a successful and effective design for a specific user interface or human/machine system. The two-week-long practical mini-project is followed by a poster presentation and by the writing of a reflective report. For details about the mini-project, see the section on Project Description.
Dr Hongying Meng: multimodal emotion recognition in patients with chronic pain
Dr Mohsen Shafizadehkenari: recognition of emotion and pain level from EMG signals in patients with chronic pain
Dr Harsimrat Singh: recognizing the hedonic experience of touch from EEG signals
Shakil Afzal: Facilitating emotion regulation in collaborative learning
Siti Ibrahim: Identifying affective appraisal processes in information seeking activity
Charles Ray: Improving the choice of business communications media through measurement of the impact of NVC components.
Bernardino Romera-Paredes: Emotion and pain level recognition from body movement in patients with chronic pain
Aneesha Singh: A virtual coach to motivate and assist rehabilitation sessions for chronic pain patients
Mirjami Alakoskela: analysis of body movement in attention to/away from pain condition
Kim Byers: emotional contagion in interactive art
Page last modified on 21 May 13 17:56 by Louise M F Gaynor