UCL Grand Challenge of Intercultural Interaction
Published: Dec 18, 2014 12:00:07 AM
Coordination and Collaboration
- Lead applicant: William Steptoe (UCL Computer Science)
- Main collaborator: Dr Daniel Richardson (UCL Cognitive, Perceptual and Brain Sciences)
When two people collaborate, they become more like each other. They sway their bodies, choose their words, wave their hands and move their eyes in concert. This is termed 'behavioural coordination', but there is no clear understanding of why it happens or what it produces. In this project, we intend to use state-of-the-art technology, firstly, to quantify multiple channels of coordination in a natural social interaction and, secondly, to control the behavioural coordination experienced by people in virtual reality interaction. We will investigate participants of European, Asian and African origin to capture the 'rules' of face-to-face interaction, including eye contact, nodding and the amount of facial mimicry.
We will quantify how people of different cultures move their faces, and what effect this has on intercultural communication and the impressions people form of each other. By replaying and manipulating these recorded interactions in virtual reality, we can then test experimental predictions: for example, we can generate an avatar representing a Chinese participant but animate it in a more 'western' manner, increasing or decreasing certain facial motions so that a speaker conforms to the cultural norms of the listener. In this way, we can develop tools to foster intercultural communication.
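The first step, quantifying a channel of coordination, can be illustrated with a toy measure. The sketch below (not the project's actual analysis code; the function name, signals and lag window are all illustrative assumptions) scores the synchrony between two movement time series, such as the head sway of two conversation partners, by taking the peak normalised cross-correlation over a range of time lags:

```python
import numpy as np

def coordination_score(signal_a, signal_b, max_lag=30):
    """Peak normalised cross-correlation between two movement
    time series, searched over lags up to max_lag samples.
    Returns a value near 0 for unrelated signals and near 1
    when one signal closely tracks the other at some lag."""
    a = (signal_a - signal_a.mean()) / (signal_a.std() + 1e-12)
    b = (signal_b - signal_b.mean()) / (signal_b.std() + 1e-12)
    n = len(a)
    best = 0.0
    for lag in range(-max_lag, max_lag + 1):
        # Align the two series at this relative lag
        if lag >= 0:
            x, y = a[lag:], b[:n - lag]
        else:
            x, y = a[:n + lag], b[-lag:]
        r = float(np.dot(x, y)) / len(x)
        best = max(best, abs(r))
    return best

# Illustration: b is a shifted copy of a, so the score is high
rng = np.random.default_rng(0)
a = np.cumsum(rng.standard_normal(500))  # simulated sway trace
b = np.roll(a, 5)                        # same trace, delayed
print(coordination_score(a, b))          # high (close to 1)
```

Applied per channel (gaze, nodding, facial motion), such a measure gives one number per pair per channel, which is the kind of quantity that can then be compared across cultural groupings or manipulated in the virtual reality replay.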
Page last modified on 15 Apr 13 16:00