Brain Sciences


Learning to keep track of what others think

16 June 2020

The brain can be trained to learn whether a thought is 'mine' or 'yours', according to an IoN study pinpointing the specific neural circuits that enable people to keep track of other people’s beliefs.


The findings, published in Nature Communications, show that the ability to put oneself into another person's shoes is flexible rather than fixed, and can be improved with training or social connections. The researchers say their findings may help to explain why people can assume that someone close to them (e.g. a partner) already knows something they know: the distinction between our own thoughts and others' can become blurred.

Lead author Sam Ereira said: “In order to meaningfully put yourself into someone else’s shoes, your brain needs to distinguish between your actual thoughts and your simulation of the other person’s thoughts.”

A previous study showed that when you simulate someone else's beliefs, your brain represents the beliefs of 'Self' and 'Other' in distinct neural pathways. If your brain represents these beliefs very distinctly, i.e. with little overlap between 'Self' and 'Other', you will be better at telling your own beliefs apart from the other person's.

In this new study, the researchers wanted to see whether the degree of 'Self-Other' overlap in the brain could change with cognitive training. To test this, 40 participants were asked to play a computer game in which they observed a series of coin flips. The coin was biased, and the degree of bias changed slowly throughout the task. For example, at the beginning of the game the coin might be biased towards 'heads', but by the end it might be more likely to come up 'tails'.
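The task described above could be simulated as a coin whose heads-probability follows a slow, bounded random walk. The sketch below is purely illustrative: the function name and all parameter values are assumptions, not taken from the study.

```python
import random

def drifting_bias_flips(n_trials, start_bias=0.8, drift=0.03):
    """Generate coin flips whose heads-probability drifts slowly
    via a bounded random walk (parameter values are illustrative)."""
    bias = start_bias
    flips = []
    for _ in range(n_trials):
        flips.append('H' if random.random() < bias else 'T')
        # nudge the bias a little each trial, keeping it away from 0 and 1
        bias = min(0.95, max(0.05, bias + random.uniform(-drift, drift)))
    return flips

random.seed(1)
flips = drifting_bias_flips(200)
```

Because the bias drifts, early flips tend to favour heads while later stretches may favour tails, so participants must keep updating their estimate rather than settle on a fixed answer.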

Additionally, each participant was partnered with a second participant, who played the game at the same time. However, the partner could not see all of the coin flips and was also shown some additional misleading coin flips. This meant that the partner always had a false belief about the bias of the coin. The participants were also asked to keep track of their partner’s false belief, as it slowly changed throughout the game.

A ‘predictive processing’ model was then used to describe what was going on in a participant’s mind as they played the game. Firstly, the model described how participants changed their own belief every time they saw a coin flip. Secondly, it described how they changed their estimate of their partner’s belief, every time the partner saw a coin flip. The model works by calculating ‘predictions’ and ‘prediction errors’. For example, if the participant predicts a 90 per cent chance that the next coin flip will be heads, but the coin comes up tails, the participant will be surprised. The model uses this prediction error to improve the participant’s prediction for the next coin flip.
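The core idea of a prediction-error update can be sketched with a simple delta rule: shift the current belief toward each observed outcome in proportion to how surprising it was. This is a minimal sketch, not the authors' actual model (which is more sophisticated); the learning rate and variable names are assumptions for illustration.

```python
import random

def update_belief(belief, outcome, learning_rate=0.1):
    """Delta-rule update: move the estimated heads-probability toward
    the observed outcome in proportion to the prediction error."""
    prediction_error = outcome - belief   # outcome: 1 = heads, 0 = tails
    return belief + learning_rate * prediction_error

random.seed(0)
self_belief = 0.5       # participant's estimate of p(heads)
partner_estimate = 0.5  # participant's estimate of the partner's belief

for trial in range(100):
    outcome = 1 if random.random() < 0.8 else 0  # coin biased toward heads
    self_belief = update_belief(self_belief, outcome)
    if trial % 2 == 0:  # the partner sees only some of the flips
        partner_estimate = update_belief(partner_estimate, outcome)
```

Tracking two beliefs at once, one updated on every flip and one updated only when the partner sees a flip, captures in miniature why the two estimates diverge when Self and Other receive different evidence.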

Every participant played the game twice. In the first game, they played with a partner who saw a lot of the same coin flips as them. In the second game, they played with a different partner who mostly saw different coin flips to them. The next day, the participants were brought back and asked to play the game again with both partners, but this time there was no difference between the two games (i.e. both partners saw exactly the same coin flips as the participant). The authors could then see whether, on the first day, participants had learned anything about the relationship between their own thoughts and each partner’s thoughts that would affect their ability to distinguish between Self and Other on the second day.

Remarkably, they found that on the second day participants struggled to distinguish between their own beliefs and the first partner’s beliefs (with whom they had seen mostly the same coin flips on day one). However, participants found it easy to distinguish between their own beliefs and the second partner’s beliefs (with whom they had mostly seen different coin flips on day one).

The authors also scanned the participants’ brains using functional magnetic resonance imaging (fMRI). A prediction error corresponds to a particular pattern of activity in the brain. In this study, the scans revealed that the pattern of activity seen when a participant experienced their own prediction error had merged with the pattern seen when estimating the prediction errors of their first partner. However, it had become distinct from the pattern associated with the second partner.

This reveals new information about prediction errors, which are fundamental learning signals in the brain: not only do they carry information about the subjective sense of 'self', but their underlying neural circuits can also change with experience.

“These findings may explain familiar experiences. For instance, when two people spend a lot of time together, they may start to feel like they’re the same person. Someone might accidentally adopt their partner’s preferences, memories or beliefs, or read something in a book and wrongly assume that their partner has also acquired this new knowledge,” said Ereira.

“People struggle to understand other people’s beliefs and intentions in many mental illnesses, but our findings show that this is something the brain can learn to do. This opens up the possibility of training people to engage in social cognition in a flexible, adaptive way, and may help us to understand how and why talking-based therapies can be so effective,” he added.