Social media algorithms amplify misogynistic content to teens
5 February 2024
Social media algorithms amplify extreme content, such as misogynistic posts, which normalises harmful ideologies for young people, finds a new report led by a UCL researcher.
The research, conducted in partnership between UCL, the University of Kent and the Association of School and College Leaders (ASCL), used an algorithmic modelling study and found a fourfold increase in the level of misogynistic content presented on the “For You” page of TikTok accounts over just five days on the platform.
Through interviews with young people and school leaders, the researchers also found that hateful ideologies and misogynistic tropes have moved off screens and into schools, becoming embedded in mainstream youth cultures.
The report authors stress the need for a “healthy digital diet” approach to education to support young people, schools, parents and the community at large. They also say it is essential to champion the voices of young people themselves, particularly to include boys as part of discussions regarding online misogyny, and they suggest a “peer-to-peer” mentoring approach.
Principal investigator Dr Kaitlyn Regehr (UCL Information Studies) said: “Algorithmic processes on TikTok and other social media sites target people’s vulnerabilities – such as loneliness or feelings of loss of control – and gamify harmful content. As young people microdose on topics like self-harm or extremism, it feels to them like entertainment.
“Harmful views and tropes are now becoming normalised among young people. Online consumption is impacting young people’s offline behaviours, as we see these ideologies moving off screens and into school yards.
“Further, adults are often unaware of how harmful algorithmic processes function, or indeed how they could feed into their own social media addictions, making parenting around these issues difficult.”
The researchers began the study by interviewing young people who engage with and produce radical online content. These interviews informed the creation of archetypes for the algorithmic study, representing typologies of teenage boys who may be vulnerable to radicalisation by online content. The researchers set up a TikTok account for each archetype, with distinct content interests typical of that archetype (for example, seeking out content on masculinity or addressing loneliness), and used these accounts to watch videos that TikTok suggested on its “For You” page over a period of seven days.
Suggested content was initially in line with the stated interests of each archetype, such as material exploring themes of loneliness or self-improvement, but increasingly focused on anger and blame directed at women. After five days, the TikTok algorithm was presenting four times as many videos with misogynistic content, such as objectification, sexual harassment or the discrediting of women (rising from 13% of recommended videos to 56%).
The research team led roundtables and interviews with school leaders, who attested that misogynistic tropes are becoming normalised in how young people interact in person as well.
The researchers set out the following recommendations:
- Holding social media companies accountable and pressing them to address the harm caused by their algorithms and to prioritise the wellbeing of young people over profit.
- Implementing “healthy digital diet” education, which treats the different types of screen time and digital content that young people engage with as akin to different food groups: how much is consumed, how algorithms can render it “ultra-processed”, and its potential impacts on mental and physical health.
- Introducing peer-to-peer mentoring, empowering older pupils to work with their younger peers and helping to involve boys in discussions around misogyny.
- Promoting wider awareness of algorithmic processes among parents and the community at large.
The researchers say their findings may apply similarly to other social media platforms, pointing to other groups’ research on Instagram and YouTube, and they note that social media algorithms can also favour other harmful content types, such as self-harm material or extreme ideologies. Previous research involving Dr Regehr and UCL colleagues found that women and girls routinely experience sexual violence online, and that this has increased further in recent years. This work fed into new digital flashing legislation in 2022.*
Geoff Barton, General Secretary of the Association of School and College Leaders, commented: “UCL’s findings show that algorithms – which most of us know little about – have a snowball effect in which they serve up ever more extreme content in the form of entertainment. This is deeply worrying in general, but particularly so in respect of the amplification of messages around toxic masculinity and its impact on young people, who need to be able to grow up and develop their understanding of the world without being influenced by such appalling material.
“We welcome the call to involve young people, particularly boys, in the conversation to combat this problem together with their peers and families. We call upon TikTok in particular and social media platforms in general to review their algorithms as a matter of urgency and to strengthen safeguards to prevent this type of content, and on the government and Ofcom to consider the implications of this issue under the auspices of the new Online Safety Act. It’s time for action rather than yet more talk of action.”
* Two studies co-led by UCL’s Professor Jessica Ringrose: UCL News, 2021: Young people’s rates of reporting online harassment and abuse are ‘shockingly low’ (see more: IOE research informs Online Safety Bill); UCL News, 2024: UK teens experience spike in online harm during Covid-19 pandemic.
Links
- Report: Safer scrolling: How algorithms popularise and gamify online hate and misogyny for young people
- Dr Kaitlyn Regehr’s academic profile
- UCL Information Studies
- Media coverage
Image source: iStock
Media contact
Chris Lane
tel: +44 20 7679 9222 / +44 (0)7717 728 648
E: chris.lane@ucl.ac.uk