IOE - Faculty of Education and Society

Understanding and combatting youth experiences of image-based sexual harassment and abuse

Non-consensual image-sharing practices were particularly pervasive, and consequently normalised and accepted among youth.

A hand holding a mobile phone against a pillow (Photo: TheVisualsYouNeed / Adobe Stock)

29 April 2022

Through qualitative and quantitative research on digital image-sharing practices with 480 young people aged 12 to 18 years (336 in the survey and 144 in focus groups) from across the UK, a team led by Professor Jessica Ringrose (IOE, UCL’s Faculty of Education and Society) found that non-consensual image-sharing practices were particularly pervasive, and consequently normalised and accepted among youth. The research used creative and participatory methods to work with young people to excavate some of their mobile phone and social media image-sharing practices through talk, drawing and screen captures.

The report introduces and applies the terms image-based sexual harassment (IBSH), describing unwanted sexual images (e.g. cyberflashing or unsolicited dick pics) and unwanted solicitation for sexual images (e.g. pressured sexting) (Ringrose et al., 2021), and image-based sexual abuse (IBSA), describing the non-consensual recording, distribution, and/or threat of distribution of sexual images (e.g. revenge porn) (McGlynn and Rackley, 2017). We aim to raise awareness of these combined terms, image-based sexual harassment and abuse, as a way of naming these experiences. Although not all of these harassments and abuses are currently illegal, they are inherently harmful and need to be identified as such.

Key findings

Technology facilitates image-based sexual harassment and abuse

Social media platforms create opportunities for users to engage in image-based sexual harassment and abuse through their various technical functions, referred to as ‘technological affordances’ (boyd, 2014). Snapchat was the most common platform used for image-based sexual harassment and abuse, accounting for 62% of unsolicited sexual images and/or videos, 60% of solicitations for nudes, and 33% of images shared beyond the intended recipient. Snapchat enables image-based sexual harassment and abuse through its quick adds, shout outs, streaks, score points and lack of identity verification measures. Instagram facilitates unwanted sexual content through its direct message and group chat features.

Image-based sexual harassment overwhelmingly impacts girls

First, adolescent girls often reported receiving unwanted images of male genitals (i.e. cyberflashing) from unknown adult men, and from known and unknown boys (same-aged peers). In our focus groups, 75% of girls had received unwanted penis images. In our survey, 37% of girls compared to 20% of boys reported receiving unwanted sexual content; 80% of girls reported feeling “disgusted” and 58% felt “confused”. However, over time disgust shifted to resignation through a process of normalisation.

“Like at first, when I first started getting dick pics I’d be like disgusted, but then I just got so used to it, and every time a dick appears on my screen I’m like – great, again.  It’s normal.” (Kathryn, Year 10 girl)

Second, girls commonly reported receiving requests for sexual images from unknown adult men and from known and unknown boys (same-aged peers). Of those who had been asked to send nudes, girls felt more pressure to do so than boys. In our survey, 41% of girls reported having been asked to send a sexual image, compared to 17.5% of boys. Of those who had been asked, 44% of girls felt pressured compared to 12% of boys. Solicitation was often initiated by sending an unsolicited dick pic, referred to as a ‘transactional dick pic’.

Image-based sexual abuse is heavily influenced by gender norms, and an intersectional approach to contextualised harm is needed

Boys experience pressure from the homosocial peer group to obtain images:

“for boys …maybe a joke that can go around the school with other boys…saying, oh, you’re not like …man enough if you don’t have any pictures.” (Danny, Year 9 boy)

Boys are also rewarded for sharing girls’ images amongst their peers, as a marker of masculine status. Girls were shamed and victim-blamed for having their image shared without their consent, due to sexual double standards. Further, IBSA risks and harms are not simply gendered but also deeply classed and raced, with young people having variable access to support. Thus, we argue for a nuanced, intersectional approach to understanding and contextualising digital sexual violence.

Young people rarely report image-based sexual harassment and abuse

Young people experienced very little relevant and useful support in mitigating these online harms, and rates of reporting to social media platforms, parents or schools were very low. Just over half of participants (51%) reported doing nothing when they had received unwanted sexual content online or had their image shared without their consent. When asked why they didn’t report, nearly a third said ‘I don’t think reporting works’. Of those who did report image-based sexual harassment and abuse, 25% told a friend and 17% reported it to a platform, but only 5% told their parents/carers and a mere 2% reported it to their school.

Need for more effective and earlier age-appropriate digital sex education

Reflecting on their own stories, young people found that sex education was often inadequate, didn’t deal with digital contexts and started too late to counter cultures of online harassment and abuse. They offered useful insights into how education could be improved, emphasising the value of schools focusing on the actions of perpetrators and avoiding victim-blaming approaches. They also recognised the value of specialist expertise, smaller group formats and younger facilitators, and a move away from whole-school assemblies for conveying important and sensitive messages.

Young woman sits on a couch holding a mobile phone in one hand, and rests her other hand (elbow propped on a raised knee) on her head (Photo: Evermmnt / Adobe Stock)

Recommendations for government

Provide schools with adequate funding and resources.

Ofsted inspections currently include a significant focus on whether sexual harassment is occurring in schools, yet schools are not given the specialist support, knowledge and training they need. Safety should be a priority, and schools should be supported to focus on knowledge building rather than on meeting inspection expectations. The evidence highlights the need for schools to have the financial means to secure appropriate staffing, training and high-quality evidence-based resources to teach gender and sexual equity and tackle the roots of sexual harassment and abuse.

Allocate resources to young people’s mental health services.

Young people who experience online harms often require mental health and wellbeing support, yet waiting lists for CAMHS (Child and Adolescent Mental Health Services) are currently unacceptably long. Resources must be provided to better support young people’s mental health, which is currently at crisis point.

Revise the statutory Relationships, Sex and Health Education guidance to remove victim-blaming rhetoric and better outline online harms.

The existing Relationships, Sex and Health Education guidance (DfE, 2019) for secondary schools states that schools must teach students about “resisting pressure to have sex” (p. 25). This leads schools to teach children that victims of coercion are responsible for changing their behaviour, rather than the perpetrator, further bolstering harmful victim-blaming narratives. All such references must be removed from the guidance.

The guidance should also make it clear that schools should teach about what constitutes online sexual harassment, including image-based sexual abuse and cyberflashing.

Recommendations for schools

Policy level

Introduce a whole-school approach to tackle sexual harassment and abuse

This includes challenging all forms of sexism, online and offline harassment, abuse and discriminatory behaviour, and embedding an understanding of gender and power relations in all aspects of school activities, including school curriculum and policies. Given that image-based sexual harassment and abuse are poorly understood, we recommend using our policy guidance to specifically address online sexual and gender-based violence (School of Sexuality Education, 2020).

Remain solution-focused and avoid victim-blaming

Students must feel confident that they will not be blamed or disciplined if they experience sexual harassment or abuse. For example, if a student sent a nude image that was shared, avoid questions such as ‘why have you done this?’, as this places emphasis on the actions of the victim. This recommendation aligns with the government guidance (DfE, 2020), “Sharing nudes and semi-nudes: advice for education settings working with children and young people”.

Implement a victim-centred approach to online and offline sexual harassment and abuse

Our research showed that young people are not reporting their experiences of sexual harassment and abuse because they fear that it will make matters worse. Young people must feel confident that their agency will be respected, and that their wishes will be taken into account regarding how the incident will be dealt with. Taking a victim-centred approach involves prioritising the victim and their needs above all else, before involving other actors, such as police and the perpetrator.

Train teachers and staff on identifying and responding to online and offline sexual harassment and abuse

In line with a whole-school approach, we recommend implementing staff training for all school staff, particularly for those who teach RSE (Relationships and Sex Education) or those with a pastoral role. Training must cover:

  • What behaviours constitute offline and online sexual harassment and abuse and who is at risk.
  • How to appropriately respond to disclosures of sexual harassment and abuse in a supportive and sensitive manner.
  • The support services available to young people who experience sexual harassment and abuse.
  • How and when to report and record incidents of sexual harassment and abuse in line with the school’s internal systems and procedures, and the lead member of staff to speak to on such matters.
  • How to communicate with parents and carers about incidents of sexual harassment and abuse, including the support services available to parents and carers.

In the classroom

Gender and sexual violence should be prioritised as topics in PSHE lessons and, given these lessons are compulsory, they should be allocated time and space in school timetables.

School RSE curriculums must include time and space for lessons on online and offline sexual violence. All students should receive this education from a young age, and teachers and schools should be provided with adequate resources and support to deliver it.

Deliver RSE and digital sex education in small groups, led by trained and confident members of staff or external expert providers.

Research consistently shows that young people are largely unhappy with the education they have received about sexuality topics such as sexting, sexual harassment, relationships and LGBTQ+ rights, that lecture-style assemblies are ineffective, and that young people prefer more informal discussions in smaller groups when covering sensitive topics (Jørgensen et al., 2019). Our focus group participants told us that they would like RSE to be taught in an “open” way, and to feel comfortable speaking to the teacher leading the session. Training is vital to be able to create a safe and non-judgemental classroom environment.

Eliminate punitive and risk-focused approaches to nude image sharing.

Extensive research shows that abstinence-only approaches to sending nudes and semi-nudes are ineffective in preventing young people from sending them, as young people are often aware of the risks involved and engage in these activities in spite of them. Furthermore, a risk-focused approach can be harmful: overemphasising risks can prevent children from seeking support when they need it, out of fear that they will be blamed for their own victimisation. Instead, teachers should acknowledge students’ motivations for sharing nude images and should teach students how to make informed decisions about if, when, how, and with whom to sext consensually and responsibly.

Focus on preventing perpetration of image-based sexual harassment and abuse.

Rather than teaching students to avoid sending nudes, which places the onus on the victims, students should be taught not to sexually harass and abuse others online. This reorients the focus away from the actions of the victim, and towards the actions of the perpetrator. Given our finding that the majority of perpetrators are boys and men, boys especially should be taught in school how to identify and challenge masculinity norms and unequal gender relations. Students should be taught to reflect on their role as participants or bystanders (especially in group contexts of image sharing), and their responsibility to prevent and intervene in incidents of online and offline sexual violence—when it is safe to do so.

Recommendations for child welfare professionals

Provide a consistent approach to sexual harassment and abuse.

Safeguarding children’s boards and other welfare professionals must work in partnership with schools, and ensure practices align with school policies and procedures, to provide a consistent approach for the young people in their care.

Prioritise children’s rights when safeguarding.

Children’s rights must be centred in responding to safeguarding concerns that relate to offline and online sexual harassment and abuse. This includes being transparent with a child about the safeguarding process; ensuring that any conversations and disclosures about a young person’s involvement in or experience of online sexual harassment take place with the knowledge of the young person and, if possible, in their presence; and, within the boundaries of safeguarding best practice, taking into account the wishes of the victim of sexual harassment.

Recommendations for parents and carers

Avoid taking an overly negative and disciplinary approach to your child's technology use.

Research shows that children often avoid telling their parents about their experiences of sexual harassment and abuse because they worry about being punished or having their technology taken away. Crucially, children must feel that if they experience online abuse, they will not be in trouble if they confide in an adult.

Be non-judgemental and supportive towards your child’s online activities.

For many young people the traditional online/offline binary no longer exists. It is essential that parents and carers understand that, for teens today, digital communication is an inextricable part of their lives: forming friendships, romantic and sexual relationships, developing their identity and their understanding of the world around them. Rather than focusing efforts on restricting social media or mobile phone use, we recommend cultivating a trusting and honest relationship around online activity, so that children feel confident to speak openly about online experiences that make them feel uncomfortable, weird, upset or angry. Parents and carers should also try to understand the underlying motivations that lead children to put themselves in more vulnerable situations; for example, they may have sent an image for fun, or to a boyfriend or girlfriend.

Recommendations for tech companies

Snapchat should maintain a record of images, videos and messages.

Participants deemed reporting on Snapchat useless because the images disappear. In response, Snapchat should be urged to maintain a record of the images, videos and messages that are sent through the app, in order to identify perpetrators and facilitate easier reporting of incidents of image-based sexual harassment and abuse.

Create clearer and more extensive privacy settings.

Children, and indeed adults, find navigating the privacy settings on popular social media platforms (Instagram, Snapchat, TikTok) challenging. To keep young people safe online, the way to manage these settings must be made clearer. Furthermore, social media platforms should set their default privacy settings to offer the maximum (rather than minimum) protection and privacy for the user, especially for younger users.

Create more rigorous identity verification procedures.

Our findings showed that a large percentage of incidents of image-based sexual harassment and abuse were from unknown adult men—many of whom had created false identities online. Social media platforms have a responsibility to protect children from these adult predators and should do so by putting more effective measures in place to prevent users from creating false identities online. This could involve verifying a user’s identity using a passport or government ID, as well as putting a user’s verified age on their profile.

Develop innovative solutions to prevent image-based sexual harassment and abuse and improve reporting functions.

When children report the sharing of a child sexual image to a social media platform, removal of the image can be inefficient, and there is nothing to stop the image from being screenshotted and re-uploaded. Across our qualitative and quantitative research, participants said that they did not report incidents of image-based sexual harassment and abuse to the social media app because they did not think that reporting works. Social media platforms must consequently improve their responses to these serious forms of digital sexual violence.

Social media platforms should work with child e-safety platforms to improve the online safety of young people.

There are numerous child e-safety platforms and support services, such as Report Remove, Report Harmful Content, CEOP and the Internet Watch Foundation, which aim to prevent image-based sexual abuse, online sexual harassment, grooming and child sexual exploitation, and to support victims.

However, it is still difficult and inefficient to remove nudes shared nonconsensually on popular social media apps (e.g. Snapchat) or private messaging apps (e.g. WhatsApp). An integrated approach between young people’s online safety services and social media companies could be a highly effective solution to keeping children safe online.
