
Children and unsolicited sexual images on social media apps

2019-20 Social Science Plus Pilot Project (£11,000)


Research question
This project seeks to combine qualitative sociological methods with quantitative and experimental computer science research design to better understand how children are navigating the rise of unsolicited sexually graphic content on the social media applications Snapchat and Instagram, which are very widely used by adolescents. Our methods are designed to help us trace patterns in unsolicited content, locate its origins, and test the vulnerability of young people's social media profiles on these apps. We will also design lesson plan content to respond to and combat online risk on these platforms:

Qualitative
What are children’s experiences of receiving unsolicited graphic images on Snapchat and Instagram?
What are young people's understandings of unsolicited content? Do they know how to block and report the offending content?

Quantitative
What patterns of unsolicited image content emerge from 100 children's Snapchat 'Quick Add' and 'Blocked' folders and Instagram 'Message Request' folders?

Experimental
What vulnerabilities are apparent on 100 children's Snapchat accounts relating to being contacted by unknown senders, being sent unsolicited images, and being asked to share information or images?

Focus, rationale and societal relevance
The proposed research will explore the features of Snapchat and Instagram that enable unsolicited sexual messages targeted at children. The research responds to heightened policy attention to, and concern over, young people's use of social networking platforms and the associated privacy and data risks. The geolocational capacities of Snapchat are said to create significant risk for children, and there is growing concern over young people adding contacts they do not know on platforms like Instagram, enabling phishing, grooming and the spread of unsolicited sexual content (NSPCC, 2018). Recent qualitative research by Professor Ringrose (2019) found that the vast majority of young people aged 11-15 had received unsolicited sexual images or videos through the two social media apps most used by UK youth: Snapchat and Instagram. The majority of the content came from unknown users through the platform affordances of 'Quick Add' and 'mentions' on Snapchat and 'message requests' on Instagram. Young people lacked understanding of where the content came from (for example, whether it was generated by bots or algorithms) or why it was sent (phishing, hacking, data mining). They also often had insufficient knowledge of how to block the senders or report them to the social media companies, of whether the content constituted a crime or abuse, and of whether to report it at school. That research was limited, however, because it was not possible to trace the origins of the content, as the senders' IP addresses are held by the platforms. This research seeks to explore children's experiences of unsolicited sexual messages, map patterns in the messages received, and discover how vulnerable particular young people are to receiving this content. Our findings will help us to better understand the origin of this content and, critically, to address this growing problem through better youth digital literacy.

Research design and methodology
Qualitative workshop sessions will be conducted with up to 100 students aged 11-18 across five London schools. These schools will be selected on the basis of availability via our external partner, the sex education charity Sexplain, but we will aim for diverse contexts. We will explore young people's experiences and understandings of unsolicited sexual content on their social media apps, asking young people to demonstrate how they use Snapchat and Instagram and to show us how they respond to these messages. Through our questions and activities, we will map which technological affordances make young people vulnerable and assess their awareness of these risks and harms.

During the focus groups we will also collect the offending material from the Snapchat 'Quick Add' and 'Blocked' folders and Instagram 'Message Request' folders of up to 100 young people. This digital database will contain both legitimate messages and unsolicited sexual image content, which can be analysed to detect patterns. These patterns may lead to the creation of automated tools to detect unsolicited messages. Moreover, these analyses can lead to recommendations to users and social network administrators on how to prevent these messages and mitigate the effectiveness of malicious campaigns.
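
To illustrate what such an automated detection tool could look like, the minimal sketch below trains a simple text classifier on a handful of invented example messages; the labels, features and wording are hypothetical placeholders for illustration only and are not drawn from the project data.

    # Minimal illustrative sketch: a baseline text classifier for flagging
    # unsolicited messages. All example data here is invented.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Hypothetical labelled examples (1 = unsolicited, 0 = legitimate)
    messages = [
        "are you coming to football practice tomorrow",
        "click this link for free followers now",
        "add me back, i have something to show you",
        "mum says dinner is at six",
    ]
    labels = [0, 1, 1, 0]

    # TF-IDF features with a linear classifier is a common first baseline
    model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
    model.fit(messages, labels)

    # Score a new, unseen message
    print(model.predict(["free followers, just click the link"]))
    print(model.predict_proba(["free followers, just click the link"]))

In practice the project's pattern analysis may rely on message metadata (sender type, folder, timing) as much as message text; the classifier above only sketches the general approach.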

Next, using the results of the social media message analysis, we will identify possible attack vectors that may affect young users on these social networks and evaluate how effective such attacks are by simulating an attack scenario and examining how cybercriminals might exploit vulnerabilities in how Snapchat and Instagram work. This will give us the opportunity to advise the social media platforms on better protective measures, as well as to design a new social media digital literacy lesson plan for schools with our external partner, the sex education charity Sexplain.
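
As a purely illustrative sketch of how account-level exposure to such attack vectors might be recorded during the analysis, the snippet below represents a hypothetical set of profile settings and lists which contact-based vectors could apply; the field names and vector labels are assumptions made for illustration, not the project's actual threat model.

    # Illustrative only: a hypothetical representation of profile settings
    # and the contact-based attack vectors a given configuration exposes.
    from dataclasses import dataclass

    @dataclass
    class ProfileSettings:
        public_profile: bool            # profile visible to non-contacts
        quick_add_enabled: bool         # account suggested to strangers (Snapchat Quick Add)
        accepts_message_requests: bool  # messages from unknown senders reach the inbox
        location_sharing: bool          # e.g. Snap Map visibility

    def applicable_vectors(settings: ProfileSettings) -> list[str]:
        """Return the attack vectors this configuration leaves open."""
        vectors = []
        if settings.public_profile or settings.quick_add_enabled:
            vectors.append("contact requests from unknown senders")
        if settings.accepts_message_requests:
            vectors.append("unsolicited images via message requests")
        if settings.location_sharing:
            vectors.append("location-based targeting")
        return vectors

    # Example: a fairly open account configuration
    print(applicable_vectors(ProfileSettings(True, True, True, False)))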

Research team
Social Science Principal Investigator
Professor Jessica Ringrose, Professor of Sociology of Gender and Education, Education, Society and Practice, Institute of Education

Non-Social Science Co-Investigator
Dr. Enrico Mariconti, Lecturer, Department of Security and Crime Science, Engineering Sciences, BEAMS

Additional collaborator
Sexplain is an award-winning relationships and sex education charity that delivers staff training and student workshops in over 75 schools across the UK. Sexplain has a track record of innovation and best practice in RSE, winning the Pamela Sheridan Award in 2018. Professor Ringrose has an established working relationship with Sexplain and is currently collaborating with the charity to develop teacher training lesson plans for the new Department for Education Relationships and Sex Education curriculum, which becomes statutory in England in 2020. The digital literacy lesson plan that we develop in response to our project findings will become part of Sexplain's suite of training and lesson plan resources.