
UCL Department of Science, Technology, Engineering and Public Policy


Lilly Neubauer - PhD candidate

Can you briefly describe what your research project is about?

My research looks at the digital traces of domestic abuse, specifically of coercive control. People outside the domestic abuse field haven't always heard of coercive control but, broadly speaking, it's an offence that criminalises patterns of abusive, threatening and controlling behaviour which often appear in abusive relationships, but which may not in the past have constituted offences in themselves. Examples include constantly checking on someone's whereabouts, repeatedly putting them down, isolating them from friends or family, and making threats of physical violence or humiliation, among many others. These kinds of manipulation and psychological abuse often underpin other abusive behaviours, and can themselves have a very damaging impact on victims.

Coercive control is an offence, but unfortunately it is quite rarely prosecuted, because it's difficult to evidence and because many people, even within the criminal justice system, don't have the tools to understand it. A lot of evidence for coercive control can be found on people's digital devices. I'm trying to understand what evidence is available and how we can use computational tools to analyse it. Specifically, I'm hoping to use natural language processing to interpret text-based evidence, although whether I can do this will depend very much on getting the right kind of data!

How is it different from other research projects on this topic?

Together with the rest of the tech abuse team at STEaPP, we are working to understand the prevalence and impact of tech abuse (essentially, the use of technology in intimate partner violence), which is something not many people are researching. Tech abuse has long been a relatively neglected part of the domestic abuse landscape, although luckily, and perhaps prompted by the COVID-19 pandemic, this seems to be changing. From a more individual perspective, there hasn't yet been much research applying natural language processing to domestic abuse in general, because most abusive behaviour takes place "behind closed doors", so the data needed to train NLP models is largely inaccessible. I hope to use some creative methods, as well as UCL's great connections, to make progress on this problem.

What do you find exciting about this project?

This research interests me because it combines knowledge from many different disciplines, and it also has (hopefully!) big potential for real-world impact. To look at coercive control, we have to consider lessons from psychology and linguistics (to understand the mechanisms perpetrators use to exert control) as well as law, criminology and evidence (to understand how these behaviours are translated into and interpreted as offences). Then, of course, when working with data and machine learning, we need techniques and principles from computer science and mathematics. And finally, we have to understand the policy and ethical implications of what we find, which includes discussions about gender, about tech abuse from a sociological perspective, and about the real-world services which engage with perpetrators and victims of abuse (such as domestic abuse advocacy groups, the police and the courts). It's a problem that sits at a fascinating intersection of many different bodies of knowledge.

Furthermore, domestic abuse is a very widespread issue and one that has a huge impact on victims' lives. The work that the tech abuse team here at STEaPP is doing is so important because it has the potential to make things genuinely easier for people who are experiencing, leaving or recovering from abusive relationships.

What are you working on now to prepare for the next stage of the project?

Currently, I am focusing on refining my research question, which mostly centres on getting my hands on data! As with all data science projects, the research outcomes will only be as good as the data I can get hold of... and in this particular area, data is going to be very difficult to come by! There will be lots of tricky ethical and practical questions to grapple with when trying to obtain and work with victim data. I'm looking to engage with stakeholders who have access to data (such as advocacy organisations and other academic institutions), but I'm also exploring public social media data as a contingency.