GCDC organises workshop on the Online Safety Act 2023, bringing together lawyers, NGOs, regulators, and scholars from a range of disciplines.
10 June 2025

On Tuesday, 10 June, the Global Centre for Democratic Constitutionalism (GCDC) organised a workshop on the Online Safety Act 2023 (OSA), bringing together a group of lawyers, NGOs, regulators, and scholars from a range of disciplines at UCL and beyond. In her opening remarks, Erin Delaney (UCL Laws, GCDC) emphasised the workshop’s aim of fostering an interdisciplinary approach that bridges the gap between scholarship, policy, implementation, and practice. She stressed the importance of identifying impactful areas for future research, particularly given the significance of digital regulation for the future of constitutional democracy.
Following Delaney’s opening remarks, Bernard Keenan (UCL Laws, GCDC) gave a brief overview of the OSA. He explained that the Act is intended to make the internet safer for individuals in the UK, with a particular focus on children. Whereas service providers were previously immune from liability unless they had specific knowledge of harmful content, the new regime introduces a more systemic model of regulation. The UK Act forms part of a broader global trend towards increased online regulation.
In the first panel, ‘Regulating Content in a Democracy’, Ricki-Lee Gerbrandt (UCL Digital Speech Lab) discussed the implications of the OSA for media freedom and journalist safety, raising concerns about the insufficient protections for journalists under the Act. Additionally, Eliza Bechtold (Bonavero Institute of Human Rights) highlighted the broader free speech concerns raised by the Act, pointing in particular to provisions such as section 17 on ‘content of democratic importance’ and section 179 on the offence of false communications. Bernard Keenan concluded the panel by reflecting on the philosophical and historical foundations of freedom of speech. He contended that the OSA reflects a contemporary recognition of new kinds of harm – to minds, bodies, and society – enabled by modern communication systems.
In the second panel, ‘Technological Governance in the OSA’, Graham Smith (Bird & Bird) opened with a discussion of ‘safety by design’, which platforms are obliged to ensure under the OSA even though the Act does not define the concept. Jessica Shurson (University of Sussex) then addressed the OSA’s technology notices under section 121, pointing out that they could weaken end-to-end encryption and thus create systemic security risks. Meanwhile, Mark Warner (UCL Computer Science) presented findings from his research on online dating platform safeguards, highlighting the distinctive risks these platforms pose, which stem in part from a business model designed to move users offline.
The third panel, ‘Harm and the Protective Aims of the OSA’, saw Lorna Woods (University of Essex) explore ‘safety by design’ in the context of the OSA. She posited that the concept points to features integrated throughout a platform’s operating system, rather than merely to retrospective content moderation. Kaitlyn Regehr (UCL Digital Humanities) then presented her research on how misogynistic content on TikTok affects young boys, highlighting the effectiveness of a school-based intervention model centred on peer-to-peer learning and mentorship as a way to address the problem. Closing the panel, Jeffrey Howard (UCL Digital Speech Lab) suggested that, to meet their OSA obligations, platforms will likely pursue a ‘bypass strategy’, adopting community standards that prohibit more content than the law requires, thereby posing a threat to freedom of expression. Howard also discussed the likelihood of platforms using user behaviour data to intervene pre-emptively, before harm occurs, and the ethical implications of doing so.
The fourth panel, ‘Regulation, Oversight, and the OSA in Practice’, opened with Jonathan Hall KC (Independent Reviewer of Terrorism Legislation), who emphasised the challenge of understanding and implementing the OSA, warning that its impact will depend heavily on Ofcom’s actions. In light of this, he called for more public-facing communication from Ofcom, including, for example, short guides for parents. Neil Brown (Decoded Legal) focused on the burden the OSA places on small and low-risk services, which differ significantly from big tech platforms like Meta, TikTok, and Google. He argued that the Act’s uncertainty and heavy regulatory demands could have the undesirable effect of squeezing out small platforms, thereby shrinking service plurality.
Also on the fourth panel, John Higham (Ofcom) outlined the steps Ofcom has taken to implement the OSA while acknowledging the complexity and moral weight of these regulatory decisions, not least given the regulator’s public law duty to avoid unduly interfering with fundamental rights such as privacy and freedom of expression. Meanwhile, Jim Killock (Open Rights Group) catalogued the OSA’s shortcomings, including its numerous incentives for rapid takedowns without corresponding incentives to ensure that takedown decisions are accurate. He also criticised the OSA for failing to address the underlying problem of service providers operating business models driven by attention and profit, advocating instead for alternatives such as open moderation systems.
After the substantive panels, participants engaged in a vibrant plenary discussion on the OSA’s scope, implications, and implementation mechanisms. The workshop was followed by the launch of Bernard Keenan’s Interception: State Surveillance from Postal Systems to Global Networks (MIT Press, 2025), which offers a media-genealogical account of how the UK and US governments have surveilled citizens by intercepting their private communications. The book launch was chaired by Michael Veale (UCL Laws, GCDC), with Daniella Lock (King’s College London) and Paul Scott (University of Glasgow) as commentators.