UCL Division of Biosciences


Analysis: A new AI tool to help monitor coral reef health

6 June 2024

PhD candidate Ben Williams (UCL Centre for Biodiversity and Environment Research and ZSL's Institute of Zoology) writes with a colleague about why they built SurfPerch, an AI tool designed to make it faster and easier for marine scientists to answer ecological questions.


Coral reefs cover only 0.1% of the ocean's surface — yet they host 25% of all known marine species. It is critical that we greatly scale up our efforts to monitor, manage, protect and restore reefs around the world, which are in crisis as a result of threats such as overfishing, disease, coastal construction and heatwaves.

Emerging research demonstrates how ecoacoustics — the natural sounds that characterize an ecosystem — can help us better understand reef health. Over the past year we asked people from around the world to participate in "Calling in Our Corals" — a project created in collaboration with Google Arts & Culture that invites the public to listen to reef audio recordings and so build a bioacoustic data library on the health of reefs. Today we're taking a further step and introducing a new AI-powered tool called SurfPerch, created with Google Research and DeepMind, that can automatically process thousands of hours of audio to build new understanding of coral reef ecosystems.

Why listening to coral reefs matters

By listening to the diversity and patterns of behavior of animals on reefs, we can hear reef health from the inside, track activity at night, and even survey reefs in deep and murky waters. Yet analyzing the countless hours of underwater sound has been a manual process, one that scientists cannot keep up with. This is why we're excited about the work we have done with Calling in Our Corals, bringing together marine biologists, creatives, programmers and citizen scientists to monitor health, assess biodiversity, identify new behaviors and measure restoration success.

From a listening collective to a trained AI model

Last year, visitors to Calling in Our Corals listened to over 400 hours of audio from coral reef sites around the world. Members of this open listening collective clicked whenever they heard a fish sound, bringing thousands of eyes and ears to data that would have taken bioacousticians months to analyze. The results provided a wealth of fascinating new fish sounds that we've been using to fine-tune SurfPerch. SurfPerch was trained and rigorously tested to produce a model that can quickly be adapted to detect any new reef sound using just a handful of examples. This allows us to analyze new datasets far more efficiently than previously possible, removing the need for training on expensive GPU processors and opening new opportunities to understand reef communities and their conservation.
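The workflow described above — reusing a pre-trained model's fixed audio embeddings and fitting only a small, cheap classifier on a handful of labelled clips — can be sketched roughly as follows. This is an illustrative sketch, not SurfPerch's actual interface: the embeddings here are synthetic stand-ins, and the scikit-learn classifier is one common choice for this kind of lightweight "linear probe".

```python
# Sketch of few-shot sound detection on top of fixed audio embeddings.
# A pre-trained model (such as SurfPerch) would convert each audio clip
# into a fixed-length embedding vector; here we fake those embeddings
# with synthetic data so the example is self-contained and runnable.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
EMBED_DIM = 8  # real audio embeddings are much larger

# A "handful of examples": 5 clips containing the target fish sound
# and 5 background clips, represented by two well-separated clusters.
fish_embeddings = rng.normal(loc=1.0, scale=0.1, size=(5, EMBED_DIM))
background_embeddings = rng.normal(loc=-1.0, scale=0.1, size=(5, EMBED_DIM))
X = np.vstack([fish_embeddings, background_embeddings])
y = np.array([1] * 5 + [0] * 5)  # 1 = fish sound present, 0 = absent

# Fitting a linear classifier on a few embeddings takes milliseconds
# on a laptop CPU -- no GPU training required.
detector = LogisticRegression().fit(X, y)

# Score the embedding of a new, unlabelled clip.
new_clip = rng.normal(loc=1.0, scale=0.1, size=(1, EMBED_DIM))
probability = detector.predict_proba(new_clip)[0, 1]
print(f"P(fish sound) = {probability:.2f}")
```

In practice the synthetic vectors would be replaced by embeddings computed once per clip by the pre-trained model, so a whole dataset can be scanned for a new sound after labelling only a few positive examples.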

An exciting discovery we made along the way was that we could significantly boost the model's performance by drawing on the vast diversity of available bird recordings. Although birdsong and fish sounds are very different, the two share enough common patterns for the model to learn from one and improve its performance on the other.

From a lab experiment to real-world insights

Our first trial combining Calling in Our Corals with SurfPerch has already revealed differences between protected and unprotected reefs in the Philippines, restoration outcomes in Indonesia, and relationships within the fish community on the Great Barrier Reef.

The best part — you can still help by listening to brand-new audio on Calling in Our Corals to further train the model.

To learn more about our work supporting reef restoration visit https://www.buildingcoral.com.


Ben Williams

Ben Williams co-wrote this article with Professor Steve Simpson (University of Bristol), and it was originally published by Google.



  • Top: Ben Williams installing a hydrophone in a coral reef. Credit: Tim Lamont, Lancaster University
  • Bottom: Ben Williams conducting analysis of the ReefSet data.