
Google has announced a new AI tool that uses ecoacoustics to help marine biologists better understand coral reefs.
The tool, SurfPerch, developed by Google Research and DeepMind, automatically processes thousands of hours of reef audio to extract bioacoustic data.
Ecoacoustics enables scientists to monitor the health of coral reefs by analysing the diversity and patterns of animal sounds. This method allows for tracking nocturnal activity and surveying reefs in deep or murky waters.
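One common way to quantify the "diversity and patterns" of reef sound is an acoustic index computed from a spectrogram. The sketch below implements a simplified Acoustic Complexity Index (ACI), a standard ecoacoustic metric; it is an illustration of the general technique, not SurfPerch's actual pipeline, and the toy spectrograms are made-up data:

```python
import numpy as np

def acoustic_complexity_index(spectrogram):
    """Simplified Acoustic Complexity Index (ACI).

    For each frequency bin (row), sum the absolute intensity changes
    between consecutive time frames (columns), normalise by the bin's
    total intensity, then sum across bins. Recordings with lots of
    varied biological sound score higher than steady background noise.
    """
    diffs = np.abs(np.diff(spectrogram, axis=1)).sum(axis=1)
    totals = np.maximum(spectrogram.sum(axis=1), 1e-12)  # avoid /0
    return float((diffs / totals).sum())

# Toy example: a constant hum vs. a fluctuating, "lively" soundscape.
rng = np.random.default_rng(0)
steady = np.ones((64, 100))        # constant intensity in every bin
variable = rng.random((64, 100))   # fluctuating intensity
print(acoustic_complexity_index(steady) < acoustic_complexity_index(variable))
# → True: the fluctuating soundscape has higher acoustic complexity
```

In practice, indices like this are computed over short windows of long recordings, which is why automated tools matter at the scale of thousands of hours.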
The project originated from Google’s “Calling in our Corals” crowdsourcing initiative, in which users worldwide listened to more than 400 hours of reef audio and flagged the fish sounds they heard. This cut months off the data analysis and produced a library of labelled fish sounds that was then used to further train the AI tool.

Since incorporating SurfPerch, researchers have discovered differences between protected and unprotected reefs in the Philippines, tracked restoration outcomes in Indonesia, and better understood relationships within the fish community on the Great Barrier Reef.
The tool lets researchers analyse new datasets efficiently without expensive GPU hardware, making it easier to shed light on the mysteries of coral reefs.
Google has announced the addition of new audio to the Calling in Our Corals website to continue the project.