How scientists are using AI to eavesdrop on dolphins and estimate population size

Sasha Dines (left) records dolphin acoustics with SeaSearch colleagues in False Bay. (Photo: SeaSearch)
Marine scientists are using artificial intelligence to create innovative and efficient ways to determine populations of dolphins and improve their understanding of the conservation needs of species under threat.

“Could we use signature whistles as an individual marker to count animals along the coastline?” 

This was the question that Sasha Dines, a marine biologist currently completing her PhD at Sea Search Research & Conservation and Stellenbosch University, set out to answer.

Dines was speaking at the Plett Marine Science Symposium last month, sharing her research into the acoustics of endangered humpback dolphins in South Africa.

Humpback dolphins, also known as Indian Ocean humpback dolphins, inhabit the coastal waters of South Africa – and they can be very difficult for researchers to monitor. Traditional methods, such as photo identification of dorsal fins, are time-consuming and often impractical for a species that lives in shallow, turbulent waters.

A study led by the SouSA Consortium collated 16 years’ worth of all available photo ID data for humpback dolphins in South Africa and found that there are fewer than 500 individuals – a very small number, Dines emphasises, which puts them among the most endangered cetacean species resident in our waters.

Indian Ocean humpback dolphins in De Hoop Marine Protected Area in the Western Cape. (Photo: SeaSearch)



“Understanding how many individuals there are and if that number is going up or down is the major question that we have, and will inform other conservation practices and policy,” said Dines. “But unfortunately, photo ID is just a really inefficient method in such an elusive dolphin species.”

So they needed a different way to collect more data – and this is where bioacoustics came in.

Dines explained that dolphins make three types of sounds: burst pulses, echolocation clicks and whistles. Her research is focused on whistles, specifically dolphins’ signature whistles, which are unique to each individual.

A signature whistle is a learned, individually distinctive whistle type that broadcasts the identity of the caller to its surroundings. It is formed when the dolphin is a calf and stays the same throughout its life.

“You can kind of call it a name. But unlike humans, where we call each other by each other’s names, dolphins will just call out their own name,” said Dines.

“So it would be like me shouting ‘Sasha’ every time I walk into a room. And you’ll respond with your own name.”

For researchers, these whistles can serve as acoustic fingerprints, allowing them to identify and track individual dolphins over time.

With no efficient photo ID option for humpback dolphins, Dines went through more than seven years of boat-based acoustic recordings (in which a hydrophone – a recording device – is dropped over the side of a boat to record the animals calling in the vicinity) and was able to identify about 25 signature whistles along the south coast and in Richards Bay.
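
To give a sense of how such an acoustic catalogue can be used, the sketch below compares the frequency contour of a newly recorded whistle against a small set of catalogued signature whistles using dynamic time warping, a similarity measure that appears in the signature-whistle literature. The contours and individual names are made-up illustrations, not SeaSearch data, and this is not necessarily the matching method used in Dines’s study.

```python
# Minimal illustration of the "acoustic fingerprint" idea: compare a new
# whistle's frequency contour against a small catalogue of known signature
# whistles. All contours below are invented numbers, not real recordings.
import numpy as np

def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Dynamic time warping distance between two frequency contours (Hz)."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m] / (n + m)  # normalise by rough path length

# Hypothetical catalogue: one peak-frequency contour per known individual.
catalogue = {
    "individual_01": np.linspace(6_000, 14_000, 50),                          # rising sweep
    "individual_02": 10_000 + 2_000 * np.sin(np.linspace(0, 3 * np.pi, 60)),  # modulated
}

# Contour extracted from a newly recorded whistle (also invented here).
new_whistle = np.linspace(6_200, 13_800, 45)

best = min(catalogue, key=lambda name: dtw_distance(new_whistle, catalogue[name]))
print("Closest catalogued signature whistle:", best)
```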

But this was seven years of boat-based data, and it wasn’t going to tell Dines a lot about the population in real time.

So in 2022, rather than recording from the boat, Dines attached hydrophones to moorings positioned along the south coast between Mossel Bay and Plettenberg Bay – “hoping that dolphins would pass and hoping even more that they were chatty. And then not just chatty, but that they were calling out their own names,” she said.

These devices can capture the sounds of the ocean continuously for weeks or even months at a time. Dines left them recording for 42 days, capturing more than 5,000 hours of acoustic data – far too much for a single person to process manually. This is where artificial intelligence (AI) comes in.

Dr Giu Frainer, a postdoctoral researcher at SeaSearch, helped to develop machine learning algorithms to identify humpback dolphin whistles in these long-term recordings, allowing Dines to pick out specific signature whistles within the encounters and add to the first comprehensive acoustic catalogue of humpback dolphins in the region.
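
To give a flavour of how software can triage thousands of hours of hydrophone audio, the sketch below runs a simple band-energy screen over a long recording and flags windows that might contain whistles for closer inspection. It is an illustration only – a crude first-pass filter, not the machine learning algorithms Frainer and Dines developed – and the file name, frequency band and threshold are assumptions.

```python
# First-pass screening sketch: flag chunks of a long hydrophone recording whose
# energy in a typical whistle band stands out, so only a fraction of the audio
# needs closer review. Values below are illustrative assumptions.
import numpy as np
import librosa

AUDIO_PATH = "mooring_recording.wav"   # hypothetical hydrophone file
SAMPLE_RATE = 96_000                   # assumed recorder sample rate
WINDOW_S = 5.0                         # analyse the recording in 5-second chunks
BAND_HZ = (4_000, 24_000)              # rough band where whistle energy is expected

y, sr = librosa.load(AUDIO_PATH, sr=SAMPLE_RATE, mono=True)
hop = int(WINDOW_S * sr)

candidates = []
for start in range(0, len(y) - hop, hop):
    chunk = y[start:start + hop]
    # Spectrogram of the chunk: rows are frequency bins, columns are time frames.
    S = np.abs(librosa.stft(chunk, n_fft=2048, hop_length=512))
    freqs = librosa.fft_frequencies(sr=sr, n_fft=2048)
    band = (freqs >= BAND_HZ[0]) & (freqs <= BAND_HZ[1])
    # Compare average energy inside the whistle band with energy elsewhere.
    ratio = S[band].mean() / (S[~band].mean() + 1e-9)
    if ratio > 2.0:                    # threshold chosen for illustration only
        candidates.append(start / sr)  # seconds into the recording

print(f"{len(candidates)} windows flagged for whistle review")
```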

The results were promising. In just 42 days, the moorings captured whistles from 26 individual dolphins at the three sites, demonstrating the efficiency and effectiveness of the method.

One of the challenges Dines faced was that the waters off South Africa are home to multiple dolphin species, including bottlenose dolphins and common dolphins, which also produce whistles. The differences between these whistles are often so subtle that even an experienced researcher can struggle to tell them apart. The AI algorithms, however, can detect minute variations in frequency and pattern, allowing them to accurately classify the species.
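
As an illustration of how subtle contour differences can be turned into something a classifier can work with, the toy example below generates two made-up whistle types, summarises each contour with a handful of features (frequency range, average sweep rate and so on) and trains an off-the-shelf classifier to tell them apart. None of this is real dolphin data, and it is not the model used in the study.

```python
# Toy species-classification sketch: synthetic whistle contours, simple shape
# features, and a generic classifier. Purely illustrative.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

def toy_contour(kind: str) -> np.ndarray:
    """Generate a fake peak-frequency contour (Hz) for a toy whistle type."""
    t = np.linspace(0, 1, 80)
    if kind == "steep_sweep":          # fast upsweep
        base = 6_000 + 12_000 * t
    else:                              # gentler, modulated contour
        base = 9_000 + 3_000 * np.sin(2 * np.pi * t)
    return base + rng.normal(0, 300, size=t.size)

def features(contour: np.ndarray) -> np.ndarray:
    return np.array([
        contour.min(), contour.max(), contour.mean(),
        contour.max() - contour.min(),      # frequency range
        np.abs(np.diff(contour)).mean(),    # average sweep rate
    ])

# Build a small labelled training set from the two toy whistle types.
X = np.array([features(toy_contour(k)) for k in ["steep_sweep"] * 50 + ["modulated"] * 50])
y = np.array(["species_A"] * 50 + ["species_B"] * 50)

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(model.predict([features(toy_contour("steep_sweep"))])[0])  # expect "species_A"
```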

This approach not only reduces the need for direct human interaction with the dolphins, but also provides a more cost-effective and time-efficient method that can inform conservation efforts.

“The really interesting part of this is when we can do repeat deployments and see if that number of individuals is going up or down,” said Dines, referring to using AI as a tool for conservation.

“And that will tell us what kind of state this population is in – if it’s in recovery or if sadly it’s declining, and then at what speed.” DM
