Detecting Dolphins

Dolphins, along with other toothed whales, use echolocation to navigate and hunt. These high-frequency clicks, while perfectly clear to dolphins, are inaudible to human ears.

The sounds can, however, be captured on recordings, but the huge amounts of data generated by underwater marine recording are difficult to process and interpret. Caleb Buchanan, a final-year software engineering student, is using machine learning to develop processes that can detect and analyse dolphin echolocation clicks in these recordings.
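The article does not describe the detection pipeline itself, but a common first step in automated click detection is to band-pass a hydrophone recording to the ultrasonic range where clicks occur and flag short bursts of energy, with a machine learning classifier then separating true clicks from noise. The sketch below illustrates that energy-threshold step in Python; the frequency band, smoothing window, and threshold are assumptions for illustration only, not Caleb's actual method.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def detect_clicks(audio, sample_rate, low_hz=20_000, high_hz=150_000,
                  threshold_factor=5.0):
    """Return sample indices where short broadband pulses stand out
    from the background noise -- candidate echolocation clicks."""
    # Clicks sit well above human hearing, so the recording must be
    # sampled fast enough (hundreds of kHz) to capture them at all.
    assert sample_rate > 2 * low_hz, "sample rate too low for click band"
    nyquist = sample_rate / 2

    # Band-pass filter to the assumed click band, discarding
    # audible-range noise such as waves and vessel traffic.
    high_hz = min(high_hz, 0.95 * nyquist)
    sos = butter(4, [low_hz / nyquist, high_hz / nyquist],
                 btype="bandpass", output="sos")
    filtered = sosfiltfilt(sos, audio)

    # Smooth the instantaneous energy over roughly one millisecond.
    window = max(1, int(0.001 * sample_rate))
    energy = np.convolve(filtered ** 2, np.ones(window) / window,
                         mode="same")

    # Flag samples whose energy rises well above the median background.
    return np.flatnonzero(energy > threshold_factor * np.median(energy))
```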

“In order to ensure that marine wildlife is not adversely impacted by open ocean aquaculture, it is essential that we are able to monitor this wildlife,” Caleb says. “Currently there are no automated systems readily available to ecologists that can process and analyse sound recordings to successfully track dolphins in the ocean, so we are attempting to create them.”

Caleb is working with Professor Mengjie Zhang and Dr Bing Xue from the School of Engineering and Computer Science, along with Simon Childerhouse and Ross Vennell from the Cawthron Institute and Matt Pine from the University of Victoria in Canada.

“I’ve always been interested in machine learning, and I’ve previously worked on detecting bird calls,” Caleb says. “I think it’s awesome that we can use machine learning to detect something like dolphin echolocation clicks that can’t be heard by human ears.

“I’ve learned a lot more about dolphins and other odontocetes, like killer whales, than I ever thought I would doing a software engineering degree,” Caleb says. “It’s been really cool to see all the amazing machine learning applications that are being developed now, and being involved in those developments.”

Caleb says the research has gone smoothly so far.

“Even when there are challenges, seeing our processes succeed and progress makes me forget about any minor inconveniences that have been plaguing me.”

Caleb says his work, if successful, could have many applications in conservation, as well as other fields.

“There are already examples of bird calls detected in this way being used to manage conservation for species like the kiwi,” Caleb says. “If we can make similar processes work for marine life, this will help with monitoring marine populations, which are notoriously hard to track.”

“Today, AI is an increasingly important part of our lives, and Caleb’s research is a good indication of how it can be used to address real-world issues,” says Professor Mengjie Zhang. “It demonstrates how technology can aid in conserving wildlife, while playing a role in helping New Zealand create a blue economy, where we look to create business models that enhance the state of natural ecosystems. Depending on the results we see from Caleb’s research, there will be opportunities to extend the application of AI to other sectors.”

Caleb would eventually like to expand the project to include sounds produced by other marine life. He also plans to go on to a PhD at the University.

If the team can successfully use machine learning to analyse dolphin echolocation data, the same approach could be applied to other inaudible sounds, Caleb says.

“For example, it’s theorised that transformers in power grids emit distinct sounds prior to failure that humans can’t hear,” Caleb says. “If we can detect and process these sounds using machine learning, this would allow for pre-emptive replacement of these transformers, which could prevent power grid failures and power cuts.”
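No specific monitoring system is described here, but the idea can be sketched: build an acoustic fingerprint of a healthy transformer from ultrasonic recordings, then flag recordings that drift from it. The feature choice, function names, and tolerance below are illustrative assumptions, not a known system.

```python
import numpy as np
from scipy.signal import spectrogram

def spectral_fingerprint(audio, sample_rate):
    """Mean power per frequency band -- a crude acoustic signature."""
    _, _, power = spectrogram(audio, fs=sample_rate, nperseg=1024)
    return power.mean(axis=1)

def is_anomalous(recording, healthy_fingerprints, sample_rate,
                 tolerance=3.0):
    """Flag a recording that deviates strongly, in any frequency band,
    from fingerprints gathered while the transformer was healthy.

    healthy_fingerprints is a 2-D array, one fingerprint per row.
    """
    fingerprint = spectral_fingerprint(recording, sample_rate)
    baseline = healthy_fingerprints.mean(axis=0)
    spread = healthy_fingerprints.std(axis=0) + 1e-12  # avoid divide-by-zero
    return bool(np.any(np.abs(fingerprint - baseline) / spread > tolerance))
```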