AI Helps Scientists Decode Dolphin Communication, Earning $100,000 Prize
A team of scientists studying Florida's dolphins won the $100,000 prize for evidence that dolphin whistles function like words shared across groups. The researchers now plan to use AI and deep learning to analyze this complex communication further.

Dolphin Communication Research Earns $100,000 Coller Dolittle Challenge Prize
A team of scientists studying Florida's dolphin communities has won the inaugural $100,000 Coller Dolittle Challenge prize, which rewards advances in algorithms for interspecies communication. Led by Laela Sayigh of the Woods Hole Oceanographic Institution, the researchers used non-invasive hydrophones to record dolphin whistles, uncovering evidence that these sounds function like words shared across groups.
The team identified specific whistle types: one acts as an alarm signal, while another is used in response to unexpected or unfamiliar events. This research adds weight to the idea that dolphin communication is structured and meaningful.
Advancing Decoding with AI
Capturing the sounds is only the start of the work. The researchers plan to apply AI and deep learning techniques to analyze the whistle patterns further. Jonathan Birch, a London School of Economics professor and a judge for the prize, pointed out a major challenge: the lack of large datasets of animal communication, comparing the situation to the trillion-word corpora used to train models like ChatGPT.
The Sarasota Dolphin Research Program has built an extensive 40-year archive of dolphin whistles, enabling Sayigh's team to use deep learning tools in their analysis. This approach holds promise for eventually cracking the code behind dolphin communication.
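To give a sense of what "deep learning tools" can mean in this context, here is a minimal sketch of one generic approach: classifying whistle spectrograms with a small convolutional network in PyTorch. It is purely illustrative and not the Sarasota team's actual pipeline; the stand-in data, label names, and architecture are assumptions.

```python
# Illustrative sketch only: NOT the research team's actual method.
# Shows a generic way to classify whistle spectrograms with a small CNN.
# Data, labels, and architecture here are hypothetical placeholders.
import torch
import torch.nn as nn

class WhistleCNN(nn.Module):
    """Tiny convolutional classifier over log-mel spectrogram patches."""
    def __init__(self, n_classes: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, n_classes)
        )

    def forward(self, x):
        return self.head(self.features(x))

# Stand-in data: random "spectrograms" shaped (batch, channel, mel bins, frames).
# In a real study these would come from hydrophone recordings, e.g. converted
# to log-mel spectrograms before being fed to the network.
x = torch.randn(8, 1, 64, 128)
y = torch.randint(0, 3, (8,))  # hypothetical labels, e.g. alarm / query / other

model = WhistleCNN(n_classes=3)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for step in range(5):  # a few toy optimization steps on the stand-in data
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
    print(f"step {step}: loss {loss.item():.3f}")
```

The value of a 40-year archive in a setup like this is straightforward: supervised models of this kind only become reliable once many labeled examples of each whistle type are available.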
Global Efforts in Animal Communication
The award ceremony recognized four international teams, including studies on nightingales, marmoset monkeys, and cuttlefish. The Coller Dolittle Challenge is a joint initiative by the Jeremy Coller Foundation and Tel Aviv University. Submissions for next year’s prize will open in August.
AI Is Accelerating Animal Language Research
Research on animal communication isn’t new, but AI is enabling larger datasets and faster analysis. Kate Zacarian, CEO of the Earth Species Project, highlighted how AI offers new capabilities beyond speed, allowing researchers to study communication as dynamic and structured rather than isolated signals.
Zacarian praised Sayigh’s team for their achievement, noting it will raise awareness about non-human communication studies and the role AI can play in this field.
The Earth Species Project recently released NatureLM audio, an open-source large audio language model for animal sound analysis. They are collaborating with biologists to study species such as orcas, carrion crows, and jumping spiders, with findings expected later this year.
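For readers curious what the first step of this kind of analysis looks like in practice, the sketch below shows a generic bioacoustics preprocessing routine: turning two recordings into MFCC embeddings and comparing them with cosine similarity. It does not use NatureLM audio's actual API, and the file names are hypothetical placeholders.

```python
# Illustrative sketch only: this does NOT use NatureLM audio's API.
# It shows a generic preprocessing step for animal sound analysis:
# summarizing clips as MFCC embeddings and comparing them.
import numpy as np
import librosa

def embed(path: str) -> np.ndarray:
    """Load a clip and summarize it as its mean MFCC vector."""
    audio, sr = librosa.load(path, sr=22050)
    mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=20)
    return mfcc.mean(axis=1)

a = embed("orca_call_1.wav")   # hypothetical recording
b = embed("orca_call_2.wav")   # hypothetical recording
cosine = float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
print(f"cosine similarity between calls: {cosine:.3f}")
```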