AI detects a hidden second type of lion roar
Date: November 22, 2025 | Source: University of Exeter
Scientists have identified a second kind of lion roar: an "intermediary" roar that sits alongside the classic full-throated version. Using AI to classify acoustic patterns, the system sorted roar types with 95.4% accuracy and consistently identified individual lions without relying on subjective human judgment.
Why this matters for conservation
African lions are listed as vulnerable by the IUCN, with only an estimated 20,000-25,000 left in the wild. Populations have fallen by roughly half over the past 25 years. Smarter, scalable monitoring is urgently needed.
The newly recognized intermediary roar expands the acoustic signature available for tracking. Paired with passive acoustic monitoring, it can improve how teams estimate population sizes, map territories, and detect presence across large, hard-to-survey areas.
What the study found
The research, published in Ecology and Evolution, shows that a lion's roar sequence includes both the traditional full-throated roar and a distinct intermediary type. AI models automatically sorted recordings into these categories and then distinguished individual lions with high precision.
This approach reduces bias from expert interpretation and creates a repeatable pipeline for long-term monitoring. It also scales more readily than surveys that rely on spoor counts or camera traps alone.
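The article does not detail the features or models behind those results, but the two-stage idea, first classify the roar type, then match the caller, can be sketched roughly as follows. The MFCC features, random-forest classifiers, and labels are illustrative assumptions, not the study's actual pipeline.

```python
# Minimal sketch of a two-stage roar pipeline (assumed approach, not the study's method).
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier

def roar_features(wav_path, sr=16000, n_mfcc=20):
    """Summarise a roar clip as the mean and std of its MFCCs."""
    y, sr = librosa.load(wav_path, sr=sr)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

# Stage 1: full-throated vs. intermediary roar (labels are hypothetical).
type_clf = RandomForestClassifier(n_estimators=300, random_state=0)
# Stage 2: individual identity, trained on clips from lions of known identity.
id_clf = RandomForestClassifier(n_estimators=300, random_state=0)

def fit(clips, roar_types, lion_ids):
    X = np.vstack([roar_features(c) for c in clips])
    type_clf.fit(X, roar_types)
    id_clf.fit(X, lion_ids)

def predict(clip):
    x = roar_features(clip).reshape(1, -1)
    return type_clf.predict(x)[0], id_clf.predict(x)[0]
```

In practice, the identity model would be validated on held-out recordings of known individuals before being used to re-identify lions in new field audio.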
How researchers can apply this now
- Integrate passive acoustic recorders to continuously capture roar sequences across core and edge habitats.
- Build labeled libraries of individual signatures to support re-identification over months and seasons.
- Combine acoustic detections with camera trapping and spoor data to cross-validate presence and movement.
- Use intermediary vs. full-throated roar ratios as an additional variable in occupancy and demographic models (see the sketch after this list).
- Standardize annotation protocols to keep datasets comparable across projects and time.
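As an example of the last two points, the detection-to-covariate step might look like the sketch below: count each roar type per site and derive a ratio that can be joined to an occupancy covariate table. The table layout and column names are hypothetical.

```python
# Sketch: per-site intermediary vs. full-throated roar ratio as a model covariate.
# The detection table (site, roar_type) is a hypothetical example, not study data.
import pandas as pd

detections = pd.DataFrame({
    "site": ["A", "A", "A", "B", "B"],
    "roar_type": ["full", "intermediary", "full", "intermediary", "intermediary"],
})

# Count each roar type per site, then compute the intermediary share.
counts = detections.pivot_table(index="site", columns="roar_type",
                                aggfunc="size", fill_value=0)
counts["intermediary_ratio"] = counts["intermediary"] / (
    counts["intermediary"] + counts["full"]
)
print(counts)  # one row per site, ready to join onto an occupancy covariate table
```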
Accuracy and scalability
A 95.4% classification accuracy moves lion bioacoustics from promising to practical. Automated pipelines can triage thousands of hours of audio, flag likely individuals, and surface changes in activity patterns that warrant field follow-up.
For teams working across wide landscapes, this means fewer missed detections and more consistent datasets to inform protection, patrols, and community engagement.
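The triage step can be pictured as a simple batch loop: score fixed-length windows from each recording and keep only high-confidence hits for human review. The directory layout, 10-second window, 0.9 threshold, and placeholder scorer below are assumptions for illustration, not the study's pipeline.

```python
# Sketch: triage a folder of recordings, keeping only high-confidence roar windows.
from pathlib import Path
import librosa

def score_window(y, sr):
    """Placeholder scorer; in practice this would call the trained roar classifier."""
    rms = librosa.feature.rms(y=y).mean()
    return float(min(1.0, rms * 10))  # stand-in confidence in [0, 1]

def triage(audio_dir, window_s=10, threshold=0.9):
    hits = []
    for wav in sorted(Path(audio_dir).glob("*.wav")):
        y, sr = librosa.load(wav, sr=16000)
        step = window_s * sr
        for start in range(0, max(len(y) - step, 1), step):
            conf = score_window(y[start:start + step], sr)
            if conf >= threshold:
                hits.append((wav.name, start / sr, conf))  # file, offset (s), confidence
    return hits  # shortlist for human review and field follow-up
```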
Collaboration behind the work
The project was led by the University of Exeter in partnership with the Wildlife Conservation Research Unit (WildCRU) at the University of Oxford, Lion Landscapes, Frankfurt Zoological Society, TAWIRI (Tanzania Wildlife Research Institute), and TANAPA (Tanzania National Parks). Computer scientists from Exeter and Oxford contributed to the modeling. Funding came from the Lion Recovery Fund, WWF Germany, the Darwin Initiative, and the UKRI Centre for Doctoral Training in Environmental Intelligence.