UC San Diego researcher uses dog soundboards and citizen science to study animal minds and AI meaning

Dogs trained on soundboard buttons appear to combine words purposefully, not randomly. The findings also expose a core gap in AI: like LLMs, dogs learn patterns, but neither necessarily grasps meaning.

Categorized in: AI News, Science and Research
Published on: Mar 18, 2026

What Dogs' Word Buttons Reveal About AI and Animal Minds

Federico Rossano's research on dogs using soundboard buttons to communicate has drawn millions of viewers to a recent NOVA documentary. But the work extends far beyond viral videos. His findings suggest that button-trained dogs learn a shared communication system with humans, responding to recorded words as consistent cues and, in some cases, combining buttons in ways that appear purposeful rather than random.

Rossano, an associate professor of cognitive science at UC San Diego and director of the Comparative Cognition Lab, paired controlled experiments with one of the largest citizen-science datasets in the history of animal communication research, gathered from thousands of dogs in real homes worldwide. The research has shifted how scientists approach studying animal minds.

Why Labs Miss What Homes Reveal

Traditional animal research isolates subjects in controlled lab settings, often creating stress that skews results. Rossano's approach moves beyond that model. His team travels to homes, tests dogs with their owners present, and compares performance across different contexts.

"Many scientists refrain from citizen science because the claim is that we cannot trust what people do at home," Rossano said. "But I have found that engaging with the community leads to exciting insights that would be completely lost if we refrain from engaging with the people who actually live with pets every day."

The shift reflects a broader change in how science operates. Modern tools can analyze messy, large-scale data in ways that weren't possible decades ago. Real-world observations now complement controlled experiments rather than replace them.

What Dogs' Buttons Tell Us About AI Understanding

The parallel between how dogs learn button patterns and how large language models work is direct. Both start by learning statistical associations: a dog learns that pressing "treat" produces a reward; an LLM learns which words typically follow others based on training data.

But there's a critical gap. "Current LLMs are learning statistical regularities," Rossano said. "The issue is that we do not quite know to what degree AI understands 'meaning' because they do not have 'world knowledge.'"
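The shared first step, learning statistical regularities, can be illustrated with a toy bigram model. This is a deliberately simplified sketch (the corpus, names, and model here are invented for illustration, not taken from Rossano's work or any real LLM): it learns which word most often follows another, and nothing more.

```python
from collections import Counter, defaultdict

# Invented toy corpus standing in for training data:
# short sequences of words or button presses.
corpus = [
    ["want", "treat"],
    ["want", "treat"],
    ["want", "outside"],
    ["play", "ball"],
]

# Count how often each word follows another (bigram counts).
follows = defaultdict(Counter)
for seq in corpus:
    for prev, nxt in zip(seq, seq[1:]):
        follows[prev][nxt] += 1

def predict_next(word):
    """Return the statistically most likely next word, or None if unseen."""
    if word not in follows:
        return None
    return follows[word].most_common(1)[0][0]

print(predict_next("want"))  # "treat" follows "want" most often in the corpus
```

The model correctly predicts "treat" after "want" purely from co-occurrence counts, yet it has no representation of what a treat is. That is the gap Rossano points to: pattern prediction without world knowledge.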

A two-year-old asking for a credit card after watching a parent buy toys demonstrates the problem. The child produces the correct word in context but lacks understanding of what a credit card actually is. AI systems face the same challenge at scale.

For researchers studying animal communication with AI tools, the stakes are high. AI excels at identifying patterns: which whale songs exist, how often they occur, what follows them. But without knowing what those signals actually mean, researchers can only predict sequences, not interpret content.

Mapping Animal Cognition Beyond IQ

Understanding what animals know matters for welfare and conservation. Knowing whether a species notices when a member goes missing, how long they remember, and whether they can communicate pain or emotional states changes how humans should treat them.

Some animals also detect environmental changes humans miss. Dogs' sense of smell is unmatched. Certain species show early signs of earthquakes or wildfires. If researchers can decode what animals perceive and communicate, humans could benefit from that knowledge.

"After all, we have built airplanes because humans could not stop being mesmerized by birds flying," Rossano said.

The Next Frontier: Primate Behavior Models and Wild Animal Tracking

Rossano's lab is developing "Primate-GPT," a foundation model using computer vision and multimodal AI to track primate behavior and communication in context. Understanding who does what, with whom, and when reveals patterns that improve animal communication research.

For conservation, the work addresses a practical problem: monitoring wild animals without invasive methods like darting. Tracking relationships between individuals matters. A primate is more likely to follow family or close associates than strangers, just as a human would.

"If we understand who they are to each other, it is an important predictor of who they will follow and what they will do next," Rossano said. "Tracking relationships and behavior over time is a new frontier for AI for conservation."

Current Research Questions

Rossano's team is investigating whether dogs and cats can reliably communicate about pain, emotional states, and needs for help. They're also testing whether animals can combine buttons to express concepts without direct word equivalents, a skill called "productivity" that humans demonstrate constantly.

If dogs can expand their communicative repertoire to match changing environments, it would suggest they experience the world in ways more similar to human thinking than previously understood. That finding would also bear on open questions about animal sentience and consciousness.

For professionals working in AI for Science & Research or studying how generative AI and LLM systems process meaning, the implications are direct: animal communication research offers a testable framework for understanding what AI systems actually know versus what they merely predict.
