New camera system reads mouse facial expressions to measure brain states without invasive electrodes

Cold Spring Harbor scientists built a 6-camera system that tracks mouse facial muscles to measure anesthesia depth as accurately as brain electrodes. The non-invasive tool could eventually help monitor human patients during surgery.

Categorized in: AI News, Science and Research
Published on: Apr 28, 2026

Scientists at Cold Spring Harbor Laboratory have developed a camera system that measures subtle changes in mouse facial muscles with enough precision to match the accuracy of invasive EEG recordings. The system, called Cheese3D, uses machine learning to convert facial movements into quantifiable data about brain states.

The breakthrough matters because current methods for monitoring consciousness during anesthesia require electrodes attached to or implanted in the brain, a procedure that stresses animals and can distort their natural behavior. Cheese3D eliminates that constraint.

How the system works

Six synchronized high-speed cameras film a mouse's face from multiple angles. This multi-angle approach solves a basic anatomical problem: mouse faces are cone-shaped, making it difficult for a single camera to capture all relevant movements.

Machine learning models then process the footage like an expert film editor, stitching 2D video into a 3D dataset. The result is precise measurement of muscle tone and movement across the entire face, including the ears, eyes, whisker pad, and jaw, at sub-millimeter resolution.
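The core geometric step behind stitching multi-view 2D video into 3D is triangulation: each calibrated camera that sees the same facial landmark contributes linear constraints on its 3D position. The sketch below shows the standard direct linear transform (DLT) approach under that assumption; the function name and setup are illustrative, not Cheese3D's actual implementation.

```python
import numpy as np

def triangulate_point(proj_mats, pixels):
    """Estimate one 3D landmark from its 2D detections in several
    calibrated cameras via the direct linear transform (DLT).

    proj_mats: list of 3x4 camera projection matrices
    pixels:    list of (u, v) detections of the same landmark
    """
    rows = []
    for P, (u, v) in zip(proj_mats, pixels):
        # Each view contributes two linear constraints on the
        # homogeneous 3D point X: u*(P[2]·X) = P[0]·X, and likewise v.
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    A = np.asarray(rows)
    # Homogeneous least squares: the right singular vector with the
    # smallest singular value minimizes ||A X||.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]  # de-homogenize to (x, y, z)
```

With six cameras, landmarks hidden from one view (a problem the article notes for cone-shaped mouse faces) are still constrained by the others, and the extra views average out per-camera detection noise.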

The anesthesia test

In their main demonstration, researchers used Cheese3D to monitor mice under anesthesia. By tracking changes in facial muscle tone alone, they could predict how deeply sedated each animal was at any given moment. The accuracy matched gold-standard EEG methods.
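The decoding idea, stripped to its simplest form, is a regression from per-frame facial features to a depth-of-anesthesia score. The sketch below is a hypothetical stand-in on synthetic data, not the model reported in the paper, which the article does not detail; it only illustrates how muscle-tone features could predict a continuous sedation readout.

```python
import numpy as np

# Hypothetical sketch: map per-frame facial muscle features (e.g. ear,
# eye, whisker-pad, and jaw displacements) to an anesthesia-depth score
# with ordinary least squares, using synthetic data.
rng = np.random.default_rng(0)

n_frames, n_features = 500, 8
true_w = rng.normal(size=n_features)          # unknown feature-to-depth weights
X = rng.normal(size=(n_frames, n_features))   # facial features per frame
depth = X @ true_w + 0.1 * rng.normal(size=n_frames)  # noisy depth labels

# Fit a linear decoder: depth ≈ X @ w
w, *_ = np.linalg.lstsq(X, depth, rcond=None)

pred = X @ w
r = np.corrcoef(pred, depth)[0, 1]  # decoding accuracy as correlation
```

In a real experiment, `depth` would come from a reference signal such as EEG during training; once fitted, the decoder needs only the camera-derived features.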

The non-invasive approach means researchers can monitor brain states from across the room without attaching anything to the animal, allowing behavior to proceed naturally.

Potential applications beyond anesthesia

Assistant Professor Helen Hou, who leads the work, notes that facial movement emerges early in development. Humans smile before they crawl or walk. This timeline makes facial expressions a window into how the brain learns social communication.

That connection has implications for understanding autism and designing behavioral therapies. Researchers could use Cheese3D to track how facial movement develops normally and how it diverges in developmental conditions.

The team also plans to study facial expressions during disease states, building a more complete picture of how brain function maps onto observable behavior.

The path to human application

The ultimate goal is to develop similar tools for humans. If researchers can establish precise links between specific facial muscles and specific brain circuits in mice, they could create non-invasive diagnostic tools for humans: potentially safer ways to monitor patients during surgery or to detect behavioral disorders.

The work appears in Nature Neuroscience.

