Researchers track mouse facial expressions to measure anesthesia depth without brain electrodes
Scientists at Cold Spring Harbor Laboratory have developed a camera system that measures subtle changes in mouse facial muscles with enough precision to match the accuracy of invasive EEG recordings. The system, called Cheese3D, uses machine learning to convert facial movements into quantifiable data about brain states.
The breakthrough matters because current methods for monitoring consciousness during anesthesia require electrodes attached to or implanted in the brain, a procedure that stresses animals and can distort their natural behavior. Cheese3D eliminates that constraint.
How the system works
Six synchronized high-speed cameras film a mouse's face from multiple angles. This multi-angle approach solves a basic anatomical problem: mouse faces are cone-shaped, making it difficult for a single camera to capture all relevant movements.
Machine learning models then process the footage like an expert film editor, stitching 2D video into a 3D dataset. The result is precise measurement of muscle tone and movement across the entire face, including the ears, eyes, whisker pad, and jaw, at sub-millimeter resolution.
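The core geometric step, combining synchronized 2D views into 3D positions, is standard multi-view triangulation. As an illustrative sketch only (the function name and camera setup below are hypothetical, not taken from the Cheese3D paper), a facial keypoint tracked in several calibrated cameras can be recovered in 3D by least squares:

```python
import numpy as np

def triangulate(proj_mats, points_2d):
    """Least-squares triangulation (direct linear transform) of one
    facial keypoint seen by several calibrated cameras.

    proj_mats -- list of 3x4 camera projection matrices
    points_2d -- list of (x, y) pixel coordinates, one per camera
    Returns the estimated 3D point (x, y, z).
    """
    rows = []
    for P, (x, y) in zip(proj_mats, points_2d):
        # Each camera view contributes two linear constraints
        # on the homogeneous 3D point X: x*(P row 3) - (P row 1) = 0, etc.
        rows.append(x * P[2] - P[0])
        rows.append(y * P[2] - P[1])
    A = np.asarray(rows)
    # Solve A @ X = 0 via SVD; the solution is the last right singular vector.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]  # convert from homogeneous coordinates
```

With six cameras, the extra views overdetermine the system, which is what lets the reconstruction reach sub-millimeter precision despite the cone-shaped face occluding any single viewpoint.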
The anesthesia test
In their main demonstration, researchers used Cheese3D to monitor mice under anesthesia. By tracking changes in facial muscle tone alone, they could predict how deeply sedated each animal was at any given moment. The accuracy matched gold-standard EEG methods.
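The paper's actual prediction model is not described here. Purely as an illustration of the idea, mapping per-frame facial-feature vectors (such as regional muscle-tone measurements) to a sedation stage could be done with something as simple as a nearest-centroid rule; all names and the classifier choice below are assumptions for the sketch:

```python
import numpy as np

def fit_centroids(features, labels):
    """Compute the mean feature vector for each anesthesia stage.

    features -- (n_frames, n_features) array of facial measurements
    labels   -- list of stage labels, one per frame
    """
    labels = np.asarray(labels)
    return {s: features[labels == s].mean(axis=0) for s in sorted(set(labels))}

def predict_stage(centroids, frame_features):
    """Assign a frame to the stage whose centroid is nearest."""
    stages = list(centroids)
    dists = [np.linalg.norm(frame_features - centroids[s]) for s in stages]
    return stages[int(np.argmin(dists))]
```

In practice a model validated frame-by-frame against simultaneous EEG, as the researchers did, would be needed to claim gold-standard accuracy.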
The non-invasive approach means researchers can monitor brain states from across the room without attaching anything to the animal, allowing behavior to proceed naturally.
Potential applications beyond anesthesia
Assistant Professor Helen Hou, who leads the work, notes that facial movement emerges early in development. Humans smile before they crawl or walk. This timeline makes facial expressions a window into how the brain learns social communication.
That connection has implications for understanding autism and designing behavioral therapies. Researchers could use Cheese3D to track how facial movement develops normally and how it diverges in developmental conditions.
The team also plans to study facial expressions during disease states, building a more complete picture of how brain function maps onto observable behavior.
The path to human application
The ultimate goal is to develop similar tools for humans. If researchers can establish precise links between specific facial muscles and specific brain circuits in mice, they could create non-invasive diagnostic tools for humans, potentially offering safer ways to monitor patients during surgery or to detect behavioral disorders.
The work appears in Nature Neuroscience.