Scientists Read Mice’s Minds Through Facial Twitches

The next time you are trying to hide what you are thinking, you might want to keep your face very, very still. Scientists at the Champalimaud Foundation in Portugal have demonstrated something unsettling: they can decode what mice are “thinking” just by watching subtle movements in their faces. Not emotions – actual cognitive strategies, the silent calculations happening behind those tiny whiskers.

The research, published in Nature Neuroscience, shows that machine learning algorithms can extract hidden decision-making processes from video recordings alone, with accuracy matching what dozens of surgically implanted brain electrodes would reveal. The implications reach far beyond understanding rodent psychology.

“To our surprise, we found that we can get as much information about what the mouse was ‘thinking’ as we could from recording the activity of dozens of neurons,” said Zachary Mainen, a principal investigator at the Champalimaud Foundation.

The experiment started simply enough. Researchers challenged mice with a foraging puzzle involving two water spouts, only one of which dispensed sugary rewards at any given time. The catch: which spout worked switched unpredictably, forcing mice to develop strategies for deciding when to abandon one spout and try the other.

Multiple Minds in One Brain

What happened next surprised even the researchers. They discovered that mouse brains simultaneously compute multiple problem-solving strategies, even though the animal only uses one at a time. Think of it as your brain constantly running background calculations for decisions you are not even making. The team could detect all these parallel strategies in the mice’s neural activity. Then they wondered: could the face reveal these hidden thoughts too?

Using high-speed cameras capturing 60 frames per second, the team recorded subtle facial movements – twitches around the nose, shifts near the cheeks, micro-movements of the mouth – while simultaneously monitoring brain activity through implanted electrodes. They fed this data into machine learning algorithms, which learned to match specific facial patterns with specific cognitive strategies.
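As an illustration of the general idea (not the authors' actual pipeline), matching facial-motion patterns to strategy labels can be sketched as a simple nearest-centroid decoder. The feature names, numbers, and strategy labels below are invented for demonstration:

```python
import numpy as np

# Hypothetical setup: each trial yields a vector of facial-motion energies
# (nose, cheek, mouth regions) plus a strategy label identified from brain
# recordings. A nearest-centroid decoder assigns a new facial pattern to the
# strategy whose average pattern it most resembles.

rng = np.random.default_rng(0)

# Simulated training data: 2 strategies x 50 trials x 3 facial features
centroids_true = np.array([[0.8, 0.2, 0.5],   # strategy 0: "stay at spout"
                           [0.3, 0.9, 0.4]])  # strategy 1: "switch spouts"
X = np.vstack([centroids_true[s] + 0.05 * rng.standard_normal((50, 3))
               for s in range(2)])
y = np.repeat([0, 1], 50)

# "Training" here is just averaging the facial pattern per strategy
centroids = np.array([X[y == s].mean(axis=0) for s in range(2)])

def decode(face_features):
    """Return the strategy whose mean facial pattern is nearest."""
    d = np.linalg.norm(centroids - np.asarray(face_features), axis=1)
    return int(np.argmin(d))

print(decode([0.78, 0.22, 0.50]))  # → 0 (stay-like pattern)
print(decode([0.30, 0.88, 0.42]))  # → 1 (switch-like pattern)
```

The real study used far richer video features and more sophisticated machine learning, but the core logic is the same: consistent facial patterns carry enough signal to recover the hidden strategy label.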

The results were striking. Facial movements predicted not only which strategy a mouse was actively using but also the alternative strategies its brain was computing in the background. These weren’t emotional expressions but reflections of pure cognition: mathematical calculations about risk, reward timing, and when to switch tactics.

A Mirror for the Mind

First author Fanny Cazettes, now at the French National Centre for Scientific Research, noted something particularly intriguing about these facial patterns. Similar expressions appeared across different mice when they used the same cognitive strategy, suggesting these expressions might be as stereotyped as emotional facial movements. A furrowed nose means one type of calculation; a particular cheek twitch signals another.

The researchers went further, using optogenetics to temporarily silence a brain region called the secondary motor cortex while filming the mice. When this area went quiet, the facial expressions of cognitive strategies became delayed and less accurate. This suggests the brain is actively broadcasting its calculations through the face, not just coincidentally twitching muscles.

The timing proved crucial. Neural activity in the secondary motor cortex reflected decision variables about 50 milliseconds before those same variables showed up in facial movements. By contrast, other brain regions like the orbitofrontal cortex showed these patterns only after they appeared on the face, suggesting they were responding to feedback from the facial movements themselves.
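A lead of roughly 50 milliseconds corresponds to about three frames at the camera's 60 frames per second. A lag like this can be estimated by cross-correlating the two signals; the sketch below uses simulated traces (not the study's data) to show the method:

```python
import numpy as np

# Hypothetical sketch: estimate how far a "neural" trace leads a "facial"
# trace by finding the lag that maximizes their correlation. Here the
# neural signal leads by 3 samples, i.e. 3 frames / 60 fps = 50 ms.

rng = np.random.default_rng(1)
fps = 60
lead_frames = 3

neural = rng.standard_normal(500)
facial = np.roll(neural, lead_frames)  # the face echoes the signal later
facial[:lead_frames] = 0               # discard the wrapped-around samples

# Correlate at candidate lags; the peak marks the estimated lead.
lags = list(range(-10, 11))
corrs = [np.corrcoef(neural[10:-10],
                     facial[10 + k:len(facial) - 10 + k])[0, 1]
         for k in lags]
best_lag = lags[int(np.argmax(corrs))]

print(best_lag * 1000 / fps)  # estimated lead in milliseconds → 50.0
```

The sign of the best lag is what distinguishes a region that drives the face (neural activity first) from one that responds to it (facial movement first), which is how the secondary motor cortex and orbitofrontal cortex could be told apart.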

“Similar facial patterns represented the same strategies across different mice. This suggests that the reflection of specific patterns of thought at the level of facial movement might be stereotyped, much like emotions,” explained co-author Davide Reato.

The study analyzed 58 behavioral sessions across 17 mice, combining traditional neuroscience techniques with computer vision. The team recorded from the secondary motor cortex (averaging 67 neurons per session), orbitofrontal cortex (58 neurons), and olfactory cortex (28 neurons), comparing how well each brain region predicted the hidden cognitive states versus how well simple video analysis performed.

Think about the ubiquity of cameras in modern life. Smartphones, laptops, security systems, even doorbells could potentially capture not just what you do but the computational processes your brain runs without your awareness. The researchers themselves flag this concern. Alfonso Renart, another principal investigator on the study, acknowledged the tension: “Our study shows that videos are not just records of behavior – they can also provide a detailed window into brain activity.”

From a research perspective, the technique offers a non-invasive way to study brain function, potentially revolutionizing how scientists understand neurological and psychiatric conditions. No surgery, no implants – just video and algorithms. But the same technology that could diagnose disease might also read thoughts you never intended to share.

The mice cannot consent to having their cognitive processes decoded. Humans, increasingly, might not realize their mental privacy is being compromised either. The researchers argue their findings highlight the need for regulations protecting mental privacy, though what that would look like remains unclear. Should there be limits on how sophisticated video analysis can become? Who owns the cognitive information extracted from your facial movements?

For now, the technique works on mice performing a specific task. Whether it scales to the vastly more complex human brain remains unknown. But the principle is established: the face is not just an emotional billboard. It is a high-bandwidth information channel, constantly broadcasting the hidden calculations of the mind, whether we know it or not.

Nature Neuroscience: 10.1038/s41593-025-02071-5

