With his eyes closed and electrodes connecting his brain to a tablet computer, a man with tetraplegia describes the warm, silky fur of a cat beneath his fingertips. Another participant reports feeling the smooth, cool surface of an apple and the rough texture of a terry cloth towel – all through a hand that hasn’t felt anything in years.
This remarkable achievement in touch restoration comes from a collaborative study by researchers at the University of Pittsburgh and the University of Chicago, published today in Nature Communications, in which scientists gave participants unprecedented control over their artificial sensations.
Previous brain implant systems could create some sense of touch, but according to the researchers, these sensations often felt like indistinct buzzing or tingling and didn’t vary between objects. The key innovation in this study was giving brain-computer interface (BCI) users control over the details of the electrical stimulation that creates tactile sensations, rather than having scientists make those decisions themselves.
“Touch is an important part of non-verbal social communication; it is a sensation that is personal and that carries a lot of meaning,” said lead author Ceci Verbaarschot, who conducted the research as a postdoctoral fellow at the University of Pittsburgh before joining the University of Texas Southwestern Medical Center. “Designing their own sensations allows BCI users to make interactions with objects feel more realistic and meaningful, which gets us closer to creating a neuroprosthetic that feels pleasant and intuitive to use.”
The research team worked with three men who had lost sensation in their hands due to spinal cord injuries. Each participant had tiny electrode arrays implanted in the somatosensory cortex – the brain region that processes touch information. These electrodes delivered small, precisely controlled electrical pulses to specific brain cells.
Unlike previous approaches where researchers predetermined stimulation patterns, this study introduced a new method: participants adjusted stimulation parameters themselves in real-time while interacting with virtual objects on a tablet screen.
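The core of this self-adjustment loop can be pictured as a handful of stimulation knobs the participant turns within safe bounds. The sketch below is purely illustrative: the study’s actual parameter names, ranges, and hardware limits are not given in this article, so the values here (amplitude, frequency, train duration) are assumptions.

```python
from dataclasses import dataclass

# Hypothetical sketch: parameter names and safety ranges below are
# illustrative assumptions, not the study's actual values.

@dataclass
class StimParams:
    amplitude_ua: float = 40.0   # pulse amplitude in microamps
    frequency_hz: float = 100.0  # pulse train frequency in hertz
    train_s: float = 0.5         # pulse train duration in seconds

# Assumed hardware/safety limits for each adjustable parameter.
SAFETY_LIMITS = {
    "amplitude_ua": (10.0, 80.0),
    "frequency_hz": (20.0, 300.0),
    "train_s": (0.1, 1.0),
}

def adjust(params: StimParams, name: str, delta: float) -> StimParams:
    """Apply a user-driven tweak, clamped to the safety limits."""
    lo, hi = SAFETY_LIMITS[name]
    value = min(hi, max(lo, getattr(params, name) + delta))
    setattr(params, name, value)
    return params

# A participant nudging amplitude upward while exploring a virtual object:
p = StimParams()
adjust(p, "amplitude_ua", +50.0)  # request overshoots the ceiling...
print(p.amplitude_ua)             # ...and is clamped to 80.0
```

The key design point is that the participant, not the experimenter, drives each tweak, while the system only enforces safety bounds.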
The results revealed remarkably vivid and appropriate tactile experiences. When creating sensations for an apple, one participant reported it felt “light but also smooth, curved and a little bit of cool and wet.” For a cat, another described “very light touch, just like petting a cat. Smooth silkiness on fingertips. Resistance of cat. Has that oily sensation. It even has a sort of warmth to it.”
To test whether these sensations truly represented distinct objects, participants later received stimulation without seeing any images and had to identify which object the sensation represented. Two of the three participants performed significantly above chance level, correctly identifying objects about 35% of the time.
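To see why ~35% accuracy can be significantly above chance, one can run an exact binomial test. The numbers below are hypothetical, since the article does not state how many candidate objects or test trials there were; assuming 8 objects (chance = 1/8) and 80 trials:

```python
from math import comb

# Hypothetical numbers: object count and trial count are assumptions,
# as the article does not report them.
n_objects = 8
p_chance = 1 / n_objects             # guessing accuracy = 0.125
n_trials = 80
n_correct = round(0.35 * n_trials)   # ~35% reported accuracy -> 28 correct

# Exact one-sided binomial test: P(X >= n_correct) under pure guessing.
p_value = sum(
    comb(n_trials, k) * p_chance**k * (1 - p_chance) ** (n_trials - k)
    for k in range(n_correct, n_trials + 1)
)
print(p_value < 0.001)  # under these assumed numbers, far above chance
```

Under these assumptions, 28 correct out of 80 sits roughly six standard deviations above the guessing mean of 10, so the result would be highly significant; the study’s actual statistics may of course use different trial counts.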
Even more telling was the pattern of mistakes. Participants rarely confused objects with very different tactile properties but more frequently mixed up objects sharing similar characteristics. For example, they were less likely to confuse a cat for a key, but might mistake a soft towel for a cat.
The study found that when selecting stimulation patterns, participants consistently chose distinct parameter combinations for objects with different compliance (softness versus hardness) and temperature (warm versus cool). This suggests the brain can interpret complex stimulation patterns as coherent object properties.
While the research represents substantial progress, challenges remain before fully intuitive touch restoration becomes reality. Two participants showed consistent ability to identify objects from sensations, but the third participant, who spent less time exploring different stimulation patterns, performed at chance level.
“We designed this study to shoot for the moon and made it into orbit,” said senior author Robert Gaunt, associate professor at the University of Pittsburgh. “Participants had a really hard task of distinguishing between objects by tactile sensation alone and they were quite successful at it. Even when they made mistakes, those mistakes were predictable: it’s harder to tell apart a cat and a towel since both are soft, but they were less likely to confuse a cat for a key.”
The findings open new possibilities for brain-computer interfaces that could eventually restore natural-feeling touch to prosthetic limbs. Scientists hope this approach of user-guided sensation creation could lead to more personalized and intuitive neuroprosthetic systems.
The research team plans to explore whether similar techniques could be used to restore other aspects of sensation beyond touch, potentially including proprioception – the sense of body position – which is crucial for natural movement.