
Researchers help paralyzed man regain sense of touch through a robotic arm

For the first time ever, a human patient is able to experience the sense of touch through a robotic arm that he controls with his brain.

A team of researchers working with Sliman Bensmaia, associate professor of organismal biology and anatomy at the University of Chicago, and led by Robert Gaunt, assistant professor of physical medicine and rehabilitation at the University of Pittsburgh, developed a brain-computer interface (BCI) that was surgically implanted in 28-year-old Nathan Copeland. The interface connects to a robotic arm and delivers sensory feedback through electrodes implanted in the areas of the brain responsible for hand movement and touch.

Copeland was paralyzed from the chest down in a car accident in 2004, leaving him unable to feel or move his lower arms and legs. In a study published today in Science Translational Medicine, the researchers demonstrate that he can now distinguish touches on individual fingers and on the palm of the robotic hand through input from the BCI.

“I can feel just about every finger—it’s a really weird sensation,” Copeland said in a press release from Pitt and the University of Pittsburgh Medical Center. “Sometimes it feels electrical and sometimes it’s pressure, but for the most part, I can tell most of the fingers with definite precision. It feels like my fingers are getting touched or pushed.”

The system incorporates years of research by Bensmaia describing how the nervous system interprets sensory feedback as we touch or grasp objects, move our limbs and run our fingers along textured surfaces. In a series of experiments with monkeys, whose sensory systems closely resemble those of humans, Bensmaia identified patterns of neural activity that occur naturally as the animals manipulate objects, and successfully recreated those patterns by directly stimulating the nervous system with electrical signals.

That research, published most notably in the Proceedings of the National Academy of Sciences in 2013 and 2015, provided a blueprint for Gaunt and his team to recreate the sense of touch with the BCI using a “biomimetic” approach that approximates the natural, intact nervous system.

“If you want to create a dexterous hand for use in an amputee or a tetraplegic patient, you need to not only be able to move it, but have sensory feedback from it. To do this, we first need to look at how the intact hand and the intact nervous system encodes this information, and then, to the extent that we can, try to mimic that in a neuroprosthesis,” Bensmaia said. “This study shows how that approach works in a human, and for the first time he can actually move the robotic arm, reach for an object, grasp it and feel it through the arm. That’s pretty astounding.”
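To make the biomimetic idea more concrete, here is a purely illustrative Python sketch, not code from the study: it maps simulated pressure readings from a robotic hand's fingertip sensors onto stimulation amplitudes for hypothetical electrodes in the hand area of somatosensory cortex, scaling stimulation with contact force the way natural firing rates scale with skin indentation. All names, thresholds, and the linear mapping are assumptions made for illustration.

```python
# Illustrative sketch only -- not the study's actual encoding algorithm.
# Assumption: each fingertip sensor maps to one electrode in the hand area
# of somatosensory cortex, and stimulation amplitude grows with contact
# force, loosely mimicking how natural firing rates grow with skin pressure.

from dataclasses import dataclass


@dataclass
class Electrode:
    name: str         # e.g. "index_fingertip" (hypothetical label)
    min_uamp: float   # detection threshold for stimulation (microamps)
    max_uamp: float   # safety ceiling (microamps)


def encode_pressure(force_newtons: float, electrode: Electrode,
                    max_force: float = 2.0) -> float:
    """Map a fingertip contact force to a stimulation amplitude.

    Below a light-touch threshold no stimulation is delivered; above it,
    amplitude scales linearly up to the electrode's safety ceiling.
    """
    touch_threshold = 0.05  # newtons; assumed light-touch cutoff
    if force_newtons < touch_threshold:
        return 0.0
    frac = min(force_newtons / max_force, 1.0)
    return electrode.min_uamp + frac * (electrode.max_uamp - electrode.min_uamp)


# Example: a grasp that presses hardest on the index finger
electrodes = {
    "index": Electrode("index_fingertip", min_uamp=10.0, max_uamp=60.0),
    "middle": Electrode("middle_fingertip", min_uamp=10.0, max_uamp=60.0),
    "palm": Electrode("palm", min_uamp=10.0, max_uamp=60.0),
}
sensor_forces = {"index": 1.2, "middle": 0.4, "palm": 0.02}

for region, force in sensor_forces.items():
    amp = encode_pressure(force, electrodes[region])
    print(f"{region}: {amp:.1f} uA")
```

In this toy version, a firm press on the index finger produces a strong stimulation pulse on its electrode, a light touch on the palm produces none, which is one simple way to picture how touch location and intensity could be conveyed to the brain.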

Bensmaia said that reproducing natural-feeling sensations is crucial to building neuroprosthetics with the dexterity of native hands. With time, he expects, patients like Copeland will come to view these robotic arms as extensions of themselves.

“As he uses this prosthesis and has this visual experience of seeing the robot touch things, and any time it touches something, feeling something in response, I think eventually he’s going to start to embody the robot,” he said. “The robot is going to feel like part of the body, maybe even actually supplanting his own arm because it’s going to be taking over what the arm used to do.”

The material in this press release comes from the originating research organization. Content may be edited for style and length.