Monkeys’ Brains See Robot Arms as Their Own

Monkeys that learn to use their brain signals to control a robotic arm are not just learning to manipulate an external device, Duke University Medical Center neurobiologists have found. Rather, their brain structures are adapting to treat the arm as if it were their own appendage.

The finding has profound implications both for understanding the extraordinary adaptability of the primate brain and for the potential clinical success of brain-operated devices to give the handicapped the ability to control their environment, said the researchers.

Led by neurobiologist Miguel Nicolelis of Duke’s Center for Neuroengineering, the researchers published their findings in the May 11, 2005, issue of the Journal of Neuroscience. Lead author on the paper was Mikhail Lebedev in Nicolelis’s laboratory. Other coauthors were Jose Carmena, Joseph O’Doherty, Miriam Zacksenhouse, Craig Henriquez and Jose Principe. The work was supported by the Defense Advanced Research Projects Agency, the James S. McDonnell Foundation, the National Institutes of Health, the National Science Foundation and the Christopher Reeve Paralysis Foundation.

In the study, Lebedev performed a detailed analysis of the mass of neural data that emerged from experiments reported in 2003, in which the researchers discovered for the first time that monkeys were able to control a robot arm with only their brain signals.

In those experiments, the researchers first implanted an array of microelectrodes — each thinner than a human hair — into the frontal and parietal lobes of the brains of two female rhesus macaque monkeys. The faint signals from the electrode arrays were detected and analyzed by the computer system the researchers developed to recognize patterns of signals that represented particular movements by an animal’s arm.
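In studies like this, the mapping from electrode signals to arm movements is learned from recorded data; a common approach in the brain-machine interface literature is a linear model fit to lagged, binned spike counts. The Python sketch below is only a minimal illustration of that idea on synthetic numbers, not the Duke group’s actual decoder; the array sizes, parameters and the lagged_design helper are all invented for the example.

```python
import numpy as np

# Minimal, hypothetical sketch of a linear neural decoder: binned spike
# counts from many electrodes, with several time lags of history, are
# regressed onto arm position. All data here are synthetic stand-ins.

rng = np.random.default_rng(0)

n_bins, n_neurons, n_lags = 2000, 96, 10          # e.g. 100-ms bins, 96 channels, 1 s of history
spikes = rng.poisson(2.0, size=(n_bins, n_neurons)).astype(float)

def lagged_design(x, n_lags):
    """Row t holds the spike counts from bins t-n_lags+1 .. t, flattened."""
    return np.asarray([x[t - n_lags + 1 : t + 1].ravel()
                       for t in range(n_lags - 1, len(x))])

X = lagged_design(spikes, n_lags)                 # (1991, 960)
X = np.hstack([X, np.ones((len(X), 1))])          # add a bias column

# Synthetic 2-D arm/cursor position used as the training target,
# standing in for the kinematics recorded during joystick use.
position = np.cumsum(rng.normal(size=(len(X), 2)), axis=0)

# Fit weights by ordinary least squares: position ≈ X @ W
# (a Wiener-filter-style decoder).
W, *_ = np.linalg.lstsq(X, position, rcond=None)

# Once fitted, the same weights turn new neural activity into a
# predicted trajectory that can drive a cursor or robot arm.
predicted = X @ W
print(predicted.shape)                            # (1991, 2)
```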

In the initial behavioral experiments, the researchers recorded and analyzed the output signals from the monkeys’ brains as the animals were taught to use a joystick both to position a cursor over a target on a video screen and to grasp the joystick with a specified force.

After the animals’ initial training, however, the researchers made the cursor more than a simple display. They incorporated into its movement the dynamics, such as inertia and momentum, of a robot arm functioning in another room. While the animals’ performance initially declined when the robot arm was included in the feedback loop, the scientists found that they quickly learned to allow for these dynamics and became proficient in manipulating the robot-reflecting cursor.
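One way to picture what it means to fold inertia and momentum into the cursor is a spring-damper system sitting between the commanded position and the position the monkey sees: the cursor lags, overshoots and settles rather than responding instantly. The toy Python sketch below illustrates that idea with arbitrary parameters; it is not the dynamics of the actual robot arm.

```python
import numpy as np

# Toy model of arm-like dynamics between a command and the displayed cursor:
# the cursor behaves like a damped mass pulled toward the commanded position,
# so it carries momentum, lags and overshoots. Parameter values are arbitrary.

def simulate_cursor(commands, mass=1.0, damping=2.0, stiffness=8.0, dt=0.01):
    pos, vel = np.zeros(2), np.zeros(2)
    trajectory = []
    for target in commands:
        force = stiffness * (target - pos) - damping * vel   # spring toward command, damped
        vel = vel + (force / mass) * dt                      # momentum carries across steps
        pos = pos + vel * dt
        trajectory.append(pos.copy())
    return np.asarray(trajectory)

# Step command: hold at the origin, then jump to (1, 1) and hold.
commands = np.vstack([np.zeros((50, 2)), np.ones((450, 2))])
trajectory = simulate_cursor(commands)
print(trajectory[-1])   # settles near (1, 1), but only after lag and overshoot
```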

The scientists next removed the joystick, after which the monkeys continued to move their arms in mid-air to manipulate and “grab” the cursor, thus controlling the robot arm. However, after a few days, the monkeys realized that they did not need to move their own arms. Their arm muscles went completely quiet, they kept the arm at their side, and they controlled the robot using only their brain and visual feedback.

“After these experiments, a major question remained about how the animals’ brains adapted to the transition between joystick and brain control,” said Nicolelis. “Thus, drawing on the extensive data from these experiments, Mikhail analyzed very carefully what happens functionally to the brain cells and the brain cell ensembles in multiple brain areas during this transition.

“And basically we were able to show clearly that a large percentage of the neurons become more ‘entrained’ — that is, their firing becomes more correlated to the operation of the robot arm than to the animal’s own arm.”
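The ‘entrainment’ Nicolelis describes can be pictured as comparing, for each cell, the correlation of its binned firing rate with the robot arm’s movement against its correlation with the animal’s own arm movement, and asking which one grows. The short Python sketch below illustrates that comparison on synthetic signals; the coefficients and noise levels are invented for the example.

```python
import numpy as np

# Hypothetical sketch of the entrainment comparison: one neuron's binned
# firing rate is correlated with the robot arm's velocity and with the
# animal's own arm velocity. All signals here are synthetic.

rng = np.random.default_rng(1)
n_bins = 1000

robot_vel = rng.normal(size=n_bins)
own_vel = rng.normal(size=n_bins)

# A cell whose firing is driven mostly by the robot arm's movement.
firing_rate = 5.0 + 2.0 * robot_vel + 0.3 * own_vel + rng.normal(scale=0.5, size=n_bins)

def corr(a, b):
    return np.corrcoef(a, b)[0, 1]

print("correlation with robot arm:", corr(firing_rate, robot_vel))
print("correlation with own arm:  ", corr(firing_rate, own_vel))
# A firing rate whose correlation shifts toward the robot arm is what the
# researchers mean by a cell becoming 'entrained' to the device.
```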

According to Nicolelis, the analysis revealed that, while the animals were still able to use their own arms, some brain cells formerly used for that control shifted to control of the robotic arm.

“Mikhail’s analysis of the brain signals associated with use of the robotic and animals’ actual arms revealed that the animal was simultaneously doing one thing with its own arm and something else with the robotic arm,” he said. “So, our hypothesis is that the adaptation of brain structures allows the expansion of capability to use an artificial appendage with no loss of function, because the animal can flip back and forth between using the two. Depending on the goal, the animal could use its own arm or the robotic arm, and in some cases both.

“This finding supports our theory that the brain has extraordinary abilities to adapt to incorporate artificial tools, whether directly controlled by the brain or through the appendages,” said Nicolelis. “Our brain representations of the body are adaptable enough to incorporate any tools that we create to interact with the environment. This may include a robot appendage, but it may also include using a computer keyboard or a tennis racket. In any such case, the properties of this tool become incorporated into our neuronal ‘space’,” he said. According to Nicolelis, such a theory of brain adaptability has been controversial.

“Few researchers have been willing to go as far as postulating such extraordinary adaptability for the brain, or how important this adaptability of brain circuitry is in enabling us to learn to use tools,” he said. “It has long been appreciated that this adaptability is a key capability of the prefrontal cortex, a hallmark of the human brain. It gives us the ability to design, create, and use tools to do everything from lifting massive weights to making microscopic manipulations.

“What Mikhail, our colleagues and I are suggesting is that a fundamental trait of higher primates, in particular apes and humans, is the ability to incorporate these tools into the very structure of the brain. In fact, we’re saying that it’s not only the brain that is adaptable; it’s the whole concept of self. And this concept of self extends to our tools. Everything from cars to clothing that we use in our lives becomes incorporated into our sense of self. So, our species is capable of ‘evolving’ the perception of what we are.

“From a philosophical point of view, we’re saying that the sense of self is not limited to our capability for introspection, our sense of our body’s limits, and the experiences we’ve accumulated,” Nicolelis said. “It really incorporates every external device that we use to deal with the environment.” The findings also have important clinical significance, said Nicolelis.

“The experiments we have conducted are not only a proof of concept that such an external device can be directly controlled in a clinical setting,” he said. “This latest analysis also shows that the device is incorporated very intimately as a natural extension of the brain. This is a fundamentally important property if brain-machine interface technology is to have any clinical future. If the brain were essentially static, then paralyzed people would never be able to adapt to operate external devices with enough dexterity to make them really useful.”

Importantly, said Nicolelis, truly useful “neuroprosthetic” devices will have to be dexterous enough to give patients a full range of mobility in robot arms, hands or other appendages. “Our studies show that it will not be enough to implant a few electrodes and measure a few signals to attain sufficient capability for useful devices,” he said. “The ability to merely move a cursor on a screen or open or close an artificial hand is not enough to justify the use of such systems.” Rather, he said, the objective in his laboratory is to develop devices that offer paralyzed people fully functional artificial appendages.

For example, he said, new experiments in his laboratory seek to enable the brain to perceive a feedback sensation from neuroprosthetic devices. Such feedback might be in the form of visual information on the effects of moving a robotic arm. Or, it might be tactile feedback fed as signals into electrodes implanted in the brain.

Such feedback would greatly enhance people’s ability to learn and use the devices, said Nicolelis. Also, such feedback would expand use of neuroprosthetics to amputees, because the devices would include all the features — including feedback — of real appendages.

“In our new experiments, the idea is that by using vision and touch, we’re actually going to create inside the brains of these animals a vivid perceptual image of what it is to have a third arm,” he said.

From Duke University

