Reverse engineering the brain to model mind-body interactions

When you grab a cold beer out of the cooler this summer, what is really going on between your brain, your eyes and your hands?

“It is still a mystery, really,” says UBC computer science professor Dinesh Pai. “No one has ever completely mapped out the processes at the level of specific neurons, muscles and tendons.”

Pai is part of a UBC team leading an international initiative to do just that. “Essentially, we are reverse engineering the brain to produce the first working computational model of the complex interplay between our minds and our bodies.”

The project could produce great leaps forward in many areas, including medicine, industry and robotics. Although the project is just ramping up, the team’s mapping and modeling expedition is already producing some of the world’s most realistic computer simulations of the human body.

“Our research is really guided by a desire to determine and model exactly what is happening under our skin, first and foremost,” says Pai, who recently received $500,000 from UBC’s Peter Wall Institute for the project. “There will be many exciting outcomes from this project, but it really falls under the category of pure research.”

“Current robots have as much in common with human movements as helicopters do with seagulls,” Pai adds. “The challenges are similar, but they use completely different solutions.”

Pai’s five UBC co-investigators include Prof. John Steeves, Director of the International Collaboration on Repair Discoveries (ICORD); Prof. Martin McKeown of the Pacific Parkinson’s Research Centre; Prof. Alan Mackworth, Computer Science; Prof. Tony Hodgson, Mechanical Engineering; and Prof. Tim Ingliss, School of Human Kinetics.

To make the project a reality, they have brought together a multidisciplinary dream team from Canada (UBC, McGill), the U.S. (UCLA, University of Washington, Northwestern University, Smith-Kettlewell Eye Research Institute), Japan (Digital Human Research Centre) and Italy (Santa Lucia Foundation).

Using magnetic resonance imaging (MRI), the team is cataloging body parts and functions and tracing their interactions with the brain. This information is being used to create a working three-dimensional computer model of all these functions.

“We are in uncharted territory, in terms of computing,” says Pai. “It’s not like you can find software like this at your local Future Shop or Best Buy. So we have been creating our own as we go along.”

Down the road, the team’s findings will enable doctors to test surgical outcomes before picking up a scalpel, Pai says.

“There is an amazing amount of variance between humans – skeletons, organs, muscles can all differ in size from person to person,” says Pai. “That means there is always some guesswork involved in surgery.”

“But if you can give someone an MRI and create a personalized computer model, suddenly a doctor has more information to work with,” he says. “They can say, ‘If I cut this tendon, what exactly is going to happen, given this patient’s unique body.’”

Advances in the field of neuroprosthetics (devices that replace or improve the function of an impaired nervous system) are another desired research outcome, Pai says.

“With a better understanding of mind-body connections, we hope to be able to use electrodes in the brain or spinal cord to restore some functions in people who have experienced strokes or some other disability.”

While these applications are still years away, the field of digital animation is already taking note of the team’s research. SIGGRAPH, the prestigious computer graphics conference, will publish research by Pai and PhD candidate Shinjiro Sueda outlining how the team’s modeling of body movements can help make digital animations of humans more realistic.

For more information, visit www.cs.ubc.ca/~pai

The material in this press release comes from the originating research organization. Content may be edited for style and length.