More than 1 million Americans require daily physical assistance to get dressed because of injury, disease and advanced age. Robots could potentially help, but cloth and the human body are complex.
To help address this need, a robot at the Georgia Institute of Technology is successfully sliding hospital gowns onto people's arms. The machine doesn't use its eyes as it pulls the cloth. Instead, it relies on the forces it feels as it guides the garment onto a person's hand, around the elbow and onto the shoulder.
The machine, a PR2, taught itself in one day by analyzing nearly 11,000 simulated examples of a robot putting a gown onto a human arm. Some of those attempts were flawless. Others were spectacular failures: the simulated robot applied dangerous forces to the arm when the cloth would catch on the person's hand or elbow.
From these examples, the PR2's neural network learned to estimate the forces applied to the human. In a sense, the simulations allowed the robot to learn what it feels like to be the human receiving assistance.
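The article doesn't reproduce the team's model, but the idea can be sketched in a few lines. The network below is purely illustrative (the ForceEstimator name, layer sizes, and input features are assumptions, not the paper's architecture): it maps measurements the robot can take on itself, such as end-effector force/torque and velocity, to estimated forces at points on the person's arm, and is trained on the simulated dressing trials.

```python
import torch
import torch.nn as nn

class ForceEstimator(nn.Module):
    """Illustrative sketch: maps the robot's own haptic and kinematic
    measurements to estimated 3D forces applied to the person's arm.
    Sizes and feature choices are hypothetical, not from the paper."""

    def __init__(self, n_measurements=9, n_contact_points=2, hidden=64):
        super().__init__()
        self.n_contact_points = n_contact_points
        self.net = nn.Sequential(
            nn.Linear(n_measurements, hidden),
            nn.ReLU(),
            nn.Linear(hidden, hidden),
            nn.ReLU(),
            nn.Linear(hidden, n_contact_points * 3),  # one 3D force per contact point
        )

    def forward(self, x):
        # x: (batch, n_measurements), e.g. end-effector force/torque + velocity
        return self.net(x).view(-1, self.n_contact_points, 3)

def train(model, sim_measurements, sim_forces, epochs=10):
    """Fit the estimator to (measurement, force) pairs logged from the
    simulated dressing trials."""
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        pred = model(sim_measurements)
        loss = loss_fn(pred, sim_forces)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return model
```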
"People learn new skills using trial and error. We gave the PR2 the same opportunity," said Zackory Erickson, the lead Georgia Tech Ph.D. student on the research team. "Doing thousands of trials on a human would have been dangerous, let alone impossibly tedious. But in just one day, using simulations, the robot learned what a person may physically feel while getting dressed."
The robot also learned to predict the consequences of moving the gown in different ways. Some motions made the gown taut, pulling hard against the person's body. Other movements slid the gown smoothly along the person's arm. The robot uses these predictions to select motions that comfortably dress the arm.
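In control terms, this is a form of model predictive control: the robot imagines several candidate motions, predicts the forces each one would produce, and executes the gentlest option that still makes progress. The sketch below is a loose illustration of that loop; choose_action, predict_forces, the cost terms, and the force limit are hypothetical stand-ins, not the paper's implementation.

```python
import numpy as np

def choose_action(state, predict_forces, candidate_actions,
                  horizon=4, force_limit=10.0):
    """Illustrative planning step: roll each candidate end-effector motion
    forward with a learned model and keep the one that advances the gown
    while keeping predicted forces on the arm low.

    predict_forces(state, action) is a stand-in for the learned model and
    is assumed to return (next_state, forces), where forces has one 3D
    vector per contact point on the arm."""
    best_action, best_cost = None, np.inf
    for action in candidate_actions:
        s, cost, safe = state, 0.0, True
        for _ in range(horizon):
            s, forces = predict_forces(s, action)
            peak = np.linalg.norm(forces, axis=-1).max()
            if peak > force_limit:        # discard motions predicted to pull too hard
                safe = False
                break
            # action[0] is assumed to be displacement along the arm, so the
            # cost prefers low force and forward progress.
            cost += peak - action[0]
        if safe and cost < best_cost:
            best_action, best_cost = action, cost
    return best_action
```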
After success in simulation, the PR2 attempted to dress people. Participants sat in front of the robot and watched as it held a gown and slid it onto their arms. Rather than relying on vision, the robot used its sense of touch to perform the task, drawing on what it had learned about forces during the simulations.
"The key is that the robot is always thinking ahead," said Charlie Kemp, an associate professor in the Wallace H. Coulter Department of Biomedical Engineering at Georgia Tech and Emory University and the lead faculty member. "It asks itself, 'If I pull the gown this way, will it cause more or less force on the person's arm? What would happen if I go that way instead?'"
The researchers varied the robot's timing and allowed it to think as much as a fifth of a second into the future while strategizing about its next move. Shorter lookahead times caused the robot to fail more often.
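As a rough illustration of how that lookahead might be parameterized (the 50-millisecond planning step here is an assumption, not a figure from the article), a fifth of a second corresponds to only a handful of predicted steps:

```python
# Assumed planning step of 50 ms; the article only states the maximum
# lookahead of a fifth of a second, not the control rate.
CONTROL_STEP_S = 0.05

for lookahead_s in (0.05, 0.10, 0.20):
    horizon_steps = max(1, round(lookahead_s / CONTROL_STEP_S))
    print(f"lookahead {lookahead_s:.2f} s -> {horizon_steps} predicted steps")
```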
"The more robots can understand about us, the more they'll be able to help us," Kemp said. "By predicting the physical implications of their actions, robots can provide assistance that is safer, more comfortable and more effective."
The robot currently puts the gown on only one arm, a process that takes about 10 seconds. The team says fully dressing a person remains many steps away from this work.
Ph.D. student Henry Clever and Professors Karen Liu and Greg Turk also contributed to the research. Their paper, "Deep Haptic Model Predictive Control for Robot-Assisted Dressing," will be presented May 21-25 in Australia during the International Conference on Robotics and Automation (ICRA). The work is part of a larger effort on robot-assisted dressing funded by the National Science Foundation (NSF) and led by Liu.