January 27, 2014
Windshields may highlight road signs and potential hazards
Some automotive experts say that cars will virtually drive themselves someday.
But until that day comes, drivers will find themselves benefiting from advanced safety information systems, some of which are just around the corner.
One such system, called “augmented reality” (AR) cueing, works by superimposing a bright yellow box around road signs and potential hazards as they come into view through the windshield. As reported in a news release on the subject, it’s similar to a conventional “heads-up” windshield display, except that the system uses a computer-generated yellow highlight marker to attract the driver’s attention.
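Conceptually, the cueing step amounts to drawing a highlight rectangle around each detected hazard’s position in the driver’s field of view, while leaving nonhazardous objects unmarked. The sketch below is purely illustrative and is not the study’s actual system; the data structures, field names, and padding value are all assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """A detected object in windshield (display) coordinates."""
    label: str       # e.g. "pedestrian", "stop_sign"
    x: float         # left edge of the object's bounding box
    y: float         # top edge of the object's bounding box
    width: float
    height: float
    hazardous: bool  # only hazards receive an AR cue

def ar_cues(detections, padding=5.0):
    """Return yellow highlight boxes for the hazardous detections.

    Each cue is padded to be slightly larger than the object's bounding
    box so the highlight surrounds, rather than obscures, the object.
    """
    cues = []
    for d in detections:
        if not d.hazardous:
            continue
        cues.append({
            "label": d.label,
            "x": d.x - padding,
            "y": d.y - padding,
            "width": d.width + 2 * padding,
            "height": d.height + 2 * padding,
            "color": "yellow",
        })
    return cues

# Example scene: a pedestrian (hazard) and a mailbox (not a hazard)
scene = [
    Detection("pedestrian", 120.0, 80.0, 30.0, 60.0, hazardous=True),
    Detection("mailbox", 300.0, 100.0, 15.0, 25.0, hazardous=False),
]
print(ar_cues(scene))  # only the pedestrian is highlighted
```

In a real vehicle, the detections would come from sensors and computer vision, and the cues would be rendered on the heads-up display; here they are plain dictionaries so the idea stays self-contained.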
Examples of AR in everyday use are all around us, according to University of Iowa researcher Mark Schall, a doctoral student studying mechanical and industrial engineering in the UI College of Engineering.
“The best example of AR use in everyday life may be the yellow first-down marker that is added to televised football games. The marker does not actually exist in the real world, but the computer projection added to the image makes watching on television much more informative,” says Schall, corresponding author for a paper published in the October 2012 issue of Human Factors, the journal of the Human Factors and Ergonomics Society.
In the study, UI researchers and their colleagues tested an AR cueing system to see whether it could improve driving safety among the elderly, an age group that generally has diminished visual, cognitive, and physical ability.
Here’s how the Iowa test worked. Researchers recruited 20 elderly licensed drivers between the ages of 65 and 85 from the general population, excluding those with such medical conditions as anxiety and depression or those taking specific medications such as stimulants or narcotics.
Participants each spent one hour at UI Hospitals and Clinics in the cab of a 1994 Saturn that is part of the UI high-fidelity driving simulator called SIREN (Simulator for Interdisciplinary Research in Ergonomics and Neuroscience). Once in the simulator, they drove along six 6-mile-long rural roadways. Study investigators noted whether participants detected hazardous target objects, whether AR cues affected their ability to detect nonhazardous objects, and whether drivers were able to maintain a safe distance behind a lead vehicle.
“Results of our study showed definite promise for AR cueing,” says Schall. “Specifically, we observed that AR cues helped drivers detect and respond more often and more quickly to roadway objects that may be difficult to see while driving, such as a pedestrian alongside the roadway. In addition, AR cues did not impair the ability to complete other driving tasks, such as maintaining a safe distance behind a lead vehicle.”
Asked whether the driving public can expect AR hardware and software to appear in new vehicles in the near future, Schall is hopeful.
“I believe so. Many vehicle manufacturers and digital entertainment companies such as BMW and Pioneer, respectively, have already launched consumer-ready navigation displays that use AR to help guide drivers. If successful, I have no reason to believe similar products will not be available throughout the marketplace in the near future,” he says.
Schall’s colleagues in the study are: Michelle Rusch, UI postdoctoral fellow in neurology and mechanical and industrial engineering; John D. Lee, professor of industrial and systems engineering at the University of Wisconsin, Madison; Jeffrey Dawson, UI professor of biostatistics; Geb Thomas, UI associate professor of industrial engineering; Nazan Aksan, UI research scientist in the Department of Neurology; and Matthew Rizzo, UI professor of neurology, engineering, and public policy.
The study was funded under a grant from the National Institutes of Health (#R01AGO26027), awarded to Rizzo.