Self-driving cars that “see” through obstacles and maneuver accordingly. A sensor that scans a liquid for its calorie count and for dangerous or poisonous contents. A drone that measures soil moisture precisely while hovering above crop fields. All of these futuristic-sounding devices are moving closer to reality, thanks to research conducted by Mahanth Gowda, assistant professor of computer science and engineering at Penn State, supported by a three-year, $250,000 National Science Foundation grant.
Gowda is using wireless technology to develop IoTScope, a new system that will identify the material properties of an object by analyzing reflected and refracted radio-frequency (RF) signals similar to those used for WiFi and cellular communications. To understand how the technology works, Gowda recommends first thinking of how a camera takes a photograph.
“When a camera takes an image, it’s measuring the light that is reflected from the surface of an object,” he said. “But when sensing with RF signals, even though there are a lot of ambient electromagnetic RF signals in the environment, they are typically not strong enough to capture and sense the way we measure light off an object for a photograph. So, what we do here is systematically generate our own RF signals to illuminate the environment and sense it.”
These signals are generated in two ways. The first relies on direct reflections: an image of an object is formed from the signals it bounces back. In this scenario, there are two antennas, one for transmitting and one for receiving. The transmitting antenna sends signals that illuminate the environment, and the receiving antenna captures the reflections. The properties of objects can then be measured by evaluating these reflections.
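To make the reflection idea concrete, here is a minimal Python sketch of the core step: recovering the round-trip delay, and from it a range to a reflecting object, by cross-correlating the transmitted and received waveforms. The chirp waveform, sample rate and echo parameters below are invented for illustration and are not drawn from the IoTScope system.

```python
import numpy as np

def estimate_round_trip_delay(tx, rx, sample_rate_hz):
    """Estimate how long the transmitted waveform took to return
    by locating the cross-correlation peak (matched filtering)."""
    corr = np.correlate(rx, tx, mode="full")
    lag = np.argmax(np.abs(corr)) - (len(tx) - 1)
    return lag / sample_rate_hz

# Toy scene: a linear chirp "illuminates" the environment, and a
# single echo returns 0.2 microseconds later, attenuated by the
# reflection off a hypothetical object.
fs = 100e6                                       # 100 MHz sample rate
t = np.arange(0, 10e-6, 1 / fs)
tx = np.sin(2 * np.pi * (1e6 + 5e11 * t) * t)    # 1-11 MHz chirp
d = int(0.2e-6 * fs)                             # echo delay in samples
rx = np.zeros_like(tx)
rx[d:] = 0.3 * tx[:-d]                           # delayed, weakened copy

tau = estimate_round_trip_delay(tx, rx, fs)
print(f"estimated range: {3e8 * tau / 2:.1f} m")  # range = c * delay / 2
```

A real system would also have to separate overlapping echoes, noise and multipath; the sketch shows only the basic delay-estimation step.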
The second method involves passing a signal through an object to discern its properties.
“Imagine that you have a transmitter on one side of an opaque water bottle, and a receiver on the other side of it,” Gowda said. “Since the water bottle is not transparent, visible light cannot penetrate it, but radio waves can penetrate most non-metallic objects. By measuring the properties of the signals, like how much loss and how much delay you experience in the signal, you can tell whether this water is, for example, contaminated.”
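A toy sketch of that through-object measurement might look like the following Python snippet, which matches a measured loss-and-delay pair against calibrated signatures of known liquids and flags a poor match as possible contamination. The reference values, weighting and tolerance are all hypothetical, chosen only to illustrate the idea.

```python
import math

# Hypothetical calibration table: (attenuation in dB, added
# propagation delay in ns) measured through a known-good sample.
# Real values would come from careful calibration experiments.
REFERENCES = {
    "clean water": (12.0, 4.2),
    "salt water":  (25.0, 4.4),
    "cooking oil": (6.0, 2.9),
}

def classify_liquid(attenuation_db, delay_ns, tolerance=5.0):
    """Match a (loss, delay) measurement to the nearest reference
    signature; a large residual suggests an unknown or contaminated
    sample. Weights and tolerance are arbitrary illustrations."""
    best_name, best_dist = None, float("inf")
    for name, (ref_att, ref_delay) in REFERENCES.items():
        # Scale delay so both quantities contribute comparably.
        dist = math.hypot(attenuation_db - ref_att,
                          (delay_ns - ref_delay) * 10)
        if dist < best_dist:
            best_name, best_dist = name, dist
    if best_dist > tolerance:
        return f"unknown, possibly contaminated (closest: {best_name})"
    return best_name

print(classify_liquid(12.4, 4.1))  # -> clean water
print(classify_liquid(19.0, 5.8))  # -> unknown, possibly contaminated
```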
Although Gowda has published preliminary results on the second method, it is still an active area of exploration and a key part of the novelty of his work. He explained that this method could improve the capabilities of self-driving cars.
“Self-driving cars commonly use computer vision and sensors like LiDAR (light detection and ranging),” he said. “But computer vision and those sensors do not work well under adverse weather conditions, such as poor visibility or heavy snow, or in low-light conditions. So, in those situations, if we can use radar-based sensing with WiFi-like signals to sense reflections from humans, objects and other cars, we should be able to direct them. If you combine vision with signals from radar, you can improve the robustness and reliability of autonomous driving operations.”
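As a highly simplified illustration of that fusion idea, the sketch below downweights the camera when visibility is poor and leans on the radar return instead. The confidence model, weights and decision threshold are invented for this example and do not describe how an actual autonomous-driving stack, or Gowda’s system, combines sensors.

```python
def fuse_detections(vision_conf, radar_conf, visibility):
    """Blend camera and radar confidence for one candidate obstacle.
    `visibility` in [0, 1] scales trust in the camera; radar is
    assumed weather-robust. Both confidences are in [0, 1]."""
    w_vision = visibility        # fog, snow or darkness -> near 0
    w_radar = 1.0
    score = (w_vision * vision_conf + w_radar * radar_conf) / (w_vision + w_radar)
    return score > 0.5           # report an obstacle if fused score is high

# Clear day: both sensors see the obstacle.
print(fuse_detections(vision_conf=0.9, radar_conf=0.8, visibility=1.0))  # True
# Heavy snow: the camera misses it, but the radar echo is strong.
print(fuse_detections(vision_conf=0.1, radar_conf=0.9, visibility=0.1))  # True
```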
Gowda said he will use simulations, real-world experiments, data analysis, modeling and machine learning approaches as he improves the IoTScope system. He hopes the next step will be commercialization.
“There’s a wide range of industries that might be interested in this kind of technology,” he said. “Most of the things we are working on have a real impact on life, with respect to self-driving cars, airports and other security settings, and health care applications.”