Food plays a big role in our health, which is why many people trying to improve their diet track what they eat. A new wearable from researchers in Carnegie Mellon University's School of Computer Science helps wearers track their food habits with high fidelity.
FitByte, a noninvasive, wearable sensing system, combines the detection of sound, vibration and movement to increase accuracy and decrease false positives. It could help users reach their health goals by tracking behavioral patterns, and give practitioners a tool to understand the relationship between diet and disease and to monitor the efficacy of treatment.
The device tracks all stages of food intake. It detects chewing, swallowing, hand-to-mouth gestures and visuals of intake, and can be attached to any pair of consumer eyeglasses. "The primary sensors on the device are accelerometers and gyroscopes, which are in almost every device at this point, like your phones and your watches," said Mayank Goel, an assistant professor in the Institute for Software Research and the Human-Computer Interaction Institute.
An infrared proximity sensor detects hand-to-mouth gestures. To identify chewing, the system monitors jaw motion using four gyroscopes around the wearer's ears. The sensors look behind the ear to track the flexing of the temporal muscle as the user moves their jaw. High-speed accelerometers placed near the glasses' earpiece perceive throat vibrations during swallowing. This technology addresses the longstanding challenge of accurately detecting drinking, and the intake of soft foods such as yogurt and ice cream.
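To make the gyroscope-based chewing detection concrete, here is a minimal illustrative sketch, not FitByte's actual algorithm: it flags windows of a simulated jaw-motion signal whose RMS energy exceeds a threshold, since chewing produces rhythmic jaw movement at roughly 1–2 Hz while a resting jaw produces only low-level noise. The signal, sampling rate and threshold are all hypothetical.

```python
import math
import random

def detect_chewing(gyro, fs=100, window_s=1.0, threshold=0.5):
    """Flag each non-overlapping window of an angular-velocity signal
    whose RMS energy exceeds a threshold (a crude chewing proxy).

    gyro: list of angular-velocity samples (e.g. rad/s) from a sensor
          placed near the temporal muscle. Returns one bool per window.
    """
    win = int(fs * window_s)
    flags = []
    for i in range(len(gyro) // win):
        seg = gyro[i * win:(i + 1) * win]
        rms = math.sqrt(sum(x * x for x in seg) / len(seg))
        flags.append(rms > threshold)
    return flags

# Synthetic demo: 2 s of resting noise, then 2 s of 1.5 Hz "chewing"
fs = 100
rng = random.Random(0)
rest = [rng.gauss(0, 0.05) for _ in range(2 * fs)]           # jaw at rest
chew = [math.sin(2 * math.pi * 1.5 * i / fs) for i in range(2 * fs)]
print(detect_chewing(rest + chew, fs=fs))  # [False, False, True, True]
```

A real detector would combine this signal with the proximity sensor and accelerometer streams, which is how the system reduces false positives from talking or head movement.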
A small camera at the front of the glasses points downward to capture just the area around the mouth and only turns on when the model detects the user eating or drinking. "To address issues of privacy, we're currently processing everything offline," said Abdelkareem Bedri, an HCII doctoral student. "The captured images are not shared anywhere except the user's phone."
At this point, the system relies on users to identify the food and drink in photos. But the research team has plans for a larger test deployment, which will supply the data deep learning models need to automatically discern food type.
FitByte was tested in five unconstrained situations including a lunch meeting, watching TV, having a quick snack, exercising in a gym and hiking outdoors. Modeling across such noisy data allows the algorithm to generalize across conditions.
"Our team can take sensor data and find behavior patterns. In what situations do people consume the most? Are they binge eating? Do they eat more when they're alone or with other people? We are also working with clinicians and practitioners on the problems they'd like to address," Goel said.
The team will continue developing the system by adding more noninvasive sensors that will allow the model to detect blood glucose levels and other important physiological measures. The researchers are also creating an interface for a mobile app that could share data with users in real time.
Other contributing researchers include CMU students Diana Li, Rushil Khurana and Kunal Bhuwalka. The paper was accepted by the Conference on Human Factors in Computing Systems (CHI 2020), which was scheduled for this month but canceled due to the COVID-19 pandemic.