If it pans out at real-world dinner tables, a freshly cooked-up AI system will soon be counting calories and sniffing out macronutrients just by gobbling up images of meals.
And the input is the eater’s choice: two photos or a single video of the meal.
The system was developed by academic endocrinologists, dietary experts and biomedical engineers in Switzerland with support from research colleagues in the U.S. It’s described in a study running in the open-access journal Sensors.
Stavroula Mougiakakou, PhD, of the University of Bern, Elias Spanakis, MD, of the University of Maryland, Baltimore, and co-authors state their “dietary assessment” system, which they’ve dubbed goFOOD, uses deep neural networks to process the images. To estimate the food’s volume, it employs a 3D reconstruction algorithm.
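The final step such a pipeline implies is straightforward: once a food item has been recognized and its volume estimated, calories and macronutrients follow from standard per-100-gram nutrient tables. The sketch below illustrates that conversion only; the food names, densities, and nutrient figures are illustrative assumptions, not values from the goFOOD study.

```python
# Hypothetical sketch: converting an estimated food volume into
# calories and macronutrients via a nutrient lookup table.
# All entries below are illustrative, not from the goFOOD paper.

# density in g/mL; nutrients per 100 g: (kcal, carbs g, protein g, fat g)
FOOD_DB = {
    "rice":    {"density": 0.85, "per_100g": (130, 28.0, 2.7, 0.3)},
    "chicken": {"density": 1.00, "per_100g": (165, 0.0, 31.0, 3.6)},
}

def estimate_nutrition(label: str, volume_ml: float) -> dict:
    """Turn a recognized food label plus estimated volume into
    grams, calories, and macronutrients."""
    entry = FOOD_DB[label]
    grams = volume_ml * entry["density"]        # volume -> mass
    kcal, carbs, protein, fat = (
        v * grams / 100.0 for v in entry["per_100g"]
    )
    return {
        "grams": round(grams, 1),
        "kcal": round(kcal),
        "carbs_g": round(carbs, 1),
        "protein_g": round(protein, 1),
        "fat_g": round(fat, 1),
    }

meal = estimate_nutrition("rice", 200.0)  # e.g. 200 mL of cooked rice
```

The hard parts of a system like goFOOD sit upstream of this step, in the image classification and 3D volume reconstruction; the arithmetic above is simply where those estimates become a calorie count.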
The authors say goFOOD works with more than 300 “fine grained” food categories and has been validated on two multimedia databases containing non-standardized and fast-food meals.
In machine vs. human experiments detailed in the study, the system outperformed experienced dietitians in one contest and matched the experts in the other.
In their discussion section, the authors propose several scenarios in which such a system could be of value.
Dieters and health-conscious eaters come instantly to mind, of course, but other potential end users might include dietitians, clinicians and public nutrition departments tracking statistics for trends at the population level.
The goFOOD system is “under constant development aiming at improving its current features, in terms of accuracy, speed, user friendliness and also aiming at developing and integrating new features,” the authors note. “The most recent addition to the set of functionalities is the development and integration of a barcode scanner, so that packaged consumed products can also be accounted for.”
The study is posted in full for free.