1
Ghosh T, McCrory MA, Marden T, Higgins J, Anderson AK, Domfe CA, Jia W, Lo B, Frost G, Steiner-Asiedu M, Baranowski T, Sun M, Sazonov E. I2N: image to nutrients, a sensor guided semi-automated tool for annotation of images for nutrition analysis of eating episodes. Front Nutr 2023;10:1191962. PMID: 37575335; PMCID: PMC10415029; DOI: 10.3389/fnut.2023.1191962.
Abstract
Introduction: Dietary assessment is important for understanding nutritional status. Traditional methods of monitoring food intake through self-report, such as diet diaries, 24-hour dietary recall, and food frequency questionnaires, may be subject to errors and can be time-consuming for the user.
Methods: This paper presents a semi-automatic dietary assessment tool we developed: a desktop application called Image to Nutrients (I2N) that processes sensor-detected eating events and the images captured during these events by a wearable sensor. I2N offers multiple food and nutrient databases (e.g., USDA-SR, FNDDS, USDA Global Branded Food Products Database) for annotating eating episodes and food items, and estimates energy intake, nutritional content, and the amount consumed. The components of I2N are three-fold: 1) sensor-guided image review, 2) annotation of food images for nutritional analysis, and 3) access to multiple food databases. Two studies were used to evaluate the feasibility and usefulness of I2N: 1) a US-based study with 30 participants and a total of 60 days of data, and 2) a Ghana-based study with 41 participants and a total of 41 days of data.
Results: Across both studies, a total of 314 eating episodes were annotated using at least three food databases. Using I2N's sensor-guided image review, the number of images that needed to be reviewed was reduced by 93% and 85% for the two studies, respectively, compared to reviewing all the images.
Discussion: I2N is a unique tool that combines simultaneous viewing of food images, sensor-guided image review, and access to multiple databases in one place, making nutritional analysis of food images efficient. The tool is flexible, allowing for nutritional analysis of images even when sensor signals are not available.
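The sensor-guided image review described in this abstract can be pictured as a simple filtering step: given the time windows of sensor-detected eating episodes, only images timestamped inside those windows need manual review. A minimal sketch with hypothetical timestamps and function names, not the actual I2N implementation:

```python
def filter_images(image_times, episodes):
    """Keep only images timestamped within a sensor-detected eating episode.

    image_times: image timestamps in seconds.
    episodes: list of (start, end) eating-episode windows in seconds.
    Illustrative only -- not the I2N codebase.
    """
    return [t for t in image_times
            if any(start <= t <= end for start, end in episodes)]

# Hypothetical recording: one image every 15 s over 10 min,
# with a single sensor-detected 2-min eating episode.
images = list(range(0, 600, 15))           # 40 images
kept = filter_images(images, [(120, 240)]) # 9 images fall inside the episode
reduction = 1 - len(kept) / len(images)    # fraction of images skipped
```

The reported 93% and 85% reductions correspond to this `reduction` fraction computed over each study's full image set.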
Affiliation(s)
- Tonmoy Ghosh
- Department of Electrical and Computer Engineering, University of Alabama, Tuscaloosa, AL, United States
- Megan A. McCrory
- Department of Health Sciences, Boston University, Boston, MA, United States
- Tyson Marden
- Colorado Clinical and Translational Sciences Institute, University of Colorado, Denver, CO, United States
- Janine Higgins
- Department of Medicine, University of Colorado Anschutz Medical Campus, Aurora, CO, United States
- Alex Kojo Anderson
- Department of Nutritional Sciences, University of Georgia, Athens, GA, United States
- Wenyan Jia
- Department of Electrical and Computer Engineering, University of Pittsburgh, Pittsburgh, PA, United States
- Benny Lo
- Department of Surgery and Cancer, Imperial College, London, United Kingdom
- Gary Frost
- Department of Metabolism, Digestion and Reproduction, Imperial College, London, United Kingdom
- Tom Baranowski
- Children’s Nutrition Research Center, Department of Pediatrics, Baylor College of Medicine, Houston, TX, United States
- Mingui Sun
- Department of Neurological Surgery, University of Pittsburgh, Pittsburgh, PA, United States
- Edward Sazonov
- Department of Electrical and Computer Engineering, University of Alabama, Tuscaloosa, AL, United States
2
Doulah A, Ghosh T, Hossain D, Marden T, Parton JM, Higgins JA, McCrory MA, Sazonov E. Energy intake estimation using a novel wearable sensor and food images in a laboratory (pseudo-free-living) meal setting: quantification and contribution of sources of error. Int J Obes (Lond) 2022;46:2050-2057. PMID: 36192533; DOI: 10.1038/s41366-022-01225-w.
Abstract
OBJECTIVES: Dietary assessment methods not relying on self-report are needed. The Automatic Ingestion Monitor 2 (AIM-2) combines a wearable camera that captures food images with sensors that detect food intake. We compared energy intake (EI) estimates of meals derived from AIM-2 chewing sensor signals, AIM-2 images, and an internet-based diet diary against researcher-conducted weighed food records (WFR) as the gold standard.
SUBJECTS/METHODS: Thirty adults wore the AIM-2 for meals self-selected from a university food court on one day in mixed laboratory and free-living conditions. Daily EI was determined from a sensor regression model, manual image analysis, and a diet diary, and compared with that from WFR. A posteriori analysis identified sources of error for the differences between image analysis and WFR.
RESULTS: Sensor-derived EI from regression modeling (R² = 0.331) showed the closest agreement with EI from WFR, followed by diet diary estimates. EI from image analysis differed significantly from that by WFR. Bland-Altman analysis showed wide limits of agreement for all three test methods against WFR, with the sensor method overestimating at lower and underestimating at higher EI. Nutritionist error in portion size estimation and irreconcilable differences in portion size between the food and nutrient databases used for WFR and image analyses were the greatest contributors to the differences between image analysis and WFR (44.4% and 44.8% of WFR EI, respectively).
CONCLUSIONS: Estimation of daily EI from meals using sensor-derived features offers a promising alternative to overcome the limitations of self-report. Image analysis may benefit from computerized analytical procedures to reduce the identified sources of error.
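The Bland-Altman analysis referenced in this abstract compares a test method against the WFR reference by computing the mean difference (bias) and the 95% limits of agreement, bias ± 1.96 × SD of the paired differences. A minimal sketch with made-up energy-intake values, not the study's data:

```python
import statistics

def bland_altman(test, reference):
    """Return (bias, lower limit, upper limit) of agreement between two paired methods."""
    diffs = [t - r for t, r in zip(test, reference)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)  # sample SD of the paired differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical per-meal EI estimates (kcal); illustrative values only
sensor_ei = [650, 820, 540, 910, 700]
wfr_ei = [600, 900, 500, 1000, 750]
bias, loa_low, loa_high = bland_altman(sensor_ei, wfr_ei)
```

Wide limits of agreement, as reported for all three test methods here, mean that individual-meal estimates can deviate substantially from WFR even when the average bias is small.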
Affiliation(s)
- Abul Doulah
- Department of Electrical and Computer Engineering (ECE), The University of Alabama, Tuscaloosa, USA; Department of Electrical and Electronic Engineering, University of Liberal Arts Bangladesh, Dhaka, Bangladesh
- Tonmoy Ghosh
- Department of Electrical and Computer Engineering (ECE), The University of Alabama, Tuscaloosa, USA
- Delwar Hossain
- Department of Electrical and Computer Engineering (ECE), The University of Alabama, Tuscaloosa, USA
- Tyson Marden
- Colorado Clinical and Translational Sciences Institute, University of Colorado Anschutz Medical Campus, Aurora, CO, USA
- Jason M Parton
- Department of Information Systems, Statistics, and Management Science, Culverhouse College of Commerce and Business Administration, University of Alabama, Tuscaloosa, AL, USA
- Janine A Higgins
- Department of Pediatrics, University of Colorado Anschutz Medical Campus, Denver, CO, USA
- Megan A McCrory
- Department of Health Sciences, Boston University, Boston, MA, USA
- Edward Sazonov
- Department of Electrical and Computer Engineering (ECE), The University of Alabama, Tuscaloosa, USA