1. Dénes-Fazakas L, Simon B, Hartvég Á, Szilágyi L, Kovács L, Mosavi A, Eigner G. Personalized food consumption detection with deep learning and Inertial Measurement Unit sensor. Comput Biol Med 2024;182:109167. [PMID: 39326266] [DOI: 10.1016/j.compbiomed.2024.109167]
Abstract
For individuals diagnosed with diabetes mellitus, it is crucial to keep a record of the carbohydrates consumed during meals; this should be done at least three times daily, and on average covers six meals. Unfortunately, many individuals tend to overlook this essential task. For those who use an artificial pancreas, carbohydrate intake is a critical input, as it can prompt the insulin pump in the artificial pancreas to deliver insulin to the body. To address this need, we have developed a personalized deep learning model that detects carbohydrate intake with a high degree of accuracy. Our study employed a publicly available dataset gathered with an Inertial Measurement Unit (IMU), which included accelerometer and gyroscope data. The data were sampled at a rate of 15 Hz, necessitating preprocessing. For our patient-tailored model, we utilized a recurrent network comprising long short-term memory (LSTM) layers. Our findings revealed a median F1 score of 0.99, indicating a high level of accuracy, and the confusion matrix displayed a difference of only 6 s, further validating the model. Our model performed well above 90% on the dataset, with most results between 98% and 99%. The recurrent networks improved the problem-solving capabilities significantly, though some outliers remained. The model's average prediction latency was 5.5 s, suggesting that later meal predictions result in extended meal-progress predictions. The dataset's limitation of mostly single-day data points raises questions about multi-day performance, which could be explored by collecting multi-day data, including night periods. Future enhancements might involve transformer networks and shorter time windows to improve model responsiveness and accuracy. Therefore, we can confidently assert that our model exhibits a high degree of accuracy.
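The preprocessing step the abstract describes — segmenting a 15 Hz IMU stream into fixed-length sequences for a recurrent (LSTM) classifier — can be sketched as follows. The window length and overlap here are illustrative assumptions, not values taken from the paper.

```python
def sliding_windows(samples, window_len, step):
    """Segment a stream of IMU samples (e.g. 15 Hz accelerometer/gyroscope
    readings) into fixed-length, possibly overlapping windows that can serve
    as input sequences for a recurrent model."""
    windows = []
    for start in range(0, len(samples) - window_len + 1, step):
        windows.append(samples[start:start + window_len])
    return windows

# 60 samples at 15 Hz = 4 s of data; 2 s windows with 50% overlap (assumed values)
stream = list(range(60))
wins = sliding_windows(stream, window_len=30, step=15)
```

In practice each sample would be a 6-dimensional vector (3-axis accelerometer plus 3-axis gyroscope) rather than a scalar, but the segmentation logic is the same.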
Affiliation(s)
- Lehel Dénes-Fazakas
- Physiological Controls Research Center, University Research and Innovation Center, Obuda University, Bécsi út 96/b, Budapest, 1034, Hungary; Biomatics and Applied Artificial Intelligence Institute, John von Neumann Faculty of Informatics, Obuda University, Bécsi út 96/b, Budapest, 1034, Hungary; Doctoral School of Applied Informatics and Applied Mathematics, Obuda University, Bécsi út 96/b, Budapest, 1034, Hungary.
- Barbara Simon
- Biomatics and Applied Artificial Intelligence Institute, John von Neumann Faculty of Informatics, Obuda University, Bécsi út 96/b, Budapest, 1034, Hungary.
- Ádám Hartvég
- Biomatics and Applied Artificial Intelligence Institute, John von Neumann Faculty of Informatics, Obuda University, Bécsi út 96/b, Budapest, 1034, Hungary.
- László Szilágyi
- Physiological Controls Research Center, University Research and Innovation Center, Obuda University, Bécsi út 96/b, Budapest, 1034, Hungary; Biomatics and Applied Artificial Intelligence Institute, John von Neumann Faculty of Informatics, Obuda University, Bécsi út 96/b, Budapest, 1034, Hungary; Computational Intelligence Research Group, Sapientia Hungarian University of Transylvania, Tirgu Mures, Romania.
- Levente Kovács
- Physiological Controls Research Center, University Research and Innovation Center, Obuda University, Bécsi út 96/b, Budapest, 1034, Hungary; Biomatics and Applied Artificial Intelligence Institute, John von Neumann Faculty of Informatics, Obuda University, Bécsi út 96/b, Budapest, 1034, Hungary.
- Amir Mosavi
- Biomatics and Applied Artificial Intelligence Institute, John von Neumann Faculty of Informatics, Obuda University, Bécsi út 96/b, Budapest, 1034, Hungary.
- György Eigner
- Physiological Controls Research Center, University Research and Innovation Center, Obuda University, Bécsi út 96/b, Budapest, 1034, Hungary; Biomatics and Applied Artificial Intelligence Institute, John von Neumann Faculty of Informatics, Obuda University, Bécsi út 96/b, Budapest, 1034, Hungary.
2. Smart Piezoelectric-Based Wearable System for Calorie Intake Estimation Using Machine Learning. Appl Sci (Basel) 2022. [DOI: 10.3390/app12126135]
Abstract
Eating an appropriate food volume, maintaining the required calorie count, and making good nutritional choices are key factors for reducing the risk of obesity, which has many consequences, such as osteoarthritis (OA) affecting the patient’s knee. In this paper, we present a wearable sensor in the form of a necklace embedded with a piezoelectric sensor that detects skin movement from the lower trachea while eating. In contrast to the previous state-of-the-art piezoelectric sensor-based system, which used spectral features, our system fully exploits temporal amplitude-varying signals for optimal features and thus classifies foods more accurately. Through evaluation of the frame length and the position of swallowing in the frame, we found the best performance with a frame length of 30 samples (1.5 s), with swallowing located towards the end of the frame. This demonstrates that the chewing sequence carries important information for classification. Additionally, we present a new approach in which the weight of solid food can be estimated from the swallow count, and the calorie count of food can be calculated from its estimated weight. Our system, based on a smartphone app, helps users live healthily by providing them with real-time feedback about their ingested food types, volume, and calorie count.
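The weight-then-calories estimation described above can be sketched as a two-step calculation. The per-swallow weight and energy density below are hypothetical placeholders; in the paper's approach these would be calibrated per food type.

```python
def estimate_calories(swallow_count, grams_per_swallow, kcal_per_gram):
    """Estimate ingested weight from the swallow count, then calories from
    the estimated weight, mirroring the two-step approach described above.
    Both constants are food-specific and would need per-food calibration;
    the values used below are placeholders, not figures from the paper."""
    weight_g = swallow_count * grams_per_swallow
    return weight_g * kcal_per_gram

# Hypothetical example: 12 swallows of a food at 8 g/swallow and 2.5 kcal/g
kcal = estimate_calories(12, grams_per_swallow=8.0, kcal_per_gram=2.5)
```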
3. Sharma S, Hoover A. Top-Down Detection of Eating Episodes by Analyzing Large Windows of Wrist Motion Using a Convolutional Neural Network. Bioengineering (Basel) 2022;9:70. [PMID: 35200423] [PMCID: PMC8869422] [DOI: 10.3390/bioengineering9020070]
Abstract
In this work, we describe a new method to detect periods of eating by tracking wrist motion during everyday life. Eating uses hand-to-mouth gestures for ingestion, each of which lasts a few seconds. Previous works have detected these gestures individually and then aggregated them to identify meals. The novelty of our approach is that we analyze a much longer window (0.5–15 min) using a convolutional neural network. Longer windows can contain other gestures related to eating, such as cutting or manipulating food, preparing foods for consumption, and resting between ingestion events. The context of these other gestures can improve the detection of periods of eating. We test our methods on the public Clemson all-day dataset, which consists of 354 recordings containing 1063 eating episodes. We found that accuracy at detecting eating increased by 15% in ≥4 min windows compared to ≤15 s windows. Using a 6 min window, we detected 89% of eating episodes, with 1.7 false positives for every true positive (FP/TP). These are the best results achieved to date on this dataset.
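The episode-level evaluation above (fraction of eating episodes detected, false positives per true positive) can be sketched with a simple interval-overlap matcher. The matching rule — any temporal overlap counts as a detection — is an assumption for illustration, not necessarily the paper's exact protocol.

```python
def episode_metrics(true_episodes, detected):
    """Match detected eating intervals against ground-truth episodes and
    report the detection rate and false positives per true positive (FP/TP).
    An episode counts as detected if any detection overlaps it; detections
    overlapping no episode are false positives."""
    def overlaps(a, b):
        return a[0] < b[1] and b[0] < a[1]
    tp = sum(1 for ep in true_episodes if any(overlaps(ep, d) for d in detected))
    fp = sum(1 for d in detected if not any(overlaps(d, ep) for ep in true_episodes))
    detection_rate = tp / len(true_episodes) if true_episodes else 0.0
    fp_per_tp = fp / tp if tp else float("inf")
    return detection_rate, fp_per_tp

# Hypothetical intervals in minutes: two of three episodes found, one spurious detection
rate, fptp = episode_metrics([(0, 10), (60, 75), (200, 220)],
                             [(2, 8), (62, 70), (300, 305)])
```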
4. Stankoski S, Jordan M, Gjoreski H, Luštrek M. Smartwatch-Based Eating Detection: Data Selection for Machine Learning from Imbalanced Data with Imperfect Labels. Sensors (Basel) 2021;21:1902. [PMID: 33803121] [PMCID: PMC7963188] [DOI: 10.3390/s21051902]
Abstract
Understanding people's eating habits plays a crucial role in interventions promoting a healthy lifestyle. This requires objective measurement of the time at which a meal takes place, the duration of the meal, and what the individual eats. Smartwatches and similar wrist-worn devices are an emerging technology that offers the possibility of practical and real-time eating monitoring in an unobtrusive, accessible, and affordable way. To this end, we present a novel approach for the detection of eating segments with a wrist-worn device, fusing deep and classical machine learning. It integrates a novel data selection method to create the training dataset, and a method that incorporates knowledge from raw and virtual sensor modalities for training with highly imbalanced datasets. The proposed method was evaluated using data from 12 subjects recorded in the wild, without any restriction on the type of meals that could be consumed, the cutlery used for the meal, or the location where the meal took place. The recordings consist of data from accelerometer and gyroscope sensors. The experiments show that our method for detection of eating segments achieves a precision of 0.85, recall of 0.81, and F1-score of 0.82 in a person-independent manner. The results obtained in this study indicate that reliable eating detection from data recorded in the wild is possible with the use of wearable sensors on the wrist.
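Training on highly imbalanced eating/non-eating data typically involves some form of majority-class reduction. The sketch below shows plain random undersampling as a deliberately simple stand-in; the paper's actual data selection method is more sophisticated than this.

```python
import random

def undersample(samples, labels, majority_label, keep_ratio, seed=0):
    """Randomly drop a fraction of majority-class windows to rebalance a
    highly imbalanced training set (non-eating windows vastly outnumber
    eating windows in free-living recordings). Minority-class samples are
    always kept."""
    rng = random.Random(seed)
    kept = [(s, l) for s, l in zip(samples, labels)
            if l != majority_label or rng.random() < keep_ratio]
    return [s for s, _ in kept], [l for _, l in kept]

# Toy stream: 950 non-eating windows vs. 50 eating windows
xs = list(range(1000))
ys = ["non-eating"] * 950 + ["eating"] * 50
bx, by = undersample(xs, ys, majority_label="non-eating", keep_ratio=0.1)
```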
Affiliation(s)
- Simon Stankoski
- Department of Intelligent Systems, Jožef Stefan Institute, 1000 Ljubljana, Slovenia
- Jožef Stefan International Postgraduate School, 1000 Ljubljana, Slovenia
- Marko Jordan
- Department of Intelligent Systems, Jožef Stefan Institute, 1000 Ljubljana, Slovenia
- Hristijan Gjoreski
- Faculty of Electrical Engineering and Information Technologies, Ss. Cyril and Methodius University, 1000 Skopje, North Macedonia
- Mitja Luštrek
- Department of Intelligent Systems, Jožef Stefan Institute, 1000 Ljubljana, Slovenia
- Jožef Stefan International Postgraduate School, 1000 Ljubljana, Slovenia
5. Konstantinidis D, Dimitropoulos K, Langlet B, Daras P, Ioakimidis I. Validation of a Deep Learning System for the Full Automation of Bite and Meal Duration Analysis of Experimental Meal Videos. Nutrients 2020;12:209. [PMID: 31941145] [PMCID: PMC7020058] [DOI: 10.3390/nu12010209]
Abstract
Eating behavior can have an important effect on, and be correlated with, obesity and eating disorders. Eating behavior is usually estimated through self-reporting measures because of their ease of collection and analysis, despite their limited reliability. A better and widely used alternative is the objective analysis of eating during meals based on human annotations of in-meal behavioral events (e.g., bites). However, this methodology is time-consuming and often affected by human error, limiting its scalability and cost-effectiveness for large-scale research. To remedy the latter, a novel "Rapid Automatic Bite Detection" (RABiD) algorithm that extracts and processes skeletal features from videos was trained on a video meal dataset (59 individuals; 85 meals; three different foods) to automatically measure meal duration and bites. In these settings, RABiD achieved near-perfect agreement between algorithmic and human annotations (Cohen's kappa κ = 0.894; F1-score: 0.948). Moreover, RABiD was used to analyze an independent eating behavior experiment (18 female participants; 45 meals; three different foods), and the results showed excellent correlation between algorithmic and human annotations. The analyses revealed that, despite the change in food (hash vs. meatballs), the total meal duration remained the same, while the number of bites was significantly reduced. Finally, a descriptive meal-progress analysis revealed that different types of food affect bite frequency, although overall bite patterns remain similar (the outcomes were the same for RABiD and manual annotations). Subjects took bites more frequently at the beginning and the end of meals but were slower in between. On a methodological level, RABiD offers a valid, fully automatic alternative to human meal-video annotations for the experimental analysis of human eating behavior, at a fraction of the cost and the required time, without any loss of information and data fidelity.
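Cohen's kappa, the agreement statistic reported above, can be computed from two annotation sequences with a few lines of standard-library Python:

```python
def cohens_kappa(a, b):
    """Cohen's kappa between two annotation sequences of equal length
    (e.g. algorithmic vs. human frame labels): observed agreement corrected
    for the agreement expected by chance from each rater's label frequencies."""
    assert len(a) == len(b) and a
    n = len(a)
    labels = set(a) | set(b)
    po = sum(1 for x, y in zip(a, b) if x == y) / n        # observed agreement
    pe = sum((a.count(l) / n) * (b.count(l) / n) for l in labels)  # chance agreement
    return (po - pe) / (1 - pe) if pe != 1 else 1.0

# Toy example: two raters labelling frames as bite (1) / no-bite (0)
k = cohens_kappa([1, 0, 1, 1, 0, 0, 1, 0], [1, 0, 1, 0, 0, 0, 1, 0])
```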
Affiliation(s)
- Kosmas Dimitropoulos
- Visual Computing Lab, CERTH-ITI, 57001 Thessaloniki, Greece
- Billy Langlet
- Innovative Use of Mobile Phones to Promote Physical Activity and Nutrition across the Lifespan (the IMPACT) Research Group, Department of Biosciences and Nutrition, Karolinska Institutet, 14152 Stockholm, Sweden
- Petros Daras
- Visual Computing Lab, CERTH-ITI, 57001 Thessaloniki, Greece
- Ioannis Ioakimidis
- Innovative Use of Mobile Phones to Promote Physical Activity and Nutrition across the Lifespan (the IMPACT) Research Group, Department of Biosciences and Nutrition, Karolinska Institutet, 14152 Stockholm, Sweden
6. Kyritsis K, Diou C, Delopoulos A. End-to-end Learning for Measuring in-meal Eating Behavior from a Smartwatch. Annu Int Conf IEEE Eng Med Biol Soc 2018:5511-5514. [PMID: 30441585] [DOI: 10.1109/embc.2018.8513627]
Abstract
In this paper, we propose an end-to-end neural network (NN) architecture for detecting in-meal eating events (i.e., bites), using only a commercially available smartwatch. Our method combines convolutional and recurrent networks and is able to simultaneously learn intermediate data representations related to hand movements, as well as sequences of these movements that appear during eating. A promising F-score of 0.884 is achieved for detecting bites on a publicly available dataset with 10 subjects.
7. Eating and Drinking Recognition in Free-Living Conditions for Triggering Smart Reminders. Sensors (Basel) 2019;19:2803. [PMID: 31234499] [PMCID: PMC6631238] [DOI: 10.3390/s19122803]
Abstract
The increasingly aging society in developed countries has drawn attention to the role of technology in seniors’ lives, namely concerning isolation-related issues. Independent seniors who live alone frequently neglect meals, hydration, and proper medication-taking behavior. This work aims at eating and drinking recognition in free-living conditions for triggering smart reminders to autonomously living seniors, keeping system design considerations, namely usability and senior-acceptance criteria, in the loop. To that end, we conceived a new dataset featuring accelerometer and gyroscope wrist data to conduct the experiments. We assessed the performance of a single multi-class classification model against several binary classification models, one for each activity of interest (eating vs. non-eating; drinking vs. non-drinking). Binary classification models performed consistently better for all tested classifiers (k-NN, Naive Bayes, Decision Tree, Multilayer Perceptron, Random Forest, HMM). This evidence supported the proposal of a semi-hierarchical activity recognition algorithm that enabled the implementation of two distinct data stream segmentation techniques, the customization of the classification models for each activity of interest, and the establishment of a set of restrictions applied on top of the classification output, based on daily evidence. An F1-score of 97% was finally attained for the simultaneous recognition of eating and drinking in an all-day acquisition from one young user, and 93% on a test set with 31 h of data from 5 different unseen users, 2 of whom were seniors. These results were deemed very promising towards solving the problem of food and fluid intake monitoring with practical systems that shall maximize user acceptance.
8. Heydarian H, Adam M, Burrows T, Collins C, Rollo ME. Assessing Eating Behaviour Using Upper Limb Mounted Motion Sensors: A Systematic Review. Nutrients 2019;11:1168. [PMID: 31137677] [PMCID: PMC6566929] [DOI: 10.3390/nu11051168]
Abstract
Wearable motion tracking sensors are now widely used to monitor physical activity, and have recently gained more attention in dietary monitoring research. The aim of this review is to synthesise research to date that utilises upper limb motion tracking sensors, either individually or in combination with other technologies (e.g., cameras, microphones), to objectively assess eating behaviour. Eleven electronic databases were searched in January 2019, and 653 distinct records were obtained. Including 10 studies found in backward and forward searches, a total of 69 studies met the inclusion criteria, with 28 published since 2017. Fifty studies were conducted exclusively in laboratory settings, 13 exclusively in free-living settings, and three in both settings. The most commonly used motion sensor was an accelerometer (n = 64), worn on the wrist (n = 60) or lower arm (n = 5), and in most studies (n = 45) accelerometers were used in combination with gyroscopes. Twenty-six studies used commercial-grade smartwatches or fitness bands, 11 used professional-grade devices, and 32 used standalone sensor chipsets. The most used machine learning approaches were Support Vector Machine (SVM, n = 21), Random Forest (n = 19), Decision Tree (n = 16), and Hidden Markov Model (HMM, n = 10) algorithms, and, from 2017, Deep Learning (n = 5). While direct comparisons of the detection models are not valid due to the use of different datasets, the models that consider the sequential context of data across time, such as HMM and Deep Learning, show promising results for eating activity detection. We discuss opportunities for future research and emerging applications in the context of dietary assessment and monitoring.
Affiliation(s)
- Hamid Heydarian
- School of Electrical Engineering and Computing, Faculty of Engineering and Built Environment, The University of Newcastle, Callaghan, NSW 2308, Australia.
- Marc Adam
- School of Electrical Engineering and Computing, Faculty of Engineering and Built Environment, The University of Newcastle, Callaghan, NSW 2308, Australia.
- Priority Research Centre for Physical Activity and Nutrition, The University of Newcastle, Callaghan, NSW 2308, Australia.
- Tracy Burrows
- Priority Research Centre for Physical Activity and Nutrition, The University of Newcastle, Callaghan, NSW 2308, Australia.
- School of Health Sciences, Faculty of Health and Medicine, The University of Newcastle, Callaghan, NSW 2308, Australia.
- Clare Collins
- Priority Research Centre for Physical Activity and Nutrition, The University of Newcastle, Callaghan, NSW 2308, Australia.
- School of Health Sciences, Faculty of Health and Medicine, The University of Newcastle, Callaghan, NSW 2308, Australia.
- Megan E Rollo
- Priority Research Centre for Physical Activity and Nutrition, The University of Newcastle, Callaghan, NSW 2308, Australia.
- School of Health Sciences, Faculty of Health and Medicine, The University of Newcastle, Callaghan, NSW 2308, Australia.
9. Gomes D, Sousa I. Real-Time Drink Trigger Detection in Free-living Conditions Using Inertial Sensors. Sensors (Basel) 2019;19:2145. [PMID: 31075843] [PMCID: PMC6539019] [DOI: 10.3390/s19092145]
Abstract
Despite the importance of maintaining an adequate hydration status, water intake is frequently neglected due to the fast pace of people’s lives. For the elderly, poor water intake can be even more concerning, not only due to the damaging impact of dehydration, but also since seniors’ hydration regulation mechanisms tend to be less efficient. This work focuses on the recognition of the pre-drinking hand-to-mouth movement (a drink trigger) with two main objectives: predict the occurrence of drinking events in real-time and free-living conditions, and assess the potential of using this method to trigger an external component for estimating the amount of fluid intake. This shall contribute towards the efficiency of more robust multimodal approaches addressing the problem of water intake monitoring. The system, based on a single inertial measurement unit placed on the forearm, is unobtrusive, user-independent, and lightweight enough for real-time mobile processing. Drinking events outside meal periods were detected with an F-score of 97% in an offline validation with data from 12 users, and 85% in a real-time free-living validation with five other subjects, using a random forest classifier. Our results also reveal that the algorithm first detects the hand-to-mouth movement 0.70 s before the occurrence of the actual sip of the drink, proving that this approach can have further applications and enable more robust and complete fluid intake monitoring solutions.
Affiliation(s)
- Diana Gomes
- Fraunhofer Portugal AICOS, 4200-135 Porto, Portugal.
- Inês Sousa
- Fraunhofer Portugal AICOS, 4200-135 Porto, Portugal.
10. Kyritsis K, Diou C, Delopoulos A. Modeling Wrist Micromovements to Measure In-Meal Eating Behavior From Inertial Sensor Data. IEEE J Biomed Health Inform 2019;23:2325-2334. [PMID: 30629523] [DOI: 10.1109/jbhi.2019.2892011]
Abstract
Overweight and obesity are both associated with in-meal eating parameters such as eating speed. Recently, the plethora of available wearable devices in the market ignited the interest of both the scientific community and the industry toward unobtrusive solutions for eating behavior monitoring. In this paper, we present an algorithm for automatically detecting the in-meal food intake cycles using the inertial signals (acceleration and orientation velocity) from an off-the-shelf smartwatch. We use five specific wrist micromovements to model the series of actions leading to and following an intake event (i.e., bite). Food intake detection is performed in two steps. In the first step, we process windows of raw sensor streams and estimate their micromovement probability distributions by means of a convolutional neural network. In the second step, we use a long short-term memory network to capture the temporal evolution and classify sequences of windows as food intake cycles. Evaluation is performed using a challenging dataset of 21 meals from 12 subjects. In our experiments, we compare the performance of our algorithm against three state-of-the-art approaches, where our approach achieves the highest F1 detection score (0.913 in the leave-one-subject-out experiment). The dataset used in the experiments is available at https://mug.ee.auth.gr/intake-cycle-detection/.
11. Kyritsis K, Tatli CL, Diou C, Delopoulos A. Automated analysis of in meal eating behavior using a commercial wristband IMU sensor. Annu Int Conf IEEE Eng Med Biol Soc 2017:2843-2846. [PMID: 29060490] [DOI: 10.1109/embc.2017.8037449]
Abstract
Automatic objective monitoring of eating behavior using inertial sensors is a research problem that has received a lot of attention recently, mainly due to the mass availability of IMUs and the evidence on the importance of quantifying and monitoring eating patterns. In this paper we propose a method for detecting food intake cycles during the course of a meal using a commercially available wristband. We first model micro-movements that are part of the intake cycle and then use HMMs to model the sequences of micro-movements leading to mouthfuls. Evaluation is carried out on an annotated dataset of 8 subjects where the proposed method achieves 0.78 precision and 0.77 recall. The evaluation dataset is publicly available at http://mug.ee.auth.gr/intake-cycle-detection/.
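The F-score that ties together the precision and recall figures reported throughout these papers is simply their harmonic mean; a minimal sketch using the values reported above:

```python
def f1(precision, recall):
    """Harmonic mean of precision and recall (the F1 / F-score
    used throughout these papers)."""
    return 2 * precision * recall / (precision + recall)

# Using the precision/recall reported above (0.78 / 0.77)
score = f1(0.78, 0.77)
```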
12. Ramos-Garcia RI, Tiffany S, Sazonov E. Using respiratory signals for the recognition of human activities. Annu Int Conf IEEE Eng Med Biol Soc 2016:173-176. [PMID: 28268307] [DOI: 10.1109/embc.2016.7590668]
Abstract
Human activity recognition through wearable sensors is becoming integral to health monitoring and other applications. Typically, human activity is captured through signals from inertial sensors, while signals from other sensors have been utilized less frequently. In this study, we explored the feasibility of classifying human activities by analyzing the temporal information of respiratory signals through hidden Markov models (HMMs). Left-to-right HMMs were trained for five activities: sedentary, walking, eating, talking, and cigarette smoking. The temporal information from every breathing segment was captured by fragmenting the tidal volume and airflow signals into smaller frames and computing features for each frame. These frames were used as observations to model the states of the HMMs through mixtures of Gaussians. Using leave-one-out cross-validation, the classification performance showed an average precision, recall, and F-score of 60.37%, 67.01%, and 62.78%, respectively. Results suggest that respiratory signals can potentially be used as a primary or secondary source in the recognition of some human activities.
13. Salley J, Muth E, Hoover A. Assessing the Accuracy of a Wrist Motion Tracking Method for Counting Bites Across Demographic and Food Variables. IEEE J Biomed Health Inform 2016;21:599-606. [PMID: 28113994] [DOI: 10.1109/jbhi.2016.2612580]
Abstract
This paper describes a study to test the accuracy of a method that tracks wrist motion during eating to detect and count bites. The purpose was to assess its accuracy across demographic (age, gender, and ethnicity) and bite (utensil, container, hand used, and food type) variables. Data were collected in a cafeteria under normal eating conditions. A total of 271 participants ate a single meal while wearing a watch-like device to track their wrist motion. A video was simultaneously recorded of each participant and subsequently reviewed to determine the ground truth times of bites. Bite times were operationally defined as the moment when food or beverage was placed into the mouth. Food and beverage choices were not scripted or restricted. Participants were seated in groups of 2-4 and were encouraged to eat naturally. A total of 24 088 bites of 374 different food and beverage items were consumed. Overall, the method for automatically detecting bites had a sensitivity of 75% with a positive predictive value of 89%. A range of 62-86% sensitivity was found across demographic variables, with slower eating rates trending toward higher sensitivity. Variations in sensitivity due to food type showed a modest correlation with the total wrist motion during the bite, possibly due to an increase in head-toward-plate motion and a decrease in hand-toward-mouth motion for some food types. Overall, the findings provide the strongest evidence to date that the method produces a reliable automated measure of intake during unrestricted eating.
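Sensitivity and positive predictive value, the two accuracy measures reported above, follow directly from bite-level true-positive, false-positive, and false-negative counts. The counts below are hypothetical, chosen only to reproduce percentages of the same order as those reported:

```python
def bite_detection_stats(tp, fp, fn):
    """Sensitivity (recall) and positive predictive value (precision)
    from bite-level detection counts."""
    sensitivity = tp / (tp + fn)   # detected bites / all true bites
    ppv = tp / (tp + fp)           # detected bites / all detections
    return sensitivity, ppv

# Hypothetical counts illustrating ~75% sensitivity and ~89% PPV
sens, ppv = bite_detection_stats(tp=75, fp=9, fn=25)
```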