1
Dave D, Vyas K, Branan K, McKay S, DeSalvo DJ, Gutierrez-Osuna R, Cote GL, Erraguntla M. Detection of Hypoglycemia and Hyperglycemia Using Noninvasive Wearable Sensors: Electrocardiograms and Accelerometry. J Diabetes Sci Technol 2024;18:351-362. PMID: 35927975; PMCID: PMC10973850; DOI: 10.1177/19322968221116393
Abstract
BACKGROUND Monitoring glucose excursions is important in diabetes management. This can be achieved using continuous glucose monitors (CGMs); however, CGMs are expensive and invasive. Thus, alternative low-cost, noninvasive wearable sensors capable of predicting glycemic excursions could be a game changer in diabetes management. METHODS In this article, we explore two noninvasive sensor modalities, electrocardiograms (ECGs) and accelerometry, collected from five healthy participants over two weeks, to predict both hypoglycemic and hyperglycemic excursions. We extracted 29 features encompassing heart rate variability features from the ECG and time- and frequency-domain features from the accelerometer. We evaluated two machine learning approaches to predict glycemic excursions: a classification model and a regression model. RESULTS The best model for both hypoglycemia and hyperglycemia detection was the regression model based on combined ECG and accelerometer data, yielding 76% sensitivity and specificity for hypoglycemia and 79% sensitivity and specificity for hyperglycemia, a 5% improvement in both metrics over using ECG data alone. CONCLUSIONS The ECG is a promising noninvasive modality not only for detecting hypoglycemia but also for predicting hyperglycemia. Supplementing ECG data with contextual information from an accelerometer can further improve glucose prediction.
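As an illustration of the feature-extraction step this abstract describes, the sketch below computes a few common heart rate variability statistics from RR intervals and simple time-domain statistics from triaxial acceleration. The specific features shown (SDNN, RMSSD, pNN50, magnitude statistics) are typical examples of these feature families, not the paper's exact 29-feature set:

```python
import numpy as np

def hrv_features(rr_ms):
    """A few standard heart-rate-variability features from RR intervals (ms)."""
    rr = np.asarray(rr_ms, dtype=float)
    diffs = np.diff(rr)                            # successive beat-to-beat changes
    return {
        "mean_rr": rr.mean(),                      # average beat interval
        "sdnn": rr.std(ddof=1),                    # overall variability
        "rmssd": np.sqrt(np.mean(diffs ** 2)),     # short-term variability
        "pnn50": np.mean(np.abs(diffs) > 50.0),    # fraction of changes > 50 ms
    }

def accel_features(ax, ay, az):
    """Simple time-domain features of the triaxial acceleration magnitude."""
    mag = np.sqrt(np.asarray(ax, float) ** 2
                  + np.asarray(ay, float) ** 2
                  + np.asarray(az, float) ** 2)
    return {"mag_mean": mag.mean(),
            "mag_std": mag.std(ddof=1),
            "mag_range": mag.max() - mag.min()}
```

Feature vectors assembled this way would then be fed to the classification or regression models the authors compare.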
Affiliation(s)
- Darpit Dave: Wm Michael Barnes '64 Department of Industrial and Systems Engineering, Texas A&M University, College Station, TX, USA
- Kathan Vyas: Department of Computer Science and Engineering, Texas A&M University, College Station, TX, USA
- Kimberly Branan: Department of Biomedical Engineering, Texas A&M University, College Station, TX, USA
- Siripoom McKay: Baylor College of Medicine, Houston, TX, USA; Texas Children’s Hospital Clinical Care Center, Houston, TX, USA
- Daniel J. DeSalvo: Baylor College of Medicine, Houston, TX, USA; Texas Children’s Hospital Clinical Care Center, Houston, TX, USA
- Ricardo Gutierrez-Osuna: Department of Computer Science and Engineering, Texas A&M University, College Station, TX, USA
- Gerard L. Cote: Department of Biomedical Engineering, Texas A&M University, College Station, TX, USA
- Madhav Erraguntla: Wm Michael Barnes '64 Department of Industrial and Systems Engineering, Texas A&M University, College Station, TX, USA
2
Concha-Pérez E, Gonzalez-Hernandez HG, Reyes-Avendaño JA. Physical Exertion Recognition Using Surface Electromyography and Inertial Measurements for Occupational Ergonomics. Sensors (Basel) 2023;23:9100. PMID: 38005488; PMCID: PMC10674923; DOI: 10.3390/s23229100
Abstract
By observing the actions taken by operators, it is possible to determine the risk level of a work task. One method for achieving this is human activity recognition, in which biosignals and inertial measurements are provided to a machine learning algorithm that performs the recognition. The aim of this research is to propose a method to automatically recognize physical exertion, with noise reduced as much as possible, as a step toward automating the Job Strain Index (JSI) assessment. The method uses a motion capture wearable device (MindRove armband) and a trained quadratic support vector machine (QSVM) model, which predicts exertion from the identified patterns. The highest accuracy of the QSVM model was 95.7%, achieved by filtering the data, removing outliers and offsets, and performing zero calibration; in addition, EMG signals were normalized. Given the purpose of the Job Strain Index, physical exertion detection is crucial to computing its intensity in future work.
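The signal-conditioning steps summarized above (offset removal via zero calibration, outlier suppression, normalization) can be sketched as follows. The function and parameter names (`rest_window`, `clip_sigma`) are hypothetical illustrations, and the study itself fed the cleaned signals to a quadratic SVM rather than using them directly:

```python
import numpy as np

def preprocess_emg(raw, rest_window=100, clip_sigma=3.0):
    """Illustrative sEMG conditioning: zero calibration against a resting
    baseline, sigma-based outlier clipping, and amplitude normalization."""
    x = np.asarray(raw, dtype=float)
    x = x - x[:rest_window].mean()        # zero calibration: subtract resting offset
    mu, sigma = x.mean(), x.std()
    # suppress outliers by clipping samples beyond clip_sigma standard deviations
    x = np.clip(x, mu - clip_sigma * sigma, mu + clip_sigma * sigma)
    peak = np.abs(x).max()
    return x / peak if peak > 0 else x    # normalize amplitude to [-1, 1]
```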
Affiliation(s)
- Hugo G. Gonzalez-Hernandez: School of Engineering and Sciences, Tecnologico de Monterrey, Monterrey 64849, NL, Mexico (shared with E.C.-P. and J.A.R.-A.)
3
Schall MC, Chen H, Cavuoto L. Wearable inertial sensors for objective kinematic assessments: A brief overview. Journal of Occupational and Environmental Hygiene 2022;19:501-508. PMID: 35853137; DOI: 10.1080/15459624.2022.2100407
Affiliation(s)
- Mark C Schall: Department of Industrial and Systems Engineering, Auburn University, Auburn, Alabama
- Howard Chen: Department of Mechanical Engineering, Auburn University, Auburn, Alabama
- Lora Cavuoto: Department of Industrial and Systems Engineering, University at Buffalo, Buffalo, New York
4
Jahromi R, Zahed K, Sasangohar F, Erraguntla M, Mehta R, Qaraqe K. Hypoglycemia Detection Using Hand Tremors: A Home Study in Patients with Type 1 Diabetes. JMIR Diabetes 2022;8:e40990. PMID: 37074783; PMCID: PMC10157461; DOI: 10.2196/40990
Abstract
BACKGROUND Diabetes affects millions of people worldwide, and its prevalence is steadily increasing. A serious condition associated with diabetes is low blood glucose (hypoglycemia). Blood glucose is usually monitored with invasive methods or intrusive devices, which are not currently available to all patients with diabetes. Hand tremor is a significant symptom of hypoglycemia, as nerves and muscles are powered by blood sugar; however, to our knowledge, no validated tools or algorithms exist to monitor and detect hypoglycemic events via hand tremors. OBJECTIVE In this paper, we propose a noninvasive method to detect hypoglycemic events from hand tremors using accelerometer data. METHODS We analyzed triaxial accelerometer data from a smartwatch, recorded from 33 patients with type 1 diabetes over 1 month. Time- and frequency-domain features were extracted from the acceleration signals, and different machine learning models were explored to classify and differentiate between hypoglycemic and nonhypoglycemic states. RESULTS The mean duration of the hypoglycemic state was 27.31 (SD 5.15) minutes per day per patient. On average, patients had 1.06 (SD 0.77) hypoglycemic events per day. An ensemble learning model based on random forests, support vector machines, and k-nearest neighbors had the best performance, with a precision of 81.5% and a recall of 78.6%. The results were validated against continuous glucose monitor readings as ground truth. CONCLUSIONS Our results indicate that the proposed approach can be a potential tool to detect hypoglycemia and can serve as a proactive, nonintrusive alert mechanism for hypoglycemic events.
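The frequency-domain side of this approach can be illustrated with a band-power feature: physiological tremor concentrates in a narrow frequency band, so the fraction of accelerometer power falling in that band is a natural input for the classifiers mentioned above. The 4-12 Hz band below is an assumed, illustrative choice, not the paper's:

```python
import numpy as np

def tremor_band_power(accel, fs, band=(4.0, 12.0)):
    """Fraction of signal power inside a hypothesized tremor frequency band."""
    x = np.asarray(accel, dtype=float)
    x = x - x.mean()                              # remove gravity/DC component
    spectrum = np.abs(np.fft.rfft(x)) ** 2        # one-sided power spectrum
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)   # bin frequencies in Hz
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    total = spectrum.sum()
    return spectrum[in_band].sum() / total if total > 0 else 0.0
```

A pure 8 Hz oscillation scores near 1.0 on this feature, while slow arm movement at 1 Hz scores near 0.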
Affiliation(s)
- Reza Jahromi: Industrial and Systems Engineering, Texas A&M University, College Station, TX, United States; Department of Computer Science and Engineering, Texas A&M University, College Station, TX, United States
- Karim Zahed: Industrial and Systems Engineering, Texas A&M University, College Station, TX, United States
- Farzan Sasangohar: Industrial and Systems Engineering, Texas A&M University, College Station, TX, United States; Center for Critical Care, Houston Methodist Hospital, Houston, TX, United States
- Madhav Erraguntla: Industrial and Systems Engineering, Texas A&M University, College Station, TX, United States
- Ranjana Mehta: Industrial and Systems Engineering, Texas A&M University, College Station, TX, United States
5
Lamooki SR, Hajifar S, Kang J, Sun H, Megahed FM, Cavuoto LA. A data analytic end-to-end framework for the automated quantification of ergonomic risk factors across multiple tasks using a single wearable sensor. Applied Ergonomics 2022;102:103732. PMID: 35287084; DOI: 10.1016/j.apergo.2022.103732
Abstract
Existing ergonomic risk assessment tools require monitoring of multiple risk factors. To eliminate the need for direct observation, we investigated the effectiveness of an end-to-end framework that works with data from a single wearable sensor. The framework identifies the performed task as the major contextual risk factor and then estimates the task duration and number of repetitions as two main indicators of task intensity. To evaluate the framework, we recruited 37 participants to complete 10 simulated work tasks in a laboratory setting. In testing, we achieved an average accuracy of 92% for task identification, a 7.3% error in estimating task duration, and a 7.1% error in counting task repetitions. Moreover, we showed the utility of the framework outputs in two ergonomic tools for estimating the risk of injury. Overall, we demonstrated the feasibility of using data from wearable sensors to automate ergonomic risk assessment in workplaces.
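The repetition-counting component of such a framework reduces, in its simplest form, to counting upward threshold crossings of a motion signal. The sketch below is a minimal stand-in for the paper's estimator, not its actual method:

```python
import numpy as np

def count_repetitions(signal, threshold):
    """Count task repetitions as upward crossings of a fixed threshold."""
    x = np.asarray(signal, dtype=float)
    above = x > threshold
    # a repetition starts where the signal rises from below to above threshold
    rises = np.flatnonzero(~above[:-1] & above[1:])
    return len(rises)
```

Task duration would follow analogously from the span between the first and last crossing of a detected task segment.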
Affiliation(s)
- Saeb Ragani Lamooki: Department of Mechanical Engineering, University at Buffalo, Buffalo, NY, 14260, USA
- Sahand Hajifar: Department of Industrial and Systems Engineering, University at Buffalo, Buffalo, NY, 14260, USA
- Jiyeon Kang: Department of Mechanical and Aerospace Engineering, University at Buffalo, Buffalo, NY, 14260, USA
- Hongyue Sun: Department of Industrial and Systems Engineering, University at Buffalo, Buffalo, NY, 14260, USA
- Fadel M Megahed: Farmer School of Business, Miami University, Oxford, OH, 45056, USA
- Lora A Cavuoto: Department of Industrial and Systems Engineering, University at Buffalo, Buffalo, NY, 14260, USA
6
Muşat EC, Borz SA. Learning from Acceleration Data to Differentiate the Posture, Dynamic and Static Work of the Back: An Experimental Setup. Healthcare (Basel) 2022;10:916. PMID: 35628053; PMCID: PMC9140631; DOI: 10.3390/healthcare10050916
Abstract
Information on body posture, postural change, and dynamic and static work is essential for understanding biomechanical exposure and has many applications in ergonomics and healthcare. This study aimed to evaluate the possibility of using triaxial acceleration data to classify postures and to differentiate between dynamic and static work of the back in an experimental setup, based on a machine learning (ML) approach. A movement protocol was designed to cover the essential degrees of freedom of the back, and a subject wearing a triaxial accelerometer carried out this protocol. Impulses and oscillations were removed from the signals by median filtering, and the filtered dataset was fed into two ML algorithms, namely a multilayer perceptron with back propagation (MLPBNN) and a random forest (RF), with the aim of inferring the most suitable algorithm and architecture for detecting dynamic and static work, as well as for correctly classifying the postures of the back. Training and testing subsets were then delimited and used to evaluate the learning and generalization ability of the ML algorithms on the same classification problems. The results indicate that ML has considerable potential for differentiating between dynamic and static work, depending on the type of algorithm and its architecture and on the data quantity and quality. In particular, MLPBNN can better differentiate between dynamic and static work when tuned properly. In addition, static work and the associated postures were better learned and generalized by the MLPBNN, which could provide the basis for inexpensive real-world offline applications aimed at obtaining time-scaled postural profiling data by accounting for static postures. Although it was not the case in this study, on larger datasets the use of MLPBNN may come at the expense of high computational costs in the training phase. The study also discusses factors that may improve classification performance in the testing phase and sets new directions for research.
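The dynamic-versus-static distinction that this study learns with MLPBNN and RF classifiers can be illustrated with a much simpler rule of thumb: windows of the acceleration magnitude with low variance correspond to static work. The window size and variance threshold below are arbitrary illustrative values; the study itself used trained ML models rather than a fixed rule:

```python
import numpy as np

def label_static_dynamic(mag, win=25, var_threshold=0.05):
    """Label consecutive windows of an acceleration-magnitude signal as
    'static' (low within-window variance) or 'dynamic' (high variance)."""
    labels = []
    for start in range(0, len(mag) - win + 1, win):
        w = np.asarray(mag[start:start + win], dtype=float)
        labels.append("dynamic" if w.var() > var_threshold else "static")
    return labels
```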
7
Trkov M, Stevenson DT, Merryweather AS. Classifying hazardous movements and loads during manual materials handling using accelerometers and instrumented insoles. Applied Ergonomics 2022;101:103693. PMID: 35144123; PMCID: PMC8897225; DOI: 10.1016/j.apergo.2022.103693
Abstract
Improper manual material handling (MMH) techniques have been shown to lead to low back pain, the most common work-related musculoskeletal disorder. Due to the complex nature and variability of MMH, and the obtrusiveness and subjectivity of existing hazard analysis methods, providing systematic, continuous, and automated risk assessment is challenging. We present a machine learning algorithm to detect and classify MMH tasks using minimally intrusive instrumented insoles and chest-mounted accelerometers. Six participants performed standing, walking, lifting/lowering, carrying, side-to-side load transferring (with 5.7 kg and 12.5 kg loads), and pushing/pulling. Lifting and carrying loads, as well as hazardous behaviors (i.e., stooping, overextending, and jerky lifting), were detected with average accuracies of 85.3% and 81.5% with and without the chest accelerometer, respectively. The proposed system allows continuous exposure assessment during MMH and provides objective data for use with analytical risk assessment models, which can increase workplace safety through exposure estimation.
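The sensor-fusion aspect (insoles plus a chest-mounted accelerometer) can be sketched as per-sensor summary features concatenated into one vector for the classifier. The summary statistics below are illustrative choices, not the paper's feature set:

```python
import numpy as np

def fuse_features(insole_window, chest_window):
    """Concatenate simple summary features from two sensor streams into one
    feature vector for a task/hazard classifier."""
    feats = []
    for w in (np.asarray(insole_window, dtype=float),
              np.asarray(chest_window, dtype=float)):
        # mean, spread, and extremes of each sensor window
        feats.extend([w.mean(), w.std(), w.min(), w.max()])
    return np.array(feats)
```

Dropping the chest block from the vector corresponds to the insole-only condition the authors also evaluate.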
Affiliation(s)
- Mitja Trkov: Department of Mechanical Engineering, Rowan University, Glassboro, NJ, 08028, United States
- Duncan T Stevenson: Department of Mechanical Engineering, Rowan University, Glassboro, NJ, 08028, United States
- Andrew S Merryweather: Department of Mechanical Engineering, The University of Utah, Salt Lake City, UT, 84112, United States; Rocky Mountain Center for Occupational and Environmental Health (RMCOEH), Salt Lake City, UT, 84108, United States
8
Lee S, Liu L, Radwin R, Li J. Machine Learning in Manufacturing Ergonomics: Recent Advances, Challenges, and Opportunities. IEEE Robot Autom Lett 2021. DOI: 10.1109/lra.2021.3084881
9
Porta M, Kim S, Pau M, Nussbaum MA. Classifying diverse manual material handling tasks using a single wearable sensor. Applied Ergonomics 2021;93:103386. PMID: 33609851; DOI: 10.1016/j.apergo.2021.103386
Abstract
The use of inertial measurement units (IMUs) for monitoring and classifying physical activities has received substantial attention in recent years, in both occupational and non-occupational contexts. However, a "user-friendly" approach is needed to promote the use of such sensors to quantify physical demands in actual workplaces. We explored the use of a single IMU to extract information about different manual material handling (MMH) tasks (i.e., the specific type of task performed and its associated duration and frequency), using a bidirectional long short-term memory network for classification. Classification performance using single IMUs placed on several body parts was compared with performance using multiple-IMU configurations (2, 3, and 17 IMUs). Overall, the use of a single sensor led to satisfactory results (e.g., median accuracy >97%) in classifying MMH tasks and estimating task duration and frequency. Limited benefits were obtained from additional sensors, and several sensor locations yielded similar outcomes. Classification performance, though, was relatively poorer for push/pull tasks than for the other tasks.
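A bidirectional LSTM classifier consumes fixed-length sequences, so a typical first step with a continuous IMU stream is slicing it into overlapping windows. The window and step sizes below are illustrative, not the paper's settings:

```python
import numpy as np

def make_windows(imu, win, step):
    """Slice a (T, channels) IMU stream into overlapping (win, channels)
    segments, the usual input shape for a sequence classifier."""
    imu = np.asarray(imu, dtype=float)
    starts = range(0, imu.shape[0] - win + 1, step)
    # stack into an array of shape (n_windows, win, channels)
    return np.stack([imu[s:s + win] for s in starts])
```

Per-window task labels from the network can then be aggregated over time to recover task durations and frequencies.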
Affiliation(s)
- Micaela Porta: Department of Mechanical, Chemical and Materials Engineering, University of Cagliari, Italy
- Sunwook Kim: Department of Industrial and System Engineering, Virginia Tech, Blacksburg, VA, USA
- Massimiliano Pau: Department of Mechanical, Chemical and Materials Engineering, University of Cagliari, Italy
- Maury A Nussbaum: Department of Industrial and System Engineering, Virginia Tech, Blacksburg, VA, USA