1
Kwon J, Kwon O, Oh KT, Kim J, Yoo SK. Breathing-Associated Facial Region Segmentation for Thermal Camera-Based Indirect Breathing Monitoring. IEEE Journal of Translational Engineering in Health and Medicine 2023;11:505-514. [PMID: 37817827; PMCID: PMC10561734; DOI: 10.1109/jtehm.2023.3295775] [Received: 04/06/2023; Revised: 06/21/2023; Accepted: 07/07/2023; Indexed: 10/12/2023]
Abstract
Breathing can be measured without contact using a thermal camera. This study investigates non-contact breathing measurement with thermal cameras, which has previously been limited to frontal views in which the nostril is clearly visible; earlier methods are difficult to apply at other angles, or in frontal views where the nostril is poorly represented. In this paper, we define a new region, the breathing-associated facial region (BAFR), that reflects the physiological characteristics of breathing, and we extract breathing signals from 45-degree and 90-degree views as well as frontal views in which the nostril is not clearly visible. Experiments were conducted on fifteen healthy subjects across different views: frontal with and without a visible nostril, 45-degree, and 90-degree. A thermal camera (A655sc, FLIR Systems) was used for non-contact measurement, and a Biopac system (MP150, Biopac Systems Inc.) served as the chest-breathing reference. The results show that the proposed algorithm extracts stable breathing signals at various angles and views, achieving an average breathing-cycle accuracy of 90.9%, compared with 65.6% without the proposed algorithm; the average correlation increases from 0.587 to 0.885. The proposed algorithm therefore enables breathing monitoring in a variety of environments and extracts the BAFR from diverse angles and views.
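The downstream step such a pipeline relies on — turning a segmented facial region's per-frame mean temperature into a breathing-rate estimate — can be sketched as follows. This is an illustrative sketch only, not the paper's BAFR algorithm: the function name, the trailing-moving-average detrend, the zero-crossing counter, and the synthetic data are all assumptions introduced here, and the sketch presumes the region has already been segmented.

```python
import math

def breathing_rate_from_roi(temps, fps):
    """Estimate breaths per minute from a per-frame mean-temperature series.

    Illustrative sketch: assumes the breathing-associated region has already
    been segmented and its mean temperature sampled once per frame.
    """
    n = len(temps)
    # Remove the slow thermal baseline with a trailing moving average (~2 s).
    w = max(1, int(2 * fps))
    baseline = [sum(temps[max(0, i - w):i + 1]) / (i + 1 - max(0, i - w))
                for i in range(n)]
    ac = [t - b for t, b in zip(temps, baseline)]
    # Count rising zero crossings of the detrended trace: one per breath.
    crossings = sum(1 for a, b in zip(ac, ac[1:]) if a <= 0 < b)
    duration_min = n / fps / 60.0
    return crossings / duration_min

# Synthetic 30 s recording at 10 fps: a 0.25 Hz (15 breaths/min)
# temperature oscillation around 36 degrees C.
fps = 10
temps = [36.0 + 0.3 * math.sin(2 * math.pi * 0.25 * (i / fps))
         for i in range(30 * fps)]
print(breathing_rate_from_roi(temps, fps))
```

Edge effects of the moving average shorten the usable record slightly, so the estimate lands near, not exactly at, the true rate; a longer window or spectral estimate would tighten it.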
Affiliation(s)
- Junhwan Kwon
- Department of Medical Engineering, Yonsei University College of Medicine, Seoul 03722, South Korea
- Oyun Kwon
- Department of Medical Engineering, Yonsei University College of Medicine, Seoul 03722, South Korea
- Kyeong Taek Oh
- Department of Medical Engineering, Yonsei University College of Medicine, Seoul 03722, South Korea
- Jeongmin Kim
- Department of Anesthesiology and Pain Medicine, Severance Hospital, College of Medicine, Seoul 03722, South Korea
- Sun K. Yoo
- Department of Medical Engineering, Yonsei University College of Medicine, Seoul 03722, South Korea
2
Zeng Y, Song X, Yang J, Wang W. Time-Domain Features of Angular-Velocity Signals for Camera-Based Respiratory RoI Detection: A Clinical Study in NICU. Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC) 2023;2023:1-6. [PMID: 38083770; DOI: 10.1109/embc40787.2023.10340063] [Indexed: 12/18/2023]
Abstract
Camera-based measurement of respiratory rate (RR) is emerging for the monitoring of preterm infants in Neonatal Intensive Care Units (NICUs). Accurate detection of the respiratory region of interest (Resp-RoI), e.g. the infant's thorax and abdomen, is essential both for a fully automatic solution and for high-quality RR estimation. However, the fast Fourier transform (FFT) may be inappropriate for detecting the Resp-RoI in premature infants because of their irregular breathing patterns. This study proposes a new Resp-RoI detection method for premature infants that uses time-domain features of the angular velocity of respiration. By fusing respiratory motion along orthogonal directions, the proposed method is more robust to variations in infant posture in the incubator. In addition, inter-beat-interval (IBI) features in the time domain help distinguish the Resp-RoI from the background. The proposed method was validated on 20 preterm infants in the NICU and achieves a clear improvement in both Resp-RoI detection (RoI correspondence = 0.74) and RR estimation (MAE = 3.62 bpm) over the benchmarked approach (maxFFT: RoI correspondence = 0.45, MAE = 5.61 bpm).
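The intuition behind time-domain IBI features — a true respiratory region produces regularly spaced motion peaks, while background regions do not — can be shown with a toy scorer. Everything here (the `ibi_score` function, the peak-picking rule, the coefficient-of-variation score, and the synthetic traces) is invented for illustration and is not the authors' implementation.

```python
import math
import random

def ibi_score(signal, fps):
    """Score how respiration-like a motion trace is from the regularity of
    its inter-peak intervals. Toy sketch, not the paper's feature set."""
    # Naive peak picking: local maxima above the trace mean.
    mean = sum(signal) / len(signal)
    peaks = [i for i in range(1, len(signal) - 1)
             if signal[i] > mean and signal[i - 1] < signal[i] >= signal[i + 1]]
    if len(peaks) < 3:
        return 0.0
    ibis = [(b - a) / fps for a, b in zip(peaks, peaks[1:])]
    mu = sum(ibis) / len(ibis)
    sd = math.sqrt(sum((x - mu) ** 2 for x in ibis) / len(ibis))
    # Regular intervals (low coefficient of variation) -> score near 1.
    return 1.0 / (1.0 + sd / mu)

fps = 20
t = [i / fps for i in range(20 * fps)]
# ~48 breaths/min, a plausible infant rate, vs. an aperiodic background trace.
breathing = [math.sin(2 * math.pi * 0.8 * x) for x in t]
random.seed(0)
noise = [random.uniform(-1, 1) for _ in t]
print(ibi_score(breathing, fps) > ibi_score(noise, fps))
```

Ranking candidate regions by such a score, rather than by FFT peak height, is one way to stay robust when breathing is irregular and lacks a single dominant spectral peak.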
3
van Meulen FB, Grassi A, van den Heuvel L, Overeem S, van Gilst MM, van Dijk JP, Maass H, van Gastel MJH, Fonseca P. Contactless Camera-Based Sleep Staging: The HealthBed Study. Bioengineering (Basel) 2023;10:109. [PMID: 36671681; PMCID: PMC9855193; DOI: 10.3390/bioengineering10010109] [Received: 11/25/2022; Revised: 01/06/2023; Accepted: 01/10/2023; Indexed: 01/15/2023]
Abstract
Polysomnography (PSG) remains the gold standard for sleep monitoring but is obtrusive by nature. Advances in camera sensor technology and data analysis enable contactless monitoring of heart rate variability (HRV), which in turn may allow remote assessment of sleep stages, since different HRV metrics indirectly reflect their expression. We evaluated a camera-based remote photoplethysmography (PPG) setup for automated classification of sleep stages in near darkness. Based on the contactless measurement of pulse rate variability, we used a previously developed HRV-based algorithm for 3- and 4-class sleep stage classification. Performance was evaluated on data from 46 healthy participants recorded simultaneously overnight with PSG and camera-based remote PPG. To validate the results and for benchmarking, the same algorithm was used to classify sleep stages from the corresponding ECG data. Compared with manually scored PSG, the remote PPG-based algorithm achieved moderate agreement on both 3-class (Wake-N1/N2/N3-REM) and 4-class (Wake-N1/N2-N3-REM) classification, with average κ values of 0.58 and 0.49 and accuracies of 81% and 68%, respectively. This is in line with performance metrics reported for wearable sleep-staging technologies, showing the potential of video-based non-contact sleep staging.
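The kind of HRV metrics such an algorithm consumes can be illustrated with two classic time-domain features computed from an inter-beat-interval series. This is a sketch of standard HRV feature computation, not the study's classifier: the function name and example interval series are assumptions, and the actual algorithm uses many more features than these.

```python
import math

def hrv_features(ibis_ms):
    """Two classic time-domain HRV features from inter-beat intervals (ms):
    SDNN (overall variability) and RMSSD (beat-to-beat variability)."""
    n = len(ibis_ms)
    mu = sum(ibis_ms) / n
    sdnn = math.sqrt(sum((x - mu) ** 2 for x in ibis_ms) / (n - 1))
    diffs = [b - a for a, b in zip(ibis_ms, ibis_ms[1:])]
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return {"sdnn": sdnn, "rmssd": rmssd}

# Deep sleep (N3) typically shows a steadier pulse than wake or REM,
# which is what lets HRV features separate stages at all.
steady = [1000, 1005, 995, 1002, 998, 1001, 999, 1003]
variable = [1000, 1100, 900, 1150, 850, 1200, 800, 1050]
print(hrv_features(steady)["rmssd"] < hrv_features(variable)["rmssd"])
```

With camera-based PPG, the intervals come from detected pulse peaks rather than ECG R-peaks, so pulse-rate variability stands in for HRV, with some added measurement noise.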
Affiliation(s)
- Fokke B. van Meulen (Corresponding author)
- Department of Electrical Engineering, Eindhoven University of Technology, 5612 AZ Eindhoven, The Netherlands
- Sleep Medicine Center Kempenhaeghe, 5591 VE Heeze, The Netherlands
- Angela Grassi
- Philips Research, 5656 AE Eindhoven, The Netherlands
- Sebastiaan Overeem
- Department of Electrical Engineering, Eindhoven University of Technology, 5612 AZ Eindhoven, The Netherlands
- Sleep Medicine Center Kempenhaeghe, 5591 VE Heeze, The Netherlands
- Merel M. van Gilst
- Department of Electrical Engineering, Eindhoven University of Technology, 5612 AZ Eindhoven, The Netherlands
- Sleep Medicine Center Kempenhaeghe, 5591 VE Heeze, The Netherlands
- Johannes P. van Dijk
- Department of Electrical Engineering, Eindhoven University of Technology, 5612 AZ Eindhoven, The Netherlands
- Sleep Medicine Center Kempenhaeghe, 5591 VE Heeze, The Netherlands
- Henning Maass
- Department of Electrical Engineering, Eindhoven University of Technology, 5612 AZ Eindhoven, The Netherlands
- Philips Research, 5656 AE Eindhoven, The Netherlands
- Mark J. H. van Gastel
- Department of Electrical Engineering, Eindhoven University of Technology, 5612 AZ Eindhoven, The Netherlands
- Philips Research, 5656 AE Eindhoven, The Netherlands
- Pedro Fonseca
- Department of Electrical Engineering, Eindhoven University of Technology, 5612 AZ Eindhoven, The Netherlands
- Philips Research, 5656 AE Eindhoven, The Netherlands