1
Ahmed S, Yoon S, Cho SH. A public dataset of dogs vital signs recorded with ultra wideband radar and reference sensors. Sci Data 2024; 11:107. [PMID: 38253685 PMCID: PMC10803748 DOI: 10.1038/s41597-024-02947-4] [Received: 08/30/2023] [Accepted: 01/10/2024] [Indexed: 01/24/2024]
Abstract
Recently, radar sensors have been extensively used for vital sign monitoring in dogs, owing to their noncontact and noninvasive nature. However, a public dataset on dog vital signs has yet to be proposed, since capturing data from dogs requires special training and approval. This work presents the first-ever ultra-wideband radar-based dog vital sign (UWB-DVS) dataset, captured in two independent scenarios. In the first scenario, clinical reference sensors are attached to the fainted dogs, and data from the UWB radar and reference sensors are captured synchronously. In the second scenario, the dogs can move freely, and video recordings are provided as a reference for movement detection and breathing extraction. For technical validation, a high correlation, above 0.9, is found between the radar and clinical reference sensors for both heart rate and breathing rate measurements in scenario 1. In scenario 2, the vital signs and movement of the dogs are shown in the form of dashboards, demonstrating the long-term monitoring capability of the radar sensor.
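The technical validation described in this abstract reports a correlation above 0.9 between radar-derived and clinical reference measurements. As a rough, self-contained sketch of how such an agreement check can be computed, the snippet below evaluates a Pearson correlation coefficient; the per-window heart-rate values are hypothetical and not taken from the dataset:

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical per-window heart-rate estimates (beats per minute):
hr_radar = [82, 85, 90, 88, 95, 101, 97, 93]
hr_reference = [80, 86, 91, 87, 96, 100, 98, 92]
r = pearson_r(hr_radar, hr_reference)
```

A correlation near 1.0 indicates the radar tracks the reference sensor closely over the measurement windows.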
Affiliation(s)
- Shahzad Ahmed
- Department of Electronic Engineering, Hanyang University, Seoul, 04763, South Korea
- Seongkwon Yoon
- Department of Electronic Engineering, Hanyang University, Seoul, 04763, South Korea
- Sung Ho Cho
- Department of Electronic Engineering, Hanyang University, Seoul, 04763, South Korea
2
Shi D, Liang F, Qiao J, Wang Y, Zhu Y, Lv H, Yu X, Jiao T, Liao F, Yan K, Wang J, Zhang Y. A Novel Non-Contact Detection and Identification Method for the Post-Disaster Compression State of Injured Individuals Using UWB Bio-Radar. Bioengineering (Basel) 2023; 10:905. [PMID: 37627790 PMCID: PMC10451469 DOI: 10.3390/bioengineering10080905] [Received: 06/15/2023] [Revised: 07/22/2023] [Accepted: 07/23/2023] [Indexed: 08/27/2023]
Abstract
Building collapse leads to mechanical injury, which is the main cause of injury and death, with crush syndrome as its most common complication. During the post-disaster search and rescue phase, if rescue personnel hastily remove heavy objects covering the bodies of injured individuals and fail to provide targeted medical care, ischemia-reperfusion injury may be triggered, leading to rhabdomyolysis. This may result in disseminated intravascular coagulation or acute respiratory distress syndrome, further leading to multiple organ failure and, ultimately, shock and death. Using bio-radar to detect vital signs and identify compression states can effectively reduce casualties during the search for missing persons behind obstacles. A time-domain ultra-wideband (UWB) bio-radar was applied for the non-contact detection of human vital sign signals behind obstacles. An echo denoising algorithm based on PSO-VMD and permutation entropy was proposed to suppress environmental noise, along with a network for recognizing the compression state of the wounded based on radar life signals. Trained and tested on over 3,000 data sets from 10 subjects in different compression states, the proposed multiscale convolutional network achieved a 92.63% identification accuracy, outperforming SVM and 1D-CNN models by 5.30% and 6.12%, respectively, and improving casualty rescue success and the precision of post-disaster rescue.
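The abstract above pairs VMD with permutation entropy to decide which decomposed signal modes carry regular (breathing-like) structure and which are noise. A minimal, stdlib-only sketch of normalized permutation entropy follows; the order/delay parameters and the two test signals are illustrative assumptions, not the paper's settings:

```python
from math import log, factorial

def permutation_entropy(signal, order=3, delay=1):
    """Normalized permutation entropy in [0, 1] of a 1-D sequence.

    Counts ordinal patterns of length `order`: a regular (e.g. monotonic)
    signal produces few patterns and low entropy; noise produces many.
    """
    n = len(signal) - (order - 1) * delay
    counts = {}
    for i in range(n):
        window = signal[i : i + order * delay : delay]
        # ordinal pattern: indices of the window sorted by value (argsort)
        pattern = tuple(sorted(range(order), key=window.__getitem__))
        counts[pattern] = counts.get(pattern, 0) + 1
    probs = [c / n for c in counts.values()]
    entropy = -sum(p * log(p) for p in probs)
    return entropy / log(factorial(order))

pe_trend = permutation_entropy(list(range(64)))            # perfectly regular
pe_mixed = permutation_entropy([(i * 37) % 11 for i in range(64)])
```

In a denoising pipeline of this kind, modes with low permutation entropy would be kept as physiological signal and high-entropy modes discarded as noise.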
Affiliation(s)
- Ding Shi
- Department of Medical Electronics, School of Biomedical Engineering, Air Force Medical University, Xi’an 710032, China
- Shaanxi Provincial Key Laboratory of Bioelectromagnetic Detection and Intelligent Perception, Air Force Medical University, Xi’an 710032, China
- Fulai Liang
- Department of Medical Electronics, School of Biomedical Engineering, Air Force Medical University, Xi’an 710032, China
- Shaanxi Provincial Key Laboratory of Bioelectromagnetic Detection and Intelligent Perception, Air Force Medical University, Xi’an 710032, China
- Jiahao Qiao
- Department of Medical Electronics, School of Biomedical Engineering, Air Force Medical University, Xi’an 710032, China
- Shaanxi Provincial Key Laboratory of Bioelectromagnetic Detection and Intelligent Perception, Air Force Medical University, Xi’an 710032, China
- Yaru Wang
- Department of Medical Electronics, School of Biomedical Engineering, Air Force Medical University, Xi’an 710032, China
- Department of Biomedical Engineering, School of Electronic and Information Engineering, Xi’an Technological University, Xi’an 710032, China
- Yidan Zhu
- Department of Medical Electronics, School of Biomedical Engineering, Air Force Medical University, Xi’an 710032, China
- Department of Biomedical Engineering, School of Electronic and Information Engineering, Xi’an Technological University, Xi’an 710032, China
- Hao Lv
- Department of Medical Electronics, School of Biomedical Engineering, Air Force Medical University, Xi’an 710032, China
- Shaanxi Provincial Key Laboratory of Bioelectromagnetic Detection and Intelligent Perception, Air Force Medical University, Xi’an 710032, China
- Xiao Yu
- Department of Medical Electronics, School of Biomedical Engineering, Air Force Medical University, Xi’an 710032, China
- Shaanxi Provincial Key Laboratory of Bioelectromagnetic Detection and Intelligent Perception, Air Force Medical University, Xi’an 710032, China
- Teng Jiao
- Department of Medical Electronics, School of Biomedical Engineering, Air Force Medical University, Xi’an 710032, China
- Shaanxi Provincial Key Laboratory of Bioelectromagnetic Detection and Intelligent Perception, Air Force Medical University, Xi’an 710032, China
- Fuyuan Liao
- Department of Biomedical Engineering, School of Electronic and Information Engineering, Xi’an Technological University, Xi’an 710032, China
- Keding Yan
- Department of Biomedical Engineering, School of Electronic and Information Engineering, Xi’an Technological University, Xi’an 710032, China
- Jianqi Wang
- Department of Medical Electronics, School of Biomedical Engineering, Air Force Medical University, Xi’an 710032, China
- Shaanxi Provincial Key Laboratory of Bioelectromagnetic Detection and Intelligent Perception, Air Force Medical University, Xi’an 710032, China
- Yang Zhang
- Department of Medical Electronics, School of Biomedical Engineering, Air Force Medical University, Xi’an 710032, China
- Shaanxi Provincial Key Laboratory of Bioelectromagnetic Detection and Intelligent Perception, Air Force Medical University, Xi’an 710032, China
3
Shi D, Gidion G, Reindl LM, Rupitsch SJ. Automatic Life Detection Based on Efficient Features of Ground-Penetrating Rescue Radar Signals. Sensors (Basel) 2023; 23:6771. [PMID: 37571552 PMCID: PMC10422524 DOI: 10.3390/s23156771] [Received: 06/03/2023] [Revised: 06/21/2023] [Accepted: 07/25/2023] [Indexed: 08/13/2023]
Abstract
Good feature engineering is a prerequisite for accurate classification, especially in challenging scenarios such as detecting the breathing of living persons trapped under building rubble using bioradar. Unlike monitoring patients' breathing through the air, the measuring conditions of a rescue bioradar are very complex. The ultimate goal of search and rescue is to determine the presence of a living person, which requires extracting representative features that can distinguish between measurements with and without a person present. To address this challenge, we conducted a bioradar test scenario under laboratory conditions and decomposed the radar signal into different range intervals to derive multiple virtual scenes from the real one. We then extracted physical and statistical quantitative features that represent a measurement, aiming to find those features that are robust to the complexity of rescue-radar measuring conditions, including different rubble sites, breathing rates, signal strengths, and short-duration disturbances. To this end, we utilized two methods, Analysis of Variance (ANOVA) and Minimum Redundancy Maximum Relevance (MRMR), to analyze the significance of the extracted features. We then trained the classification model using a linear kernel support vector machine (SVM). As the main result of this work, we identified an optimal set of four features based on the feature ranking and the improvement in the classification accuracy of the SVM model. These four features relate to four different physical quantities and are independent of the rubble site.
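For the two-class setting described above (measurements with versus without a person), the ANOVA F-statistic used for feature ranking can be sketched as follows. The feature values below are invented for illustration and are not the paper's features:

```python
def anova_f(class_a, class_b):
    """One-way ANOVA F-statistic of a single feature over two classes.

    A larger F means the between-class spread dominates the within-class
    spread, i.e. the feature separates the two classes well.
    """
    na, nb = len(class_a), len(class_b)
    mean_a, mean_b = sum(class_a) / na, sum(class_b) / nb
    grand = (sum(class_a) + sum(class_b)) / (na + nb)
    ss_between = na * (mean_a - grand) ** 2 + nb * (mean_b - grand) ** 2
    ss_within = (sum((x - mean_a) ** 2 for x in class_a)
                 + sum((x - mean_b) ** 2 for x in class_b))
    df_between, df_within = 1, na + nb - 2
    return (ss_between / df_between) / (ss_within / df_within)

# A well-separated feature scores far higher than an overlapping one:
f_separated = anova_f([1.0, 1.1, 0.9], [5.0, 5.1, 4.9])
f_overlapping = anova_f([1.0, 2.0, 3.0], [1.5, 2.5, 3.5])
```

Ranking features by F (and then checking redundancy, as MRMR does) is one way to arrive at a compact feature set like the four-feature set reported in the abstract.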
Affiliation(s)
- Di Shi
- Laboratory for Electrical Instrumentation and Embedded Systems, Department of Microsystems Engineering—IMTEK, University of Freiburg, Georges-Köhler-Allee 106, 79110 Freiburg, Germany
4
Qiao JH, Qi FG, Liang FL, Ma J, Lv H, Yu X, Xue HJ, An Q, Yan KD, Shi D, Qiao YH, Wang JQ, Zhang Y. Contactless multiscale measurement of cardiac motion using biomedical radar sensor. Front Cardiovasc Med 2022; 9:1057195. [PMID: 36582736 PMCID: PMC9792510 DOI: 10.3389/fcvm.2022.1057195] [Received: 09/29/2022] [Accepted: 11/29/2022] [Indexed: 12/15/2022]
Abstract
Introduction: A contactless multiscale cardiac motion measurement method is proposed using impulse radio ultra-wideband (IR-UWB) radar at a center frequency of 7.29 GHz.
Motivation: Electrocardiography (ECG), heart sound, and ultrasound are traditional state-of-the-art heartbeat measurement methods. These methods suffer from the drawbacks of requiring contact and of a blind information segment during cardiogram measurement.
Methods: Experiments and analyses were conducted on a coarse-to-fine scale. Anteroposterior and along-the-arc measurements were taken from five healthy male subjects (aged 25-43) when lying down or prone. In every measurement, 10 seconds of breath-holding data were recorded with a radar 55 cm away from the body surface, while ECG was monitored simultaneously as a reference.
Results: Cardiac motion detected from the front was larger in amplitude than that detected from the back. In terms of radar detection angles, the best cardiac motion information was observed at a detection angle of 120°. Finally, in terms of cardiac motion cycles, all the ECG information, as well as short segments of cardiac motion detail named blind ECG segments, were detected.
Significance: A contactless and multiscale cardiac motion detection method is proposed with no blind segments during the entire cardiac cycle. This paves the way for fast and accurate cardiac disease assessment and diagnosis, with promising application prospects in contactless online cardiac monitoring and in-home healthcare.
Affiliation(s)
- Jia-hao Qiao
- Department of Military Biomedical Engineering, Fourth Military Medical University, Xi'an, China
- School of Electronic Information Engineering, Xi'an Technological University, Xi'an, China
- Fu-gui Qi
- Department of Military Biomedical Engineering, Fourth Military Medical University, Xi'an, China
- *Correspondence: Fu-gui Qi
- Fu-lai Liang
- Department of Military Biomedical Engineering, Fourth Military Medical University, Xi'an, China
- Jin Ma
- School of Aerospace Medicine, Fourth Military Medical University, Xi'an, China
- Hao Lv
- Department of Military Biomedical Engineering, Fourth Military Medical University, Xi'an, China
- Xiao Yu
- Department of Military Biomedical Engineering, Fourth Military Medical University, Xi'an, China
- Hui-jun Xue
- Department of Military Biomedical Engineering, Fourth Military Medical University, Xi'an, China
- Qiang An
- Department of Military Biomedical Engineering, Fourth Military Medical University, Xi'an, China
- Ke-ding Yan
- School of Electronic Information Engineering, Xi'an Technological University, Xi'an, China
- Ding Shi
- Department of Military Biomedical Engineering, Fourth Military Medical University, Xi'an, China
- School of Electronic Information Engineering, Xi'an Technological University, Xi'an, China
- Yong-hui Qiao
- State Key Laboratory of Clean Energy Utilization, Zhejiang University, Hangzhou, China
- Jian-qi Wang
- Department of Military Biomedical Engineering, Fourth Military Medical University, Xi'an, China
- Yang Zhang
- Department of Military Biomedical Engineering, Fourth Military Medical University, Xi'an, China
5
Automatic Air-to-Ground Recognition of Outdoor Injured Human Targets Based on UAV Bimodal Information: The Explore Study. Appl Sci (Basel) 2022. [DOI: 10.3390/app12073457] [Indexed: 11/16/2022]
Abstract
The rapid air-to-ground search for injured people in outdoor environments has been a research hotspot and a great challenge for public safety and emergency rescue medicine. Its crucial difficulties lie in the fact that small-scale human targets have low target-background contrast against the complex outdoor background, and the human attribute of a target is hard to verify. Therefore, an automatic recognition method based on UAV bimodal information is proposed in this paper. First, suspected targets are accurately detected and separated from the background based on multispectral feature information alone. The bio-radar module is then released and attempts to detect the corresponding physiological information for accurate re-identification of the human-target property. Both the suspected human target detection experiments and the human-target property re-identification experiments show that the proposed method can effectively realize accurate identification of injured individuals on the ground in outdoor environments, which is meaningful for research on the rapid search and rescue of injured people in the outdoor environment.
6
Abstract
Human pose reconstruction has been a fundamental research topic in computer vision. However, existing pose reconstruction methods suffer from wall occlusion, a problem that traditional optical sensors cannot solve. This article studies a novel human-target pose reconstruction framework using low-frequency ultra-wideband (UWB) multiple-input multiple-output (MIMO) radar and a convolutional neural network (CNN), which is used to detect targets behind a wall. In the proposed framework, we first use UWB MIMO radar to capture human body information. Then, target detection and tracking are used to lock onto the target position, and the back-projection algorithm is adopted to construct three-dimensional (3D) images. Finally, we take the processed 3D image as input to reconstruct the 3D pose of the human target via the designed 3D CNN model. Field detection experiments and comparison results show that the proposed framework can achieve pose reconstruction of human targets behind a wall, which indicates that our research can make up for the shortcomings of optical sensors and significantly expands the applications of the UWB MIMO radar system.
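The back-projection step mentioned above can be illustrated with a minimal delay-and-sum sketch over a 2-D pixel grid. The paper forms 3-D images from multi-channel data; the geometry, sampling rate, and single monostatic channel below are simplifying assumptions for illustration only:

```python
from math import hypot

C = 3e8  # propagation speed, m/s

def back_projection(echoes, tx_rx_positions, fs, grid):
    """Delay-and-sum back-projection over a 2-D pixel grid.

    echoes: one sampled echo trace per Tx-Rx channel
    tx_rx_positions: ((tx_x, tx_y), (rx_x, rx_y)) per channel, in meters
    fs: fast-time sampling rate in Hz
    grid: list of (x, y) pixel coordinates in meters
    Returns one coherently summed value per pixel.
    """
    image = []
    for (px, py) in grid:
        acc = 0.0
        for trace, ((txx, txy), (rxx, rxy)) in zip(echoes, tx_rx_positions):
            # round-trip path Tx -> pixel -> Rx, converted to a sample index
            path = hypot(px - txx, py - txy) + hypot(px - rxx, py - rxy)
            idx = round(path / C * fs)
            if 0 <= idx < len(trace):
                acc += trace[idx]
        image.append(acc)
    return image

# One monostatic channel with a single echo at the delay of a target at (3, 4):
trace = [0.0] * 200
trace[100] = 1.0  # round-trip 10 m at fs = 3 GHz -> sample 100
image = back_projection([trace], [((0.0, 0.0), (0.0, 0.0))], 3e9, [(3.0, 4.0), (1.0, 1.0)])
```

The pixel coinciding with the target accumulates the echo energy, while other pixels map to empty samples; with many channels, this coherent summation is what produces a focused 3-D image.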