1
Rahman MM, Cook J, Taebi A. Non-contact heart vibration measurement using computer vision-based seismocardiography. Sci Rep 2023; 13:11787. [PMID: 37479720 PMCID: PMC10362031 DOI: 10.1038/s41598-023-38607-7]
Abstract
Seismocardiography (SCG) is the noninvasive measurement of the local chest-wall vibrations produced by the mechanical activity of the heart, and it has shown promise in providing clinical information for certain cardiovascular diseases, including heart failure and ischemia. Conventionally, SCG signals are recorded by placing an accelerometer on the chest. In this paper, we propose a novel contactless SCG measurement method that extracts the signals from chest videos recorded by a smartphone. Our pipeline uses computer vision methods, including Lucas-Kanade template tracking, to track an artificial target attached to the chest, and then estimates the SCG signals from the tracked displacements. We evaluated the pipeline on 14 healthy subjects by comparing the vision-based SCG estimates with gold-standard SCG signals measured simultaneously using accelerometers attached to the chest. The similarity between the vision-based and accelerometer-based signals was measured in the time and frequency domains using the Pearson correlation coefficient, a similarity index based on dynamic time warping (DTW), and wavelet coherence. The average DTW-based similarity index between the signals was 0.94 in the right-to-left direction and 0.95 in the head-to-foot direction. Furthermore, the vision-based SCG signals were used to estimate heart rate, and the results were compared with the gold-standard heart rate obtained from ECG signals. The estimated heart rate values agreed well with the gold-standard measurements (bias = 0.649 beats/min). In conclusion, this work shows promise for developing a low-cost and widely available method for remote monitoring of cardiovascular activity using smartphone videos.
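The DTW-based similarity index used in this abstract can be illustrated with a minimal sketch. The paper's exact index definition is not reproduced here; the mapping below from the DTW distance d to a (0, 1] score, 1/(1 + d/n), is an illustrative assumption:

```python
def dtw_distance(x, y):
    """Classic O(len(x)*len(y)) dynamic-programming DTW distance."""
    inf = float("inf")
    n, m = len(x), len(y)
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(x[i - 1] - y[j - 1])
            # Extend the cheapest of the three admissible warping steps.
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]

def dtw_similarity(x, y):
    """Map the DTW distance to a (0, 1] similarity index (1 = identical)."""
    d = dtw_distance(x, y)
    return 1.0 / (1.0 + d / max(len(x), len(y)))
```

Because DTW allows local time warping, two signals that differ only by small timing shifts (as with vision-based vs. accelerometer-based SCG) still score close to 1.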
Affiliation(s)
- Mohammad Muntasir Rahman
- Department of Agricultural and Biological Engineering, Mississippi State University, Mississippi, 39762, USA
- Jadyn Cook
- Department of Agricultural and Biological Engineering, Mississippi State University, Mississippi, 39762, USA
- Amirtahà Taebi
- Department of Agricultural and Biological Engineering, Mississippi State University, Mississippi, 39762, USA
2
Galatzer-Levy IR, Onnela JP. Machine Learning and the Digital Measurement of Psychological Health. Annu Rev Clin Psychol 2023; 19:133-154. [PMID: 37159287 DOI: 10.1146/annurev-clinpsy-080921-073212]
Abstract
Since its inception, the discipline of psychology has utilized empirical epistemology and mathematical methodologies to infer psychological functioning from direct observation. As new challenges and technological opportunities emerge, scientists are once again challenged to define measurement paradigms for psychological health and illness that solve novel problems and capitalize on new technological opportunities. In this review, we discuss the theoretical foundations of and scientific advances in remote sensor technology and machine learning models as they are applied to quantify psychological functioning, draw clinical inferences, and chart new directions in treatment.
Affiliation(s)
- Isaac R Galatzer-Levy
- Department of Psychiatry, New York University Grossman School of Medicine, New York, NY, USA
- Current affiliation: Google LLC, Mountain View, California, USA
- Jukka-Pekka Onnela
- Department of Biostatistics, Harvard T.H. Chan School of Public Health, Boston, Massachusetts, USA
3
van Es VAA, Lopata RGP, Scilingo EP, Nardelli M. Contactless Cardiovascular Assessment by Imaging Photoplethysmography: A Comparison with Wearable Monitoring. Sensors (Basel) 2023; 23:1505. [PMID: 36772543 PMCID: PMC9919512 DOI: 10.3390/s23031505]
Abstract
Despite the notable recent developments in the field of remote photoplethysmography (rPPG), extracting a reliable pulse rate variability (PRV) signal still remains a challenge. In this study, eight image-based photoplethysmography (iPPG) extraction methods (GRD, AGRD, PCA, ICA, LE, SPE, CHROM, and POS) were compared in terms of pulse rate (PR) and PRV features. The algorithms were made robust for motion and illumination artifacts by using ad hoc pre- and postprocessing steps. Then, they were systematically tested on the public dataset UBFC-RPPG, containing data from 42 subjects sitting in front of a webcam (30 fps) while playing a time-sensitive mathematical game. The performances of the algorithms were evaluated by statistically comparing iPPG-based and finger-PPG-based PR and PRV features in terms of Spearman's correlation coefficient, normalized root mean square error (NRMSE), and Bland-Altman analysis. The study revealed POS and CHROM techniques to be the most robust for PR estimation and the assessment of overall autonomic nervous system (ANS) dynamics by using PRV features in time and frequency domains. Furthermore, we demonstrated that a reliable characterization of the vagal tone is made possible by computing the Poincaré map of PRV series derived from the POS and CHROM methods. This study supports the use of iPPG systems as promising tools to obtain clinically useful and specific information about ANS dynamics.
Affiliation(s)
- Valerie A. A. van Es
- Department of Biomedical Engineering, Eindhoven University of Technology, P.O. Box 513, 5600 Eindhoven, The Netherlands
- Richard G. P. Lopata
- Department of Biomedical Engineering, Eindhoven University of Technology, P.O. Box 513, 5600 Eindhoven, The Netherlands
- Enzo Pasquale Scilingo
- Bioengineering and Robotics Research Centre E. Piaggio, Dipartimento di Ingegneria dell’Informazione, University of Pisa, Largo Lucio Lazzarino 1, 56122 Pisa, Italy
- Mimma Nardelli
- Bioengineering and Robotics Research Centre E. Piaggio, Dipartimento di Ingegneria dell’Informazione, University of Pisa, Largo Lucio Lazzarino 1, 56122 Pisa, Italy
4
Jaiswal KB, Meenpal T. rPPG-FuseNet: Non-contact heart rate estimation from facial video via RGB/MSR signal fusion. Biomed Signal Process Control 2022. [DOI: 10.1016/j.bspc.2022.104002]
5
Continuous Monitoring of Vital Signs Using Cameras: A Systematic Review. Sensors (Basel) 2022; 22:4097. [PMID: 35684717 PMCID: PMC9185528 DOI: 10.3390/s22114097]
Abstract
In recent years, noncontact measurement of vital signs using cameras has received a great deal of interest. However, several questions remain unanswered: (i) Which vital sign is monitored using what type of camera? (ii) What is the performance, and which factors affect it? (iii) Which health issues are addressed by camera-based techniques? Following the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement, we conducted a systematic review of continuous camera-based vital sign monitoring using the Scopus, PubMed, and Association for Computing Machinery (ACM) databases. We considered articles published in English between January 2018 and April 2021. We included five vital signs: heart rate (HR), respiratory rate (RR), blood pressure (BP), body skin temperature (BST), and oxygen saturation (SpO2). In total, we retrieved 905 articles and screened them by title, abstract, and full text. One hundred and four articles remained: 60, 20, 6, 2, and 1 focused on HR, RR, BP, BST, and SpO2, respectively, and 15 on multiple vital signs. HR and RR can be measured using red, green, and blue (RGB), near-infrared (NIR), and far-infrared (FIR) cameras. So far, BP and SpO2 are monitored with RGB cameras only, whereas BST is derived from FIR cameras only. Under ideal conditions, the root mean squared error is around 2.60 bpm, 2.22 cpm, 6.91 mm Hg, 4.88 mm Hg, and 0.86 °C for HR, RR, systolic BP, diastolic BP, and BST, respectively. The estimated error for SpO2 is less than 1%, but it increases with movement of the subject and with the camera-subject distance. Camera-based remote monitoring mainly targets intensive care, post-anaesthesia care, and sleep monitoring, but also specific diseases such as heart failure. The monitored populations include newborn and pediatric patients, geriatric patients, athletes (e.g., exercising, cycling), and vehicle drivers.
Camera-based techniques monitor HR, RR, and BST in static conditions within ranges acceptable for certain applications. The remaining research gaps are large and heterogeneous populations, real-time scenarios, moving subjects, and the accuracy of BP and SpO2 monitoring.
6
Ni A, Azarang A, Kehtarnavaz N. A Review of Deep Learning-Based Contactless Heart Rate Measurement Methods. Sensors (Basel) 2021; 21:3719. [PMID: 34071736 PMCID: PMC8198867 DOI: 10.3390/s21113719]
Abstract
The interest in contactless or remote heart rate measurement has been steadily growing in healthcare and sports applications. Contactless methods involve the utilization of a video camera and image processing algorithms. Recently, deep learning methods have been used to improve the performance of conventional contactless methods for heart rate measurement. After providing a review of the related literature, a comparison of the deep learning methods whose codes are publicly available is conducted in this paper. The public domain UBFC dataset is used to compare the performance of these deep learning methods for heart rate measurement. The results obtained show that the deep learning method PhysNet generates the best heart rate measurement outcome among these methods, with a mean absolute error value of 2.57 beats per minute and a mean square error value of 7.56 beats per minute.
7
Shoushan MM, Reyes BA, Rodriguez AM, Chong JW. Non-Contact HR Monitoring via Smartphone and Webcam During Different Respiratory Maneuvers and Body Movements. IEEE J Biomed Health Inform 2021; 25:602-612. [PMID: 32750916 DOI: 10.1109/jbhi.2020.2998399]
Abstract
Heart rate (HR) is widely used as a reliable indicator of an individual's health. Imaging photoplethysmography (iPPG), which extracts HR from facial video recordings, has recently been highlighted as a promising non-contact HR measurement method. In this study, we propose a camera-based HR monitoring technique that estimates HR from iPPG signals extracted from a video sequence. Videos were recorded using a smartphone or a laptop camera. We adopted the plane-orthogonal-to-skin (POS) method to compute the iPPG signal. The proposed method was evaluated by extracting the HR of 9 subjects at rest and during two motion conditions (lateral and frontal) while they performed several respiratory maneuvers: spontaneous, metronome, and forced breathing. Automatic face detection algorithms were implemented in the proposed method. Our experimental results show that mean HR values have 0.56% error and 99.4% accuracy compared with HR calculated from the gold-standard electrocardiography (ECG) reference across these motion conditions and respiratory maneuvers.
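The plane-orthogonal-to-skin (POS) method used here projects temporally normalized RGB traces onto two chrominance directions and combines them with an adaptive gain. A single-window sketch of that core step (the full algorithm's sliding-window overlap-add is omitted; the input is assumed to be spatially averaged skin-pixel (R, G, B) values per frame):

```python
import statistics

# Projection rows map temporally normalized (R, G, B) onto two
# chrominance signals that are insensitive to intensity changes.
POS_P = ((0.0, 1.0, -1.0), (-2.0, 1.0, 1.0))

def pos_pulse(rgb_window):
    """Extract a pulse signal from one window of spatially averaged
    (R, G, B) samples, following the plane-orthogonal-to-skin idea."""
    n = len(rgb_window)
    means = [sum(frame[c] for frame in rgb_window) / n for c in range(3)]
    # Temporal normalization: divide each channel by its window mean.
    norm = [[frame[c] / means[c] for c in range(3)] for frame in rgb_window]
    s1 = [sum(p * v for p, v in zip(POS_P[0], f)) for f in norm]
    s2 = [sum(p * v for p, v in zip(POS_P[1], f)) for f in norm]
    # Alpha-tuned combination of the two projected signals.
    alpha = statistics.pstdev(s1) / (statistics.pstdev(s2) or 1.0)
    return [a + alpha * b for a, b in zip(s1, s2)]
```

A constant-colored window yields a flat (zero) output, while a pulsatile green channel produces an oscillating signal whose dominant frequency is the heart rate.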
8
Hou J, Zhang Y, Zhang S, Geng X, Zhang J, Chen C, Zhang H. A novel angle extremum maximum method for recognition of pulse wave feature points. Comput Methods Programs Biomed 2020; 189:105321. [PMID: 31986472 DOI: 10.1016/j.cmpb.2020.105321]
Abstract
BACKGROUND AND OBJECTIVES Pulse wave is one of the biomedical signals that has been studied over the past years. Accurate recognition of feature points is the basis for verifying the connections between pulse waves and certain diseases. The aim of this study is therefore to discuss the use of angle mapping for feature point recognition. METHODS The method is based on applying an angle curve with parameter k to the pulse wave. The data were collected with a PVDF sensor. An approximate curve and a mathematical model are used to discuss the influence of the parameter k and the pulse wave amplitude by numerical calculation. The conclusion drawn from the numerical solution is that when k is varied to maximize the angle extremum value, the position of the corresponding angle extremum point is the feature point position. For the sampling rate f = 455 Hz used in this paper, k can be taken from 5 to 15. RESULTS We present the recognition results for unobvious feature points based on the "angle extremum maximum method" and the corresponding angle values. The results are compared with traditional methods, and the determination of the angle threshold value is discussed. CONCLUSIONS This method can be used for accurate and efficient feature point identification and is well suited to pulse waves with noise or unobvious feature points.
Affiliation(s)
- Jiena Hou
- Institute of Microelectronics of Chinese Academy of Sciences, No. 3 Beitucheng West Road, Chaoyang District, Beijing 100029, China; University of Chinese Academy of Sciences, China; Beijing Key Laboratory for Next Generation RF Communication Chip Technology, China
- Yitao Zhang
- Institute of Microelectronics of Chinese Academy of Sciences, No. 3 Beitucheng West Road, Chaoyang District, Beijing 100029, China; Beijing Key Laboratory for Next Generation RF Communication Chip Technology, China
- Shaolong Zhang
- Institute of Microelectronics of Chinese Academy of Sciences, No. 3 Beitucheng West Road, Chaoyang District, Beijing 100029, China; Beijing Key Laboratory for Next Generation RF Communication Chip Technology, China
- Xingguang Geng
- Institute of Microelectronics of Chinese Academy of Sciences, No. 3 Beitucheng West Road, Chaoyang District, Beijing 100029, China; University of Chinese Academy of Sciences, China; Beijing Key Laboratory for Next Generation RF Communication Chip Technology, China
- Jun Zhang
- Institute of Microelectronics of Chinese Academy of Sciences, No. 3 Beitucheng West Road, Chaoyang District, Beijing 100029, China; University of Chinese Academy of Sciences, China; Beijing Key Laboratory for Next Generation RF Communication Chip Technology, China
- Chuanglu Chen
- Institute of Microelectronics of Chinese Academy of Sciences, No. 3 Beitucheng West Road, Chaoyang District, Beijing 100029, China; University of Chinese Academy of Sciences, China; Beijing Key Laboratory for Next Generation RF Communication Chip Technology, China
- Haiying Zhang
- Institute of Microelectronics of Chinese Academy of Sciences, No. 3 Beitucheng West Road, Chaoyang District, Beijing 100029, China; University of Chinese Academy of Sciences, China; Beijing Key Laboratory for Next Generation RF Communication Chip Technology, China
9
Jorquera-Chavez M, Fuentes S, Dunshea FR, Warner RD, Poblete T, Morrison RS, Jongman EC. Remotely Sensed Imagery for Early Detection of Respiratory Disease in Pigs: A Pilot Study. Animals (Basel) 2020; 10:E451. [PMID: 32182745 PMCID: PMC7142473 DOI: 10.3390/ani10030451]
Abstract
Respiratory diseases are a major problem in the pig industry worldwide. Given the impact of these diseases, early identification of infected herds is essential. Computer vision technology, using RGB (red, green and blue) and thermal infrared imagery, can assist the early detection of changes in animal physiology related to these and other diseases. This pilot study aimed to identify whether these techniques are a useful tool to detect early changes in eye and ear-base temperature, heart rate, and respiration rate in pigs challenged with Actinobacillus pleuropneumoniae. Clinical observations and imagery were analysed, comparing data from animals that showed some signs of illness with data from animals that showed no signs of ill health. Significant differences (p < 0.05) were observed between sick and healthy pigs in heart rate and in eye and ear temperature, with higher heart rate and higher temperatures in sick pigs. The largest remotely measured change in temperature and heart rate was observed around 4-6 h before signs of clinical illness were noted by the skilled technicians. These data suggest that computer vision techniques could be a useful tool to detect indicators of disease before symptoms can be observed by stock people, assisting the early detection and control of respiratory diseases in pigs and motivating further research into the capability and possible uses of this technology for on-farm monitoring and management.
Affiliation(s)
- Maria Jorquera-Chavez
- Faculty of Veterinary and Agricultural Sciences, University of Melbourne, VIC 3010, Australia
- Sigfredo Fuentes
- Faculty of Veterinary and Agricultural Sciences, University of Melbourne, VIC 3010, Australia
- Frank R. Dunshea
- Faculty of Veterinary and Agricultural Sciences, University of Melbourne, VIC 3010, Australia
- Robyn D. Warner
- Faculty of Veterinary and Agricultural Sciences, University of Melbourne, VIC 3010, Australia
- Tomas Poblete
- Faculty of Veterinary and Agricultural Sciences, University of Melbourne, VIC 3010, Australia
- Rebecca S. Morrison
- Research and Innovation, Rivalea (Australia) Pty. Ltd., Corowa, NSW 2646, Australia
- Ellen C. Jongman
- Animal Welfare Science Centre, Faculty of Veterinary and Agricultural Sciences, University of Melbourne, Parkville, VIC 3010, Australia
10
Maki Y, Monno Y, Yoshizaki K, Tanaka M, Okutomi M. Inter-Beat Interval Estimation from Facial Video Based on Reliability of BVP Signals. Annu Int Conf IEEE Eng Med Biol Soc 2019; 2019:6525-6528. [PMID: 31947336 DOI: 10.1109/embc.2019.8857081]
Abstract
Inter-beat interval (IBI) and heart rate variability (HRV) are important cardiac parameters that provide physiological and emotional states of a person. In this paper, we present a framework for accurate IBI and HRV estimation from a facial video based on the reliability of extracted blood volume pulse (BVP) signals. Our framework first extracts candidate BVP signals from randomly sampled multiple face patches. The BVP signals are then assessed based on a reliability metric to select the most reliable BVP signal, from which IBI and HRV are calculated. In experiments, we evaluate three reliability metrics and demonstrate that our framework can estimate IBI and HRV more accurately than a conventional single face region-based framework.
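Once the most reliable BVP signal has been selected, the IBI calculation amounts to locating pulse peaks and differencing their times. A minimal sketch with a simple local-maximum detector (the paper's reliability metrics are not reproduced here; `min_gap_s` is an illustrative refractory parameter, not from the paper):

```python
def ibi_from_bvp(bvp, fps, min_gap_s=0.4):
    """Locate BVP peaks as local maxima separated by at least `min_gap_s`
    seconds, then return the inter-beat intervals in seconds."""
    min_gap = int(min_gap_s * fps)
    peaks = []
    for i in range(1, len(bvp) - 1):
        # Local maximum; >= on the left handles flat-topped peaks.
        if bvp[i] >= bvp[i - 1] and bvp[i] > bvp[i + 1]:
            # Enforce a physiological refractory gap between beats.
            if not peaks or i - peaks[-1] >= min_gap:
                peaks.append(i)
    return [(b - a) / fps for a, b in zip(peaks, peaks[1:])]
```

On a clean 1 Hz pulse sampled at 30 fps this recovers intervals close to 1.0 s; HRV features are then simple statistics of the returned IBI series.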
11
Modelling and Validation of Computer Vision Techniques to Assess Heart Rate, Eye Temperature, Ear-Base Temperature and Respiration Rate in Cattle. Animals (Basel) 2019; 9:1089. [PMID: 31817620 PMCID: PMC6940919 DOI: 10.3390/ani9121089]
Abstract
Simple Summary: Animal monitoring normally requires procedures that are time- and labour-consuming. The implementation of novel non-invasive technologies could be a good approach to monitoring animal health and welfare. This study aimed to evaluate the use of images and computer-based methods to track specific features of the face and to assess temperature, respiration rate, and heart rate in cattle. The measurements were compared with measures obtained with conventional methods during the same time period. The data were collected from ten dairy cows that were recorded during six handling procedures across two consecutive days. The results show over 92% accuracy for the computer algorithm developed to track the areas selected in the collected videos. In addition, acceptable correlation was observed between the temperature calculated from thermal infrared images and the temperature collected using intravaginal loggers, and between the respiration rate calculated from infrared videos and from visual observation. Furthermore, a low to high relationship was found between the heart rate obtained from videos and from attached monitors. The study also showed that both the position of the cameras and the area analysed in the images are very important, as both had a large impact on the accuracy of the methods. The positive outcomes and the limitations observed in this study suggest the need for further research.
Abstract: Precision livestock farming has emerged with the aim of providing detailed information to detect and reduce problems related to animal management. This study aimed to develop and validate computer vision techniques to track required features of the cattle face and to remotely assess eye temperature, ear-base temperature, respiration rate, and heart rate in cattle. Ten dairy cows were recorded during six handling procedures across two consecutive days using thermal infrared cameras and RGB (red, green, blue) video cameras. Simultaneously, core body temperature, respiration rate and heart rate were measured using more conventional ‘invasive’ methods to be compared with the data obtained with the proposed algorithms. The feature-tracking algorithm, developed to improve image processing, showed an accuracy between 92% and 95% when tracking different areas of the face of cows. The results of this study also show correlation coefficients up to 0.99 between temperature measures obtained invasively and those obtained remotely, with the highest values achieved when the analysis was performed within individual cows. In the case of respiration rate, a positive correlation (r = 0.87) was found between visual observations and the analysis of non-radiometric infrared videos. Low to high correlation coefficients (0.09-0.99) were found between the heart rates obtained from attached monitors and from the proposed method. Furthermore, camera location and the area analysed appear to have a relevant impact on the performance of the proposed techniques. This study shows positive outcomes for the proposed computer vision techniques when measuring physiological parameters. Further research is needed to automate and improve these techniques to measure physiological changes in farm animals considering their individual characteristics.
12
Remote Monitoring of Vital Signs in Diverse Non-Clinical and Clinical Scenarios Using Computer Vision Systems: A Review. Appl Sci (Basel) 2019. [DOI: 10.3390/app9204474]
Abstract
Techniques for noncontact measurement of vital signs using camera imaging technologies have been attracting increasing attention. For noncontact physiological assessments, computer vision-based methods appear to be an advantageous approach that could be robust, hygienic, reliable, safe, cost effective and suitable for long distance and long-term monitoring. In addition, video techniques allow measurements from multiple individuals opportunistically and simultaneously in groups. This paper aims to explore the progress of the technology from controlled clinical scenarios with fixed monitoring installations and controlled lighting, towards uncontrolled environments, crowds and moving sensor platforms. We focus on the diversity of applications and scenarios being studied in this topic. From this review it emerges that automatic multiple regions of interest (ROIs) selection, removal of noise artefacts caused by both illumination variations and motion artefacts, simultaneous multiple person monitoring, long distance detection, multi-camera fusion and accepted publicly available datasets are topics that still require research to enable the technology to mature into many real-world applications.
13
Antink CH, Lyra S, Paul M, Yu X, Leonhardt S. A Broader Look: Camera-Based Vital Sign Estimation across the Spectrum. Yearb Med Inform 2019; 28:102-114. [PMID: 31419822 PMCID: PMC6697643 DOI: 10.1055/s-0039-1677914]
Abstract
OBJECTIVES Camera-based vital sign estimation allows the contactless assessment of important physiological parameters. Seminal contributions were made in the 1930s, 1980s, and 2000s, and the speed of development seems ever increasing. In this survey, we aim to give an overview of the most recent works in this area, describe their common features as well as shortcomings, and highlight interesting "outliers". METHODS We performed a comprehensive literature search and quantitative analysis of papers published between 2016 and 2018. Quantitative information about the number of subjects, studies with healthy volunteers vs. pathological conditions, public datasets, laboratory vs. real-world works, types of camera, usage of machine learning, and spectral properties of data was extracted. Moreover, a qualitative analysis of the illumination used and of recent advances in algorithmic development was also performed. RESULTS Since 2016, 116 papers were published on camera-based vital sign estimation, and 59% of papers presented results on 20 or fewer subjects. While the average number of participants increased from 15.7 in 2016 to 22.9 in 2018, the vast majority of papers (n=100) were on healthy subjects. Four public datasets were used in 10 publications. We found 27 papers whose application scenario could be considered a real-world use case, such as monitoring during exercise or driving. These include 16 papers that dealt with non-healthy subjects. The majority of papers (n=61) presented results based on visual, red-green-blue (RGB) information, followed by RGB combined with other parts of the electromagnetic spectrum (n=18), and thermography only (n=12), while other works (n=25) used other mono- or polychromatic non-RGB data. Surprisingly, a minority of publications (n=39) made use of consumer-grade equipment. Lighting conditions were primarily uncontrolled or ambient.
While some works focused on specialized aspects such as the removal of vital sign information from video streams to protect privacy or the influence of video compression, most algorithmic developments were related to three areas: region of interest selection, tracking, or extraction of a one-dimensional signal. Seven papers used deep learning techniques, 17 papers used other machine learning approaches, and 92 made no explicit use of machine learning. CONCLUSION Although some general trends and frequent shortcomings are obvious, the spectrum of publications related to camera-based vital sign estimation is broad. While many creative solutions and unique approaches exist, the lack of standardization hinders comparability of these techniques and of their performance. We believe that sharing algorithms and/or datasets would alleviate this and would allow the application of newer techniques such as deep learning.
Affiliation(s)
- Christoph Hoog Antink
- Medical Information Technology (MedIT), Helmholtz-Institute for Biomedical Engineering, RWTH Aachen University, Aachen, Germany
- Michael Paul
- Medical Information Technology (MedIT), Helmholtz-Institute for Biomedical Engineering, RWTH Aachen University, Aachen, Germany
- Xinchi Yu
- Medical Information Technology (MedIT), Helmholtz-Institute for Biomedical Engineering, RWTH Aachen University, Aachen, Germany
- Steffen Leonhardt
- Medical Information Technology (MedIT), Helmholtz-Institute for Biomedical Engineering, RWTH Aachen University, Aachen, Germany
14
Abstract
Newtonian reaction to blood influx into the head at each heartbeat causes subtle head motion at the same frequency as the heartbeats. Thus, this head motion can be used to estimate the heart rate. Several studies have shown that heart rates can be measured accurately by tracking head motion using a desktop computer with a static camera. However, implementation of vision-based head motion tracking on smartphones demonstrated limited accuracy due to the hand-shaking problem caused by the non-static camera. The hand-shaking problem could not be handled effectively with only the frontal camera images. It also required a more accurate method to measure the periodicity of noisy signals. Therefore, this study proposes an improved head-motion-based heart-rate monitoring system using smartphones. To address the hand-shaking problem, the proposed system leverages the front and rear cameras available in most smartphones and dedicates each camera to tracking facial features that correspond to head motion and background features that correspond to hand-shaking. Then, the locations of facial features are adjusted using the average point of the background features. In addition, a correlation-based signal periodicity computation method is proposed to accurately separate the true heart-rate-related component from the head motion signal. The proposed system demonstrates improved accuracy (i.e., lower mean errors in heart-rate measurement) compared to conventional head-motion-based systems, and the accuracy is sufficient for daily heart-rate monitoring.
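A correlation-based signal periodicity computation like the one this abstract proposes can be sketched as a search for the autocorrelation-maximizing lag within a physiological range (an illustrative sketch under that assumption, not the authors' exact method):

```python
def estimate_hr_autocorr(signal, fps, hr_range=(40, 180)):
    """Estimate heart rate (bpm) as the lag that maximizes the normalized
    autocorrelation of a zero-meaned motion signal, searched over the
    lag range implied by `hr_range`."""
    n = len(signal)
    mean = sum(signal) / n
    x = [v - mean for v in signal]
    energy = sum(v * v for v in x) or 1.0
    # Lags corresponding to the physiological heart-rate bounds.
    lag_min = int(fps * 60.0 / hr_range[1])
    lag_max = int(fps * 60.0 / hr_range[0])
    best_lag, best_r = lag_min, -1.0
    for lag in range(lag_min, min(lag_max, n - 1) + 1):
        r = sum(x[i] * x[i + lag] for i in range(n - lag)) / energy
        if r > best_r:
            best_lag, best_r = lag, r
    return 60.0 * fps / best_lag
```

Restricting the lag search to a physiological band is what makes the estimate robust to residual low-frequency hand-shaking components in the head-motion signal.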
15
Zaunseder S, Trumpp A, Wedekind D, Malberg H. Cardiovascular assessment by imaging photoplethysmography - a review. Biomed Tech (Berl) 2019; 63:617-634. [PMID: 29897880 DOI: 10.1515/bmt-2017-0119] [Citation(s) in RCA: 43] [Impact Index Per Article: 8.6] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 07/13/2017] [Accepted: 05/04/2018] [Indexed: 12/12/2022]
Abstract
Over the last few years, the contactless acquisition of cardiovascular parameters using cameras has gained immense attention. The technique provides an optical means to acquire cardiovascular information in a very convenient way. This review provides an overview on the technique's background and current realizations. Besides giving detailed information on the most widespread application of the technique, namely the contactless acquisition of heart rate, we outline further concepts and we critically discuss the current state.
Affiliation(s)
- Sebastian Zaunseder
- TU Dresden, Institute of Biomedical Engineering, Helmholtzstraße 18, Dresden, 01069 Saxony, Germany
- Alexander Trumpp
- TU Dresden, Institute of Biomedical Engineering, Helmholtzstraße 18, Dresden, 01069 Saxony, Germany
- Daniel Wedekind
- TU Dresden, Institute of Biomedical Engineering, Helmholtzstraße 18, Dresden, 01069 Saxony, Germany
- Hagen Malberg
- TU Dresden, Institute of Biomedical Engineering, Helmholtzstraße 18, Dresden, 01069 Saxony, Germany
16
Jorquera-Chavez M, Fuentes S, Dunshea FR, Jongman EC, Warner RD. Computer vision and remote sensing to assess physiological responses of cattle to pre-slaughter stress, and its impact on beef quality: A review. Meat Sci 2019; 156:11-22. [PMID: 31121361 DOI: 10.1016/j.meatsci.2019.05.007] [Citation(s) in RCA: 14] [Impact Index Per Article: 2.8] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/24/2019] [Revised: 05/04/2019] [Accepted: 05/06/2019] [Indexed: 10/26/2022]
Abstract
Pre-slaughter stress is well known to affect the meat quality of beef carcasses, and methods have been developed to assess this stress. However, development of more practical and less invasive methods is required to assess the response of cattle to pre-slaughter stressors, which will potentially also assist with the prediction of beef quality. This review outlines the importance of pre-slaughter stress as well as existing and emerging technologies for its quantification. The review includes: (i) indicators of meat quality and how they are affected by pre-slaughter stress in cattle; (ii) contact techniques that have been commonly used to measure stress indicators in animals; (iii) remotely sensed imagery techniques recently used as non-invasive methods to monitor physiological and behavioural parameters; and (iv) potential implementation of remotely sensed imagery data to perform contactless assessment of physiological measurements, which could be related to pre-slaughter stress as well as to indicators of beef quality. Relevance to industry, conclusions and recommendations for research are included.
Affiliation(s)
- Maria Jorquera-Chavez
- School of Agriculture and Food, Faculty of Veterinary and Agricultural Sciences, University of Melbourne, Parkville, VIC 3010, Australia.
- Sigfredo Fuentes
- School of Agriculture and Food, Faculty of Veterinary and Agricultural Sciences, University of Melbourne, Parkville, VIC 3010, Australia
- Frank R Dunshea
- School of Agriculture and Food, Faculty of Veterinary and Agricultural Sciences, University of Melbourne, Parkville, VIC 3010, Australia
- Ellen C Jongman
- Animal Welfare Science Centre, Faculty of Veterinary and Agricultural Sciences, University of Melbourne, Parkville, VIC 3010, Australia
- Robyn D Warner
- School of Agriculture and Food, Faculty of Veterinary and Agricultural Sciences, University of Melbourne, Parkville, VIC 3010, Australia
17
Finžgar M, Podržaj P. A wavelet-based decomposition method for a robust extraction of pulse rate from video recordings. PeerJ 2018; 6:e5859. [PMID: 30519506 PMCID: PMC6267003 DOI: 10.7717/peerj.5859] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/25/2018] [Accepted: 10/02/2018] [Indexed: 11/20/2022] Open
Abstract
Background: Remote photoplethysmography (rPPG) is a promising optical method for non-contact assessment of pulse rate (PR) from video recordings. In order to implement the method in real-time applications, the rPPG algorithms must be capable of eliminating as many distortions from the pulse signal as possible.
Methods: To increase the degrees of freedom of the distortion elimination, the dimensionality of the RGB video signals is increased by wavelet transform decomposition using the generalized Morse wavelet. The proposed Continuous-Wavelet-Transform-based Sub-Band rPPG method (SB-CWT) is evaluated on 101 publicly available RGB facial video recordings and corresponding reference blood volume pulse (BVP) signals taken from the MMSE-HR database. The performance of SB-CWT is compared with that of the state-of-the-art Sub-band rPPG (SB).
Results: The median signal-to-noise ratio (SNR) for the proposed SB-CWT ranges from 6.63 to 10.39 dB, and for SB from 4.23 to 6.24 dB. The agreement between the PRs estimated from rPPG pulse signals and the reference signals, in terms of coefficients of determination, ranges from 0.81 to 0.91 for SB-CWT and from 0.41 to 0.47 for SB. All the correlation coefficients are statistically significant (p < 0.001). The Bland-Altman plots show that mean differences range from 5.37 to 1.82 BPM for SB-CWT and from 22.18 to 18.80 BPM for SB.
Discussion: The results show that the proposed SB-CWT outperforms SB in terms of SNR and the agreement between the PRs estimated from RGB video signals and the PRs from the reference BVP signals.
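The paper's core contribution is the Morse-wavelet sub-band decomposition, but its headline metric, the rPPG signal-to-noise ratio, is itself worth illustrating. The sketch below computes a commonly used rPPG SNR: power within a narrow band around the pulse frequency and its first harmonic versus all other power in 0.5-4 Hz. The band half-width and frequency range are illustrative assumptions, and a direct DFT is used for clarity, not speed.

```python
import math

def rppg_snr_db(signal, fps, pr_hz, band=0.1):
    """SNR of a pulse signal in dB: energy near the pulse frequency
    (and its first harmonic) over the remaining energy in 0.5-4 Hz."""
    n = len(signal)
    sig_p, noise_p = 0.0, 0.0
    for k in range(1, n // 2):
        f = k * fps / n
        if not 0.5 <= f <= 4.0:
            continue
        # Direct DFT of bin k.
        re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        p = re * re + im * im
        if abs(f - pr_hz) <= band or abs(f - 2 * pr_hz) <= band:
            sig_p += p
        else:
            noise_p += p
    return 10.0 * math.log10(sig_p / noise_p)
```

For a 1.2 Hz pulse tone with a ten-times-weaker interferer at 3 Hz, the metric returns roughly 20 dB, matching the 100:1 power ratio.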
Affiliation(s)
- Miha Finžgar
- Faculty of Mechanical Engineering, University of Ljubljana, Ljubljana, Slovenia
- Primož Podržaj
- Faculty of Mechanical Engineering, University of Ljubljana, Ljubljana, Slovenia
18
Moya-Albor E, Brieva J, Ponce H, Rivas-Scott O, Gomez-Pena C. Heart Rate Estimation using Hermite Transform Video Magnification and Deep Learning. Annu Int Conf IEEE Eng Med Biol Soc 2018; 2018:2595-2598. [PMID: 30440939 DOI: 10.1109/embc.2018.8512879] [Citation(s) in RCA: 4] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/06/2022]
Abstract
Monitoring of heart rate can be used in many medical and sports applications. Lack of portability and connection problems make traditional monitoring methods difficult to use outside of clinical environments. Computer vision techniques have shown that some physiological variables, such as heart rate, can be measured without contact. Video magnification is one of the approaches used for detection of the pulse signal. In this paper we propose a new strategy to magnify motion in a video sequence using the Hermite transform. In addition, a deep learning technique is implemented to estimate the beat-by-beat pulse signal. We trained the system and validated our results using an electronic pulse monitoring device. Our approach is compared with classical video magnification using a Gaussian pyramid. The results show a better enhancement of spectral information from the colour changes, allowing a more accurate estimation of the instantaneous beat-by-beat pulse than the Gaussian approach.
19
Kado S, Monno Y, Moriwaki K, Yoshizaki K, Tanaka M, Okutomi M. Remote Heart Rate Measurement from RGB-NIR Video Based on Spatial and Spectral Face Patch Selection. Annu Int Conf IEEE Eng Med Biol Soc 2018; 2018:5676-5680. [PMID: 30441624 DOI: 10.1109/embc.2018.8513464] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.8] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 11/07/2022]
Abstract
In this paper, we propose a novel heart rate (HR) estimation method using simultaneously recorded RGB and near-infrared (NIR) face videos. The key idea of our method is to automatically select suitable face patches for HR estimation in both the spatial and spectral domains. The spatial and spectral face patch selection enables us to robustly estimate HR under various situations, including scenes in which existing RGB camera-based methods fail to estimate HR accurately. For a challenging scene in low light and with light fluctuations, our method can successfully estimate HR for all 20 subjects (within ±3 beats per minute), while the RGB camera-based methods succeed for only 25% of the subjects.
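The patch-selection step can be sketched generically: given one temporal trace per candidate face patch (from any camera channel, RGB or NIR), rank them by a quality score such as the per-patch SNR and average the best ones. This is a minimal stand-in for the paper's spatial/spectral selection; the function name and the keep fraction are assumptions.

```python
def combine_best_patches(patch_signals, quality, keep_fraction=0.25):
    """Keep the highest-quality fraction of patch signals and average
    them into a single pulse trace.
    patch_signals: list of equal-length temporal traces, one per patch.
    quality: one score per patch (higher is better), e.g. SNR."""
    k = max(1, int(len(patch_signals) * keep_fraction))
    ranked = sorted(range(len(patch_signals)),
                    key=lambda i: quality[i], reverse=True)
    top = ranked[:k]
    n = len(patch_signals[0])
    return [sum(patch_signals[i][t] for i in top) / k for t in range(n)]
```

Averaging only the well-scoring patches is what lets such methods survive scenes (low light, fluctuating light) where a fixed region of interest fails.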
20
Yazdani S, Fallet S, Vesin JM. A Novel Short-Term Event Extraction Algorithm for Biomedical Signals. IEEE Trans Biomed Eng 2018. [DOI: 10.1109/tbme.2017.2718179] [Citation(s) in RCA: 33] [Impact Index Per Article: 5.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/06/2022]
21
Prakash SKA, Tucker CS. Bounded Kalman filter method for motion-robust, non-contact heart rate estimation. Biomed Opt Express 2018; 9:873-897. [PMID: 29552419 PMCID: PMC5854085 DOI: 10.1364/boe.9.000873] [Citation(s) in RCA: 13] [Impact Index Per Article: 2.2] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Received: 11/02/2017] [Revised: 01/19/2018] [Accepted: 01/19/2018] [Indexed: 06/08/2023]
Abstract
The authors of this work present real-time measurement of heart rate across different lighting conditions and motion categories. This is an advancement over existing remote photoplethysmography (rPPG) methods that require a static, controlled environment for heart rate detection, making them impractical for real-world scenarios in which a patient may be in motion or remotely connected to a healthcare provider through telehealth technologies. The algorithm aims to minimize motion artifacts such as blurring and noise due to head movements (uniform, random) by employing (i) a blur identification and denoising algorithm for each frame and (ii) a bounded Kalman filter technique for motion estimation and feature tracking. A case study demonstrates the feasibility of the algorithm in non-contact estimation of the pulse rate of subjects performing everyday head and body movements. The method in this paper outperforms state-of-the-art rPPG methods in heart rate detection, as revealed by the benchmarked results.
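The bounding idea can be illustrated on a scalar track: a standard Kalman update, but with the innovation clamped to a multiple of the predicted standard deviation so a single outlier measurement (e.g. a feature point lost to motion blur) cannot drag the estimate away. This is a simplified, one-dimensional take on the technique; the random-walk motion model and the values of q, r and bound are illustrative assumptions.

```python
import math

def bounded_kalman(measurements, q=1e-3, r=0.05, bound=3.0):
    """Scalar Kalman filter with a bounded innovation.
    q: process noise, r: measurement noise, bound: clamp in predicted sigmas."""
    x, p = measurements[0], 1.0          # state estimate and its variance
    estimates = []
    for z in measurements:
        p += q                           # predict (random-walk model)
        limit = bound * math.sqrt(p + r) # innovation gate
        innovation = max(-limit, min(limit, z - x))
        gain = p / (p + r)               # Kalman gain
        x += gain * innovation           # bounded update
        p *= 1.0 - gain
        estimates.append(x)
    return estimates
```

With an unbounded filter, one gross outlier in an otherwise constant track produces a visible jump in the estimate; with the clamp, the estimate barely moves and recovers immediately.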
Affiliation(s)
- Sakthi Kumar Arul Prakash
- Department of Industrial and Manufacturing Engineering, Pennsylvania State University, State College, Pennsylvania 16801, USA
- Conrad S. Tucker
- School of Engineering Design, Technology and Professional Programs (SEDTAPP), Department of Industrial and Manufacturing Engineering, Pennsylvania State University, State College, Pennsylvania 16801, USA
22
Abstract
Embedded systems control and monitor a great deal of our reality. While some "classic" features are intrinsically necessary, such as low power consumption, rugged operating ranges, fast response and low cost, these systems have evolved in the last few years to emphasize connectivity functions, thus contributing to the Internet of Things paradigm. A myriad of sensing/computing devices are being attached to everyday objects, each able to send and receive data and to act as a unique node in the Internet. Apart from the obvious necessity to process at least some data at the edge (to increase security and reduce power consumption and latency), a major breakthrough will arguably come when such devices are endowed with some level of autonomous "intelligence". Intelligent computing aims to solve problems for which no efficient exact algorithm can exist or for which we cannot conceive an exact algorithm. Central to such intelligence is Computer Vision (CV), i.e., extracting meaning from images and video. While not everything needs CV, visual information is the richest source of information about the real world: people, places and things. The possibilities of embedded CV are endless if we consider new applications and technologies, such as deep learning, drones, home robotics, intelligent surveillance, intelligent toys, wearable cameras, etc. This paper describes the Eyes of Things (EoT) platform, a versatile computer vision platform tackling those challenges and opportunities.
23
Wang W, den Brinker AC, Stuijk S, de Haan G. Robust heart rate from fitness videos. Physiol Meas 2017; 38:1023-1044. [DOI: 10.1088/1361-6579/aa6d02] [Citation(s) in RCA: 70] [Impact Index Per Article: 10.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/11/2022]
24
van Gastel M, Stuijk S, de Haan G. Robust respiration detection from remote photoplethysmography. Biomed Opt Express 2016; 7:4941-4957. [PMID: 28018717 PMCID: PMC5175543 DOI: 10.1364/boe.7.004941] [Citation(s) in RCA: 44] [Impact Index Per Article: 5.5] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Subscribe] [Scholar Register] [Received: 07/15/2016] [Revised: 09/30/2016] [Accepted: 10/06/2016] [Indexed: 05/10/2023]
Abstract
Continuous monitoring of respiration is essential for early detection of critical illness. Current methods require sensors attached to the body and/or are not robust to subject motion. Alternative camera-based solutions have been presented using motion vectors and remote photoplethysmography. In this work, we present a non-contact camera-based method to detect respiration, which can operate in both visible and dark lighting conditions by detecting the respiratory-induced colour differences of the skin. We make use of the close similarity between skin colour variations caused by the beating of the heart and those caused by respiration, leading to a much improved signal quality compared to single-channel approaches. Essentially, we propose to find the linear combination of colour channels which best suppresses the distortions in a frequency band including the pulse rate, and subsequently we use this same linear combination to extract the respiratory signal in a lower frequency band. Evaluation results obtained from recordings of healthy subjects who perform challenging scenarios, including motion, show that respiration can be accurately detected over the entire range of respiratory frequencies, with a correlation coefficient of 0.96 in visible light and 0.98 in infrared, compared to 0.86 with the best-performing non-contact benchmark algorithm. Furthermore, evaluation on a set of videos recorded in a Neonatal Intensive Care Unit (NICU) shows that this technique is promising as a future alternative to current contact sensors, with a correlation coefficient of 0.87.
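The core idea above, choose channel weights in the pulse band and reuse them in the respiration band, can be sketched as a toy. The candidate weight set, the band edges, the in-band-energy score, and the exhaustive search are illustrative simplifications of the paper's optimal linear combination, and a direct O(n²) DFT stands in for a proper filter.

```python
import cmath
import math

def dft_bandpass(x, fps, lo, hi):
    """Zero every DFT bin whose frequency is outside [lo, hi] Hz."""
    n = len(x)
    X = [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
         for k in range(n)]
    for k in range(n):
        f = min(k, n - k) * fps / n          # keep conjugate symmetry
        if not lo <= f <= hi:
            X[k] = 0
    return [(sum(X[k] * cmath.exp(2j * math.pi * k * t / n)
                 for k in range(n)) / n).real for t in range(n)]

def respiration_from_rgb(rgb, fps):
    """Pick the channel weighting with the most relative energy in a
    pulse band (0.7-3 Hz), then reuse it in a respiration band (0.1-0.7 Hz).
    rgb: list of (r, g, b) mean skin values, one per frame."""
    candidates = [(1, -1, 0), (1, 0, -1), (0, 1, -1), (2, -1, -1)]

    def mix(w):
        return [w[0] * r + w[1] * g + w[2] * b for r, g, b in rgb]

    def energy(x):
        return sum(v * v for v in x)

    def pulse_score(w):
        s = mix(w)
        return energy(dft_bandpass(s, fps, 0.7, 3.0)) / (energy(s) + 1e-12)

    best = max(candidates, key=pulse_score)
    return dft_bandpass(mix(best), fps, 0.1, 0.7)
```

On synthetic channels where one weighting cancels a shared distortion while preserving the pulse, the same weighting cleanly recovers the slower respiratory component.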
Affiliation(s)
- Mark van Gastel
- Department of Electrical Engineering, Eindhoven University of Technology, PO Box 513, 5600MB, Eindhoven, The Netherlands
- Sander Stuijk
- Department of Electrical Engineering, Eindhoven University of Technology, PO Box 513, 5600MB, Eindhoven, The Netherlands
- Gerard de Haan
- Department of Electrical Engineering, Eindhoven University of Technology, PO Box 513, 5600MB, Eindhoven, The Netherlands
- Philips Research, High Tech Campus 36, 5656AE, Eindhoven, The Netherlands