1. Valentim AF, Motta AR, Silva JAS, Furlan RMMM, Porto MP, Becker HMG, Franco LP, Gama ACC. Comparison of infrared thermography of the face between mouth-breathing and nasal-breathing children. Eur Arch Otorhinolaryngol 2024. PMID: 39433570. DOI: 10.1007/s00405-024-09038-5.
Abstract
PURPOSE: To compare the temperature of thermoanatomic points and of the upper- and lower-lip areas between mouth-breathing and nasal-breathing children. METHODS: This cross-sectional observational study included 30 nasal-breathing and 30 mouth-breathing children aged 4 to 11 years. One front-view, one left-view, and one right-view infrared thermogram of the face were acquired from each participant. Fourteen thermoanatomic points plus the upper- and lower-lip areas were marked on the front-view thermograms, and six thermoanatomic points were marked on the side-view thermograms. The study also calculated the temperature difference between the upper- and lower-lip areas (∆T area) and between the points on the upper and lower lips (∆T points). Normalized mean temperatures of points and areas, and the temperature differences, were compared between groups with the t-test and the Mann-Whitney test. RESULTS: The temperatures of the thermoanatomic points closest to the lips (nasolabial, labial commissure, and lower labial), of the lip areas, and of the external acoustic meatus were lower in mouth breathers than in nasal breathers, whereas most other points showed no group difference. ∆T area and ∆T points did not differ between the groups. CONCLUSION: Thermography is a promising complementary diagnostic tool, since it showed that mouth-breathing children had lower temperatures in the lip region than nasal breathers.
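The two-group comparison described in the abstract (t-test or Mann-Whitney depending on distribution) can be sketched as follows. This is a minimal illustration on synthetic temperature data, not the authors' code; the group sizes match the study but the values are invented.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Illustrative synthetic lip-area temperatures (°C), 30 children per group
nasal = rng.normal(34.5, 0.4, 30)
mouth = rng.normal(34.1, 0.4, 30)

def compare_groups(a, b, alpha=0.05):
    """Use Student's t-test when both samples look normal (Shapiro-Wilk),
    otherwise fall back to the Mann-Whitney U test."""
    normal = (stats.shapiro(a).pvalue > alpha) and (stats.shapiro(b).pvalue > alpha)
    if normal:
        name, res = "t-test", stats.ttest_ind(a, b)
    else:
        name, res = "Mann-Whitney", stats.mannwhitneyu(a, b)
    return name, res.pvalue

test_name, p = compare_groups(nasal, mouth)
print(test_name, round(p, 4))
```

The normality check deciding between the parametric and non-parametric test is a common convention; the paper does not state its exact decision rule.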
Affiliation(s)
- Amanda Freitas Valentim
- Children and Adolescent Health, Federal University of Minas Gerais - UFMG, Belo Horizonte, Minas Gerais, Brazil
- Andréa Rodrigues Motta
- Department of Speech-Language-Hearing Sciences, Federal University of Minas Gerais - UFMG, Belo Horizonte, Minas Gerais, Brazil
- Matheus Pereira Porto
- Department of Mechanical Engineering, Federal University of Minas Gerais - UFMG, Belo Horizonte, Minas Gerais, Brazil
- Letícia Paiva Franco
- Department of Otorhinolaryngology, Federal University of Minas Gerais - UFMG, Belo Horizonte, Minas Gerais, Brazil
- Ana Cristina Côrtes Gama
- Department of Speech-Language-Hearing Sciences, Federal University of Minas Gerais - UFMG, Belo Horizonte, Minas Gerais, Brazil
2. Liu I, Liu F, Zhong Q, Ma F, Ni S. Your blush gives you away: detecting hidden mental states with remote photoplethysmography and thermal imaging. PeerJ Comput Sci 2024;10:e1912. PMID: 38660202. PMCID: PMC11041963. DOI: 10.7717/peerj-cs.1912.
Abstract
Multimodal emotion recognition techniques are increasingly essential for assessing mental states. Image-based methods, however, tend to focus predominantly on overt visual cues and often overlook subtler mental state changes. Psychophysiological research has demonstrated that heart rate (HR) and skin temperature are effective in detecting autonomic nervous system (ANS) activities, thereby revealing these subtle changes. However, traditional HR tools are generally more costly and less portable, while skin temperature analysis usually necessitates extensive manual processing. Advances in remote photoplethysmography (r-PPG) and automatic thermal region of interest (ROI) detection algorithms have been developed to address these issues, yet their accuracy in practical applications remains limited. This study aims to bridge this gap by integrating r-PPG with thermal imaging to enhance prediction performance. Ninety participants completed a 20-min questionnaire to induce cognitive stress, followed by watching a film aimed at eliciting moral elevation. The results demonstrate that the combination of r-PPG and thermal imaging effectively detects emotional shifts. Using r-PPG alone, the prediction accuracy was 77% for cognitive stress and 61% for moral elevation, as determined by a support vector machine (SVM). Thermal imaging alone achieved 79% accuracy for cognitive stress and 78% for moral elevation, utilizing a random forest (RF) algorithm. An early fusion strategy of these modalities significantly improved accuracies, achieving 87% for cognitive stress and 83% for moral elevation using RF. Further analysis, which utilized statistical metrics and explainable machine learning methods including SHapley Additive exPlanations (SHAP), highlighted key features and clarified the relationship between cardiac responses and facial temperature variations. Notably, cardiovascular features derived from r-PPG models had a more pronounced influence in data fusion, despite thermal imaging's higher predictive accuracy in unimodal analysis.
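An early-fusion pipeline of the kind described above can be sketched as below: feature vectors from the two modalities are concatenated before a single classifier is trained. The data, feature dimensions, and the induced class separation are synthetic placeholders, not the study's features or results.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n = 90  # participants, as in the study; everything else is invented
# Placeholder features: r-PPG-derived cardiac features and thermal ROI features
X_rppg = rng.normal(size=(n, 8))
X_thermal = rng.normal(size=(n, 6))
y = rng.integers(0, 2, n)  # stress vs. baseline labels
# Shift one feature per modality so the toy classes are separable
X_rppg[y == 1, 0] += 2.0
X_thermal[y == 1, 0] += 2.0

# Early fusion: concatenate the modalities into one feature matrix
X_fused = np.hstack([X_rppg, X_thermal])
clf = RandomForestClassifier(n_estimators=200, random_state=0)
acc = cross_val_score(clf, X_fused, y, cv=5).mean()
print(f"fused CV accuracy: {acc:.2f}")
```

Early fusion (concatenating features before training) contrasts with late fusion, which would combine the predictions of per-modality classifiers instead.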
Affiliation(s)
- Ivan Liu
- Faculty of Psychology, Beijing Normal University, Beijing, China
- Department of Psychology, Faculty of Arts and Sciences, Beijing Normal University at Zhuhai, Zhuhai, Guangdong, China
- Fangyuan Liu
- Department of Psychology, Faculty of Arts and Sciences, Beijing Normal University at Zhuhai, Zhuhai, Guangdong, China
- Qi Zhong
- Faculty of Psychology, Beijing Normal University, Beijing, China
- Fei Ma
- Guangdong Laboratory of Artificial Intelligence and Digital Economy (SZ), Shenzhen, Guangdong, China
- Shiguang Ni
- Shenzhen International Graduate School, Tsinghua University, Shenzhen, Guangdong, China
3. Ohigashi S, Sakata C, Kuroshima H, Moriguchi Y. Psychophysiological responses of shame in young children: A thermal imaging study. PLoS One 2023;18:e0290966. PMID: 37812601. PMCID: PMC10561869. DOI: 10.1371/journal.pone.0290966.
Abstract
Shame can be defined as the emotional response to having one's rule violations exposed to others. However, this concept is difficult to measure objectively. This study examined psychophysiological indicators of shame in young children using behavioral methods and thermography, which measures facial temperatures reflecting emotion-related changes in blood flow. Four- to six-year-old children participated in an "animal guessing game," in which they lied about having violated a rule. They were assigned to either the exposure or the non-exposure group: in the exposure group, participants' lies were exposed by the experimenter, whereas in the non-exposure group they were not. At the behavioral level, participants in the exposure group expressed characteristic shame behaviors (e.g., embarrassed smiles) more often than those in the non-exposure group. Moreover, after the lie was exposed, the nasal temperatures of participants in the exposure group were higher than those of the other group. These results suggest that having their lies exposed induced psychophysiological responses and consequently raised nasal temperature, indicating that psychophysiological responses can enable objective measurement of higher-order emotions in young children.
Affiliation(s)
- Sho Ohigashi
- Graduate School of Letters, Kyoto University, Yoshidahonmachi, Kyoto, Japan
- Chifumi Sakata
- Graduate School of Letters, Kyoto University, Yoshidahonmachi, Kyoto, Japan
- Hika Kuroshima
- Graduate School of Letters, Kyoto University, Yoshidahonmachi, Kyoto, Japan
- Yusuke Moriguchi
- Graduate School of Letters, Kyoto University, Yoshidahonmachi, Kyoto, Japan
4. Li L, Tang W, Yang H, Xue C. Classification of User Emotional Experiences on B2C Websites Utilizing Infrared Thermal Imaging. Sensors (Basel) 2023;23:7991. PMID: 37766045. PMCID: PMC10534612. DOI: 10.3390/s23187991.
Abstract
The acquisition of physiological signals for analyzing emotional experiences has typically been intrusive and can yield inaccurate results. This study employed infrared thermal images (IRTIs), a noninvasive technique, to classify user emotional experiences while interacting with business-to-consumer (B2C) websites. By manipulating the usability and aesthetics of B2C websites, facial thermal images of 24 participants were captured as they engaged with the different websites. Machine learning techniques were leveraged to classify their emotional experiences, with participants' self-assessments serving as the ground truth. The findings revealed significant fluctuations in emotional valence while arousal levels remained consistent, enabling the categorization of emotional experiences into positive and negative states. The support vector machine (SVM) model performed well in distinguishing between baseline and emotional experiences. Furthermore, this study identified key regions of interest (ROIs) and effective classification features for machine learning. These findings establish a significant connection between user emotional experiences and IRTIs and broaden the research perspective on the utility of IRTIs in emotion analysis.
Affiliation(s)
- Lanxin Li
- School of Mechanical Engineering, Southeast University, 2 Southeast University Road, Nanjing 211189, China; (L.L.); (W.T.)
| | - Wenzhe Tang
- School of Mechanical Engineering, Southeast University, 2 Southeast University Road, Nanjing 211189, China; (L.L.); (W.T.)
| | - Han Yang
- School of Instrument Science and Engineering, Southeast University, 2 Southeast University Road, Nanjing 211189, China;
| | - Chengqi Xue
- School of Mechanical Engineering, Southeast University, 2 Southeast University Road, Nanjing 211189, China; (L.L.); (W.T.)
| |
5. Rudokaite J, Ong LLS, Onal Ertugrul I, Janssen MP, Huis In 't Veld EMJ. Predicting vasovagal reactions to needles with anticipatory facial temperature profiles. Sci Rep 2023;13:9667. PMID: 37316637. DOI: 10.1038/s41598-023-36207-z.
Abstract
Around one-third of adults are scared of needles, which can result in adverse emotional and physical responses such as dizziness and fainting (vasovagal reactions; VVR) and, consequently, avoidance of healthcare, treatments, and immunizations. Unfortunately, most people are not aware of vasovagal reactions until they escalate, at which point it is too late to intervene. This study investigates whether facial temperature profiles measured in the waiting room, prior to a blood donation, can be used to classify who will and will not experience VVR during the donation. Average temperature profiles from six facial regions were extracted from pre-donation recordings of 193 blood donors, and machine learning was used to classify whether a donor would experience low or high levels of VVR during the donation. An XGBoost classifier distinguished these vasovagal groups from the early facial temperature data with a sensitivity of 0.87, specificity of 0.84, F1 score of 0.86, and PR-AUC of 0.93. Temperature fluctuations in the areas under the nose, on the chin, and on the forehead had the highest predictive value. This study is the first to demonstrate that vasovagal responses during a blood donation can be classified using pre-donation temperature profiles.
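The reported metrics (sensitivity, specificity, F1) can all be computed from the confusion-matrix counts, as sketched below. The labels are a toy example, not the study's data.

```python
import numpy as np

def classification_metrics(y_true, y_pred):
    """Sensitivity (recall), specificity, and F1 score for binary labels."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = np.sum((y_true == 1) & (y_pred == 1))  # true positives
    tn = np.sum((y_true == 0) & (y_pred == 0))  # true negatives
    fp = np.sum((y_true == 0) & (y_pred == 1))  # false positives
    fn = np.sum((y_true == 1) & (y_pred == 0))  # false negatives
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    precision = tp / (tp + fp)
    f1 = 2 * precision * sensitivity / (precision + sensitivity)
    return sensitivity, specificity, f1

# Toy example: 8 donors, 1 = high-VVR
y_true = [1, 1, 1, 1, 0, 0, 0, 0]
y_pred = [1, 1, 1, 0, 0, 0, 0, 1]
sens, spec, f1 = classification_metrics(y_true, y_pred)
print(sens, spec, round(f1, 3))  # 0.75 0.75 0.75
```

PR-AUC, the fourth reported metric, additionally requires continuous classifier scores rather than hard labels, so it is omitted from this sketch.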
Affiliation(s)
- Judita Rudokaite
- Department of Cognitive Science and Artificial Intelligence, Tilburg University, Tilburg, The Netherlands
- Donor Medicine Research, Sanquin Research, Amsterdam, The Netherlands
- Department of Cognitive Science & Artificial Intelligence, Tilburg University, PO Box 90153, Warandelaan 2 (Room D147), 5000 LE, Tilburg, The Netherlands
- L L Sharon Ong
- Department of Cognitive Science and Artificial Intelligence, Tilburg University, Tilburg, The Netherlands
- Itir Onal Ertugrul
- Department of Information and Computing Sciences, Utrecht University, Utrecht, The Netherlands
- Mart P Janssen
- Donor Medicine Research, Sanquin Research, Amsterdam, The Netherlands
- Elisabeth M J Huis In 't Veld
- Department of Cognitive Science and Artificial Intelligence, Tilburg University, Tilburg, The Netherlands
- Donor Medicine Research, Sanquin Research, Amsterdam, The Netherlands
6. Al Qudah M, Mohamed A, Lutfi S. Analysis of Facial Occlusion Challenge in Thermal Images for Human Affective State Recognition. Sensors (Basel) 2023;23:3513. PMID: 37050571. PMCID: PMC10098690. DOI: 10.3390/s23073513.
Abstract
Several studies have used both visual and thermal facial images to identify human affective states. Despite the advantages of thermal facial images in recognizing spontaneous affect, few studies have focused on facial occlusion challenges in thermal images, particularly occlusion by eyeglasses and facial hair. This paper therefore proposes three classification models to address thermal occlusion in facial images, classifying six basic spontaneous emotions. The first model is based on six main facial regions, including the forehead, tip of the nose, cheeks, mouth, and chin. The second model deconstructs these regions into multiple subregions to investigate their efficacy in recognizing affective state. The third model uses selected facial subregions that are free of eyeglasses and facial hair (beard, mustache). Nine statistical features are computed on apex and onset thermal images, and four feature selection techniques with two classification algorithms are evaluated for further investigation. According to the comparative analysis presented here, the results obtained from the three proposed models were promising and comparable to those of other studies.
Affiliation(s)
- Mustafa Al Qudah
- School of Computer Sciences, Universiti Sains Malaysia, Gelugor 11800, Penang, Malaysia
- Department of Computer Science, College of Science and Humanities in Al-Sulail, Prince Sattam bin Abdulaziz University, Kharj 16278, Saudi Arabia
- Ahmad Mohamed
- School of Computer Sciences, Universiti Sains Malaysia, Gelugor 11800, Penang, Malaysia
- Syaheerah Lutfi
- School of Computer Sciences, Universiti Sains Malaysia, Gelugor 11800, Penang, Malaysia
7. Stanić V, Žnidarič T, Repovš G, Geršak G. Dynamic Seat Assessment for Enabled Restlessness of Children with Learning Difficulties. Sensors (Basel) 2022;22:3170. PMID: 35590861. PMCID: PMC9099863. DOI: 10.3390/s22093170.
Abstract
Children with Attention-Deficit/Hyperactivity Disorder (ADHD) face a range of learning difficulties in the school environment; thus, several strategies have been developed to enhance or optimise their performance in school. One possibility is to actively enable appropriate restlessness using dynamic seats. This paper assesses the efficacy of a dynamic seat during a school task, compared with a classic chair and a therapy ball. To test the effectiveness of the active seat, a study was designed that examined task-solving performance while observing movement intensity, in-seat behaviour, and psychophysiological responses (electrodermal activity, facial temperature). A total of 23 school-aged children participated: 11 children with the combined type of ADHD and 12 children without disorders. Children with ADHD achieved the best results when sitting in the active seat, where the most intense movement and the best in-seat behaviour were observed. At the same time, the psychophysiological parameters indicate that when performing better at the task, children with ADHD were not overly challenged and were consequently less agitated. The results suggest that for better cognitive performance in children with ADHD, it is crucial to provide a comfortable and pleasant workspace that allows them the right amount of restlessness.
Affiliation(s)
- Valentina Stanić
- Faculty of Electrical Engineering, University of Ljubljana, 1000 Ljubljana, Slovenia
- Taja Žnidarič
- Department of Psychology, Faculty of Arts, University of Ljubljana, 1000 Ljubljana, Slovenia
- Grega Repovš
- Department of Psychology, Faculty of Arts, University of Ljubljana, 1000 Ljubljana, Slovenia
- Gregor Geršak
- Faculty of Electrical Engineering, University of Ljubljana, 1000 Ljubljana, Slovenia
8. Rudokaite J, Ong LLS, Janssen MP, Postma E, Huis In 't Veld E. Predicting vasovagal reactions to a virtual blood donation using facial image analysis. Transfusion 2022;62:838-847. PMID: 35191034. PMCID: PMC9306567. DOI: 10.1111/trf.16832.
Abstract
Background: People with needle fear experience not only anxiety and stress but also vasovagal reactions (VVR), including nausea, dizziness, sweating, pallor changes, or even fainting. However, the mechanisms behind needle fear and the VVR response are not yet well understood. The aim of our study was to explore whether fluctuations in facial temperature in several facial regions are related to the level of experienced vasovagal reactions in a simulated blood donation. Study design and methods: We recruited 45 students at Tilburg University and filmed them throughout a virtual blood donation procedure using an infrared thermal imaging (ITI) camera. Participants reported their fear of needles and level of experienced vasovagal reactions. ITI data pre-processing was completed on each video frame by detecting facial landmarks and aligning images before extracting the mean temperature from six regions of interest. Results: Temperatures of the chin and the left and right cheek areas increased during the virtual blood donation. Mixed-effects linear regression showed a significant association between self-reported vasovagal reactions and temperature fluctuations in the area below the nose. Discussion: Our results suggest that the area below the nose may be an interesting target for measuring vasovagal reactions using video imaging techniques. This is the first in a line of studies assessing whether levels of fear and vasovagal reactions can be detected automatically from facial imaging, from which the development of e-health solutions and interventions can benefit.
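The within-subject association that a mixed-effects model estimates can be roughly approximated by removing each participant's mean (a fixed per-subject intercept) and regressing the pooled, centered data; a full random-effects fit would use a dedicated library. All variable names and data below are illustrative, not the study's.

```python
import numpy as np

rng = np.random.default_rng(1)
n_subj, n_frames = 45, 20
subj = np.repeat(np.arange(n_subj), n_frames)
# Illustrative latent VVR signal and below-the-nose temperature per frame
vvr = rng.normal(size=n_subj * n_frames)
baseline = rng.normal(34.0, 0.3, n_subj)[subj]  # subject-specific intercepts
temp = baseline + 0.15 * vvr + rng.normal(0, 0.05, n_subj * n_frames)

def centered_slope(x, y, groups):
    """Within-subject centering, then a pooled least-squares slope."""
    xc, yc = x.astype(float).copy(), y.astype(float).copy()
    for g in np.unique(groups):
        m = groups == g
        xc[m] -= xc[m].mean()
        yc[m] -= yc[m].mean()
    return float(np.dot(xc, yc) / np.dot(xc, xc))

slope = centered_slope(vvr, temp, subj)
print(round(slope, 3))
```

This crude approach absorbs between-subject temperature differences but, unlike a true mixed-effects model, cannot model random slopes or unbalanced-group shrinkage.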
Affiliation(s)
- Judita Rudokaite
- Department of Cognitive Science and Artificial Intelligence, Tilburg University, Tilburg, the Netherlands
- Department of Donor Medicine Research, Sanquin, Amsterdam, the Netherlands
- Lee-Ling Sharon Ong
- Department of Cognitive Science and Artificial Intelligence, Tilburg University, Tilburg, the Netherlands
- Mart P Janssen
- Department of Donor Medicine Research, Sanquin, Amsterdam, the Netherlands
- Eric Postma
- Department of Cognitive Science and Artificial Intelligence, Tilburg University, Tilburg, the Netherlands
- Elisabeth Huis In 't Veld
- Department of Cognitive Science and Artificial Intelligence, Tilburg University, Tilburg, the Netherlands
- Department of Donor Medicine Research, Sanquin, Amsterdam, the Netherlands
9. Gioia F, Greco A, Callara AL, Scilingo EP. Towards a Contactless Stress Classification Using Thermal Imaging. Sensors (Basel) 2022;22:976. PMID: 35161722. PMCID: PMC8839779. DOI: 10.3390/s22030976.
Abstract
Thermal cameras capture the infrared radiation emitted from a body in a contactless manner and can provide an indirect estimation of autonomic nervous system (ANS) dynamics through the regulation of skin temperature. This study investigates the contribution of thermal imaging to effective automatic stress detection, with a view toward a contactless stress recognition system. To this aim, we recorded both ANS correlates (cardiac, electrodermal, and respiratory activity) and thermal images from 25 volunteers under acute stress induced by the Stroop test. We conducted a statistical analysis of the features extracted from each signal and implemented subject-independent classification based on a support vector machine model with an embedded recursive feature elimination algorithm. In particular, we trained three classifiers using different feature sets: the full set of features, only those derived from the peripheral autonomic correlates, and only those derived from the thermal images. Classification accuracy and feature selection results confirmed the relevant contribution of the thermal features to the acute stress detection task. Indeed, a combination of ANS correlates and thermal features achieved 97.37% accuracy. Moreover, using only thermal features we could still successfully detect stress with an accuracy of 86.84% in a contact-free manner.
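A support vector machine with embedded recursive feature elimination, as described, can be sketched with scikit-learn's `RFE` wrapper around a linear SVM. The features, labels, and which features are informative are synthetic stand-ins, not the study's data.

```python
import numpy as np
from sklearn.feature_selection import RFE
from sklearn.svm import LinearSVC

rng = np.random.default_rng(7)
n, d = 100, 20  # e.g. several analysis windows per volunteer, 20 features
X = rng.normal(size=(n, d))
y = rng.integers(0, 2, n)  # stress vs. rest labels
X[y == 1, :3] += 1.5       # make the first three features informative

# Recursive feature elimination: repeatedly fit the SVM and drop the
# features with the smallest absolute weights until 5 remain
svm = LinearSVC(C=1.0, max_iter=10000)
rfe = RFE(estimator=svm, n_features_to_select=5).fit(X, y)
selected = np.flatnonzero(rfe.support_)
print("selected features:", selected)
```

In the subject-independent setting the paper describes, the cross-validation folds would additionally be split by volunteer so no subject appears in both training and test data.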
Affiliation(s)
- Federica Gioia
- Dipartimento di Ingegneria dell’Informazione, University of Pisa, 56122 Pisa, Italy
- Research Center “E. Piaggio”, University of Pisa, 56122 Pisa, Italy
- Alberto Greco
- Dipartimento di Ingegneria dell’Informazione, University of Pisa, 56122 Pisa, Italy
- Research Center “E. Piaggio”, University of Pisa, 56122 Pisa, Italy
- Alejandro Luis Callara
- Dipartimento di Ingegneria dell’Informazione, University of Pisa, 56122 Pisa, Italy
- Research Center “E. Piaggio”, University of Pisa, 56122 Pisa, Italy
- Enzo Pasquale Scilingo
- Dipartimento di Ingegneria dell’Informazione, University of Pisa, 56122 Pisa, Italy
- Research Center “E. Piaggio”, University of Pisa, 56122 Pisa, Italy
10. Uluer P, Kose H, Gumuslu E, Barkana DE. Experience with an Affective Robot Assistant for Children with Hearing Disabilities. Int J Soc Robot 2021;15:643-660. PMID: 34804256. PMCID: PMC8594648. DOI: 10.1007/s12369-021-00830-5.
Abstract
This study presents an assistive robotic system enhanced with emotion recognition capabilities for children with hearing disabilities. The system is designed and developed for audiometry tests and rehabilitation of children in a clinical setting and includes a social humanoid robot (Pepper), an interactive interface, gamified audiometry tests, a sensory setup, and a machine/deep-learning-based emotion recognition module. Three scenarios involving a conventional setup, a tablet setup, and a robot+tablet setup were evaluated with 16 children with cochlear implants or hearing aids. Several machine learning techniques and deep learning models were used to classify the three test setups and the children's emotions (pleasant, neutral, unpleasant) from physiological signals recorded by an E4 wristband. The results show that the signals collected during the tests can be separated successfully and that children's positive and negative emotions can be better distinguished when they interact with the robot than in the other two setups. In addition, the children's objective and subjective evaluations, as well as their impressions of the robot and its emotional behaviors, are analyzed and discussed extensively.
Affiliation(s)
- Pinar Uluer
- Department of Computer Engineering, Galatasaray University, Istanbul, Turkey
- Department of AI and Data Engineering, Istanbul Technical University, Istanbul, Turkey
- Hatice Kose
- Department of AI and Data Engineering, Istanbul Technical University, Istanbul, Turkey
- Elif Gumuslu
- Department of Electrical and Electronics Engineering, Yeditepe University, Istanbul, Turkey
- Duygun Erol Barkana
- Department of Electrical and Electronics Engineering, Yeditepe University, Istanbul, Turkey
11. Bhattacharyya A, Chatterjee S, Sen S, Sinitca A, Kaplun D, Sarkar R. A deep learning model for classifying human facial expressions from infrared thermal images. Sci Rep 2021;11:20696. PMID: 34667253. PMCID: PMC8526608. DOI: 10.1038/s41598-021-99998-z.
Abstract
The analysis of human facial expressions from thermal images captured by Infrared Thermal Imaging (IRTI) cameras has recently gained importance relative to images captured by standard cameras using light in the visible spectrum. This is because infrared cameras work well in low-light conditions, and the infrared spectrum captures the thermal distribution, which is useful for building systems such as robot interaction systems, quantifying cognitive responses from facial expressions, disease control, and more. In this paper, a deep learning model called IRFacExNet (InfraRed Facial Expression Network) is proposed for facial expression recognition (FER) from infrared images. It uses two building blocks, a Residual unit and a Transformation unit, which extract dominant expression-specific features from the input images. The extracted features help detect the subjects' emotions accurately. The snapshot ensemble technique is adopted with a cosine annealing learning rate scheduler to improve overall performance. The performance of the proposed model has been evaluated on a publicly available dataset, the IRDatabase developed by RWTH Aachen University, whose facial expressions are Fear, Anger, Contempt, Disgust, Happy, Neutral, Sad, and Surprise. The proposed model achieves 88.43% recognition accuracy, better than some state-of-the-art methods considered for comparison. The model provides a robust framework for detecting expressions accurately in the absence of visible light.
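The cosine annealing schedule used for snapshot ensembling follows the standard warm-restart form lr(t) = (lr_max / 2) · (1 + cos(π · (t mod T) / T)), restarting every T steps so a model snapshot can be saved at each minimum. A minimal sketch, with an illustrative lr_max and cycle length (the paper's hyperparameters are not given here):

```python
import math

def cosine_annealing_lr(step, cycle_len, lr_max=0.01):
    """Cosine-annealed learning rate with warm restarts every `cycle_len`
    steps, as used in snapshot ensembling (one snapshot per cycle)."""
    t = step % cycle_len  # position within the current cycle
    return 0.5 * lr_max * (1.0 + math.cos(math.pi * t / cycle_len))

cycle = 100
lrs = [cosine_annealing_lr(s, cycle) for s in range(300)]
print(lrs[0], round(lrs[50], 5), lrs[100])
```

Each cycle drives the learning rate from lr_max to near zero and back; the snapshots saved at the low-rate points are later averaged or their predictions ensembled.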
Affiliation(s)
- Somnath Chatterjee
- Computer Science and Engineering Department, Future Institute of Engineering and Management, Kolkata, India
- Shibaprasad Sen
- Computer Science and Technology Department, University of Engineering and Management, Kolkata, India
- Aleksandr Sinitca
- Department of Automation and Control Processes, Saint Petersburg Electrotechnical University "LETI", Saint Petersburg, Russia
- Dmitrii Kaplun
- Department of Automation and Control Processes, Saint Petersburg Electrotechnical University "LETI", Saint Petersburg, Russia
- Ram Sarkar
- Department of Computer Science and Engineering, Jadavpur University, Kolkata, India
12. The use of infrared thermal imaging in tonometry with a Scheimpflug camera. J Therm Biol 2021;96:102823. PMID: 33627263. DOI: 10.1016/j.jtherbio.2020.102823.
Abstract
Infrared thermal imaging is currently used in almost every field of medicine. This paper presents a novel use of thermography in ophthalmology: using a thermal camera to assess correct intraocular pressure measurement depending on the position of the patient's head during non-contact tonometry. For the analysed group of 10 healthy subjects, thermographic images of the face were recorded before and after intraocular pressure testing. Pressure was tested with a non-contact tonometer with a Scheimpflug camera. For the 20 acquired 2D images (thermograms), analysis of characteristic facial areas determined temperature changes where the patient's face was in contact with the tonometer frame. Analysis and processing of the acquired thermograms were carried out in MATLAB® with the Image Processing Toolbox. The results clearly showed a decrease in facial temperature where the face was in contact with the tonometer supports. These temperature changes provide valuable information about the correct position of the patient's head in the device, which directly translates into measurement quality. Therefore, analysing changes in facial temperature both before and after the examination can serve as a tool for assessing correct patient positioning in the tonometer supports.
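The before/after region-of-interest comparison can be sketched as below: a mean temperature is extracted from a rectangular area of each thermogram and differenced. The study used MATLAB's Image Processing Toolbox; this Python/NumPy version is an equivalent illustration on synthetic arrays, with invented image size, ROI coordinates, and cooling magnitude.

```python
import numpy as np

def roi_mean_temp(thermogram, top, left, height, width):
    """Mean temperature (°C) inside a rectangular region of interest."""
    return float(thermogram[top:top + height, left:left + width].mean())

rng = np.random.default_rng(3)
# Synthetic 240x320 facial thermograms before and after tonometry
before = 34.0 + 0.2 * rng.standard_normal((240, 320))
after = before.copy()
# Simulate cooling where the face touched the tonometer supports
after[100:140, 40:80] -= 1.5

roi = (100, 40, 40, 40)  # top, left, height, width
delta = roi_mean_temp(after, *roi) - roi_mean_temp(before, *roi)
print(f"ROI temperature change: {delta:.2f} °C")
```

A negative delta in the support-contact ROI, as here, is the signature the paper uses to confirm that the head was correctly positioned against the tonometer supports.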
13. Spezialetti M, Placidi G, Rossi S. Emotion Recognition for Human-Robot Interaction: Recent Advances and Future Perspectives. Front Robot AI 2020;7:532279. PMID: 33501307. PMCID: PMC7806093. DOI: 10.3389/frobt.2020.532279.
Abstract
A fascinating challenge in the field of human-robot interaction is the possibility to endow robots with emotional intelligence in order to make the interaction more intuitive, genuine, and natural. To achieve this, a critical point is the capability of the robot to infer and interpret human emotions. Emotion recognition has been widely explored in the broader fields of human-machine interaction and affective computing. Here, we report recent advances in emotion recognition, with particular regard to the human-robot interaction context. Our aim is to review the state of the art of currently adopted emotional models, interaction modalities, and classification strategies and offer our point of view on future developments and critical issues. We focus on facial expressions, body poses and kinematics, voice, brain activity, and peripheral physiological responses, also providing a list of available datasets containing data from these modalities.
Affiliation(s)
- Matteo Spezialetti: PRISCA (Intelligent Robotics and Advanced Cognitive System Projects) Laboratory, Department of Electrical Engineering and Information Technology (DIETI), University of Naples Federico II, Naples, Italy; Department of Information Engineering, Computer Science and Mathematics, University of L'Aquila, L'Aquila, Italy
- Giuseppe Placidi: AVI (Acquisition, Analysis, Visualization & Imaging) Laboratory, Department of Life, Health and Environmental Sciences (MESVA), University of L'Aquila, L'Aquila, Italy
- Silvia Rossi: PRISCA (Intelligent Robotics and Advanced Cognitive System Projects) Laboratory, Department of Electrical Engineering and Information Technology (DIETI), University of Naples Federico II, Naples, Italy
14
Csoltova E, Mehinagic E. Where Do We Stand in the Domestic Dog (Canis familiaris) Positive-Emotion Assessment: A State-of-the-Art Review and Future Directions. Front Psychol 2020;11:2131. [PMID: 33013543] [PMCID: PMC7506079] [DOI: 10.3389/fpsyg.2020.02131] [Received: 01/10/2020] [Accepted: 07/30/2020]
Abstract
Although a growing number of studies have focused on dog welfare, the field of dog positive-emotion assessment remains largely unexplored. This paper provides a state-of-the-art review and summary of the scattered and dispersed research on dog positive-emotion assessment. The review details current advances in dog positive-emotion research and which approaches, measures, methods, and techniques have been implemented so far to assess emotion perception, processing, and response. Moreover, we propose possible future research directions for assessing both short-term emotions and longer-term emotional states in dogs. The review ends by identifying and addressing some methodological limitations and pointing out further methodological research needs.
15
Marqués-Sánchez P, Liébana-Presa C, Benítez-Andrades JA, Gundín-Gallego R, Álvarez-Barrio L, Rodríguez-Gonzálvez P. Thermal Infrared Imaging to Evaluate Emotional Competences in Nursing Students: A First Approach through a Case Study. Sensors 2020;20:2502. [PMID: 32354094] [PMCID: PMC7248891] [DOI: 10.3390/s20092502] [Received: 04/08/2020] [Revised: 04/25/2020] [Accepted: 04/27/2020]
Abstract
During university nursing studies, it is important to develop emotional skills because of their impact on academic performance and the quality of patient care. Thermography is a technology that could be applied during nursing training to evaluate emotional skills. The objective was to evaluate thermography as a tool for monitoring and improving emotional skills in nursing students through a case study. The student was exposed to video and music stimuli intended to elicit different emotions. The process consisted of measuring facial temperatures during each emotion and stimulus across three phases: acclimatization, stimulus, and response. Thermographic data were acquired with a FLIR E6 camera, and the analysis was complemented with environmental data (temperature and humidity). With the video stimulus, the initial and final forehead temperatures showed different behaviour between the positive (joy: 34.5 °C to 34.5 °C) and negative (anger: 36.1 °C to 35.1 °C) emotions during the acclimatization phase, in contrast to the increase observed in the stimulus (joy: 34.7 °C to 35.0 °C; anger: 35.0 °C to 35.0 °C) and response phases (joy: 35.0 °C to 35.0 °C; anger: 34.8 °C to 35.0 °C). With the music stimulus, the emotions showed different patterns in each phase (joy: 34.2 °C, 33.9 °C, 33.4 °C; anger: 33.8 °C, 33.4 °C, 33.8 °C). Whenever the subject is exposed to a stimulus, there is a thermal bodily response. All facial areas follow a common thermal pattern in response to the stimulus, with the exception of the nose. Thermography is suitable for emotional-skills stimulation practice, given that it is non-invasive, quantifiable, and easy to access.
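The phase-wise patterns above (e.g., under the music stimulus, joy falls across all three phases while anger falls and then recovers) can be encoded as simple trend labels. The sketch below uses the temperature values reported in the abstract; the function name is our own, not from the study:

```python
def phase_trend(temps):
    """Label temperature changes across consecutive protocol phases
    (acclimatization -> stimulus -> response) as a trend string."""
    signs = []
    for prev, cur in zip(temps, temps[1:]):
        signs.append("rise" if cur > prev else "fall" if cur < prev else "flat")
    return "->".join(signs)

# Forehead means per phase under the music stimulus, in degrees C (from the abstract).
joy = (34.2, 33.9, 33.4)
anger = (33.8, 33.4, 33.8)
print(phase_trend(joy))    # fall->fall
print(phase_trend(anger))  # fall->rise
```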
Affiliation(s)
- Pilar Marqués-Sánchez: SALBIS Research Group, Faculty of Health Sciences, Campus of Ponferrada, University of León, 24401 Ponferrada, Spain
- Cristina Liébana-Presa (corresponding author): SALBIS Research Group, Faculty of Health Sciences, Campus of Ponferrada, University of León, 24401 Ponferrada, Spain
- José Alberto Benítez-Andrades: SALBIS Research Group, Department of Electric, Systems and Automatics Engineering, University of León, 24071 León, Spain
- Lorena Álvarez-Barrio: Department of Nursing and Physiotherapy, Faculty of Health Sciences, Campus of Ponferrada, University of León, 24401 Ponferrada, Spain
- Pablo Rodríguez-Gonzálvez: Department of Mining, Surveying and Structure, Campus of Ponferrada, University of León, 24401 Ponferrada, Spain
16
Thermal Infrared Imaging-Based Affective Computing and Its Application to Facilitate Human Robot Interaction: A Review. Appl Sci (Basel) 2020. [DOI: 10.3390/app10082924]
Abstract
Over recent years, robots have been increasingly employed in many aspects of modern society. Among others, social robots have the potential to benefit education, healthcare, and tourism. To achieve this, robots should be able to engage humans, recognize users' emotions, and to some extent react and "behave" appropriately in a natural interaction. Most robotics applications primarily use visual information for emotion recognition, which is often based on facial expressions. However, the display of emotional states through facial expression is inherently a voluntarily controlled process typical of human–human interaction, and humans have not yet learned to use this channel when communicating with robotic technology. Hence, there is an urgent need to exploit emotion-information channels not directly controlled by humans, such as those ascribed to physiological modulations. Thermal infrared imaging-based affective computing has the potential to address this issue: it is a validated technology that allows the non-obtrusive monitoring of physiological parameters, from which it may be possible to infer affective states. This review outlines the advantages and current research challenges of thermal imaging-based affective computing for human–robot interaction.
17
Visual and Thermal Image Processing for Facial Specific Landmark Detection to Infer Emotions in a Child-Robot Interaction. Sensors 2019;19:2844. [PMID: 31248004] [PMCID: PMC6650968] [DOI: 10.3390/s19132844] [Received: 05/20/2019] [Revised: 06/21/2019] [Accepted: 06/22/2019]
Abstract
Child-Robot Interaction (CRI) is increasingly addressed in research and applications. This work proposes a system for emotion recognition in children that records facial images with both visual (RGB: red, green, and blue) and infrared thermal imaging (IRTI) cameras. The Viola-Jones algorithm is used on the color images to detect facial regions of interest (ROIs), which are transferred to the thermal camera plane by multiplying by a homography matrix obtained through the calibration of the camera system. As a novelty, we propose computing the error probability for each ROI located on the thermal images, using a reference frame manually marked by a trained expert, in order to choose the ROI best placed according to the expert criteria. This selected ROI is then used to relocate the other ROIs, increasing the concordance with the reference manual annotations. Afterwards, feature extraction, dimensionality reduction through Principal Component Analysis (PCA), and pattern classification by Linear Discriminant Analysis (LDA) are applied to infer emotions. The results show that our approach to ROI location tracks facial landmarks with significantly lower errors than the traditional Viola-Jones algorithm. These ROIs have been shown to be relevant for the recognition of five emotions (disgust, fear, happiness, sadness, and surprise), with our PCA- and LDA-based recognition system achieving mean accuracy (ACC) and Kappa values of 85.75% and 81.84%, respectively. As a second stage, the proposed recognition system was trained on a dataset of thermal images collected from 28 typically developing children to infer one of the five basic emotions during a child-robot interaction. The results show that our system can be integrated into a social robot to infer child emotions during child-robot interaction.
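The ROI-transfer step this abstract describes can be sketched minimally: given a calibration homography H (in practice estimated from RGB-to-thermal point correspondences, e.g. with OpenCV's cv2.findHomography), ROI corners in the RGB plane are mapped to the thermal plane via homogeneous coordinates. The matrix H and coordinates below are toy values, not from the paper:

```python
import numpy as np

def warp_points(H, pts):
    """Map (x, y) points through a 3x3 homography H (RGB plane -> thermal plane)."""
    pts = np.asarray(pts, dtype=float)
    homog = np.hstack([pts, np.ones((len(pts), 1))])   # lift to homogeneous coords
    mapped = homog @ H.T                               # apply the homography
    return mapped[:, :2] / mapped[:, 2:3]              # back to Cartesian

# Toy calibration: the thermal camera sees the scene scaled by 0.5 and shifted.
H = np.array([[0.5, 0.0, 10.0],
              [0.0, 0.5, 20.0],
              [0.0, 0.0, 1.0]])
rgb_roi_corners = [(100.0, 100.0), (200.0, 100.0),
                   (200.0, 180.0), (100.0, 180.0)]
print(warp_points(H, rgb_roi_corners))
```

With a projective H (nonzero bottom row off-diagonal entries) the final division is what distinguishes a homography from an affine map; for the toy H above it divides by 1.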