1. Nakamura F, Murakami M, Suzuki K, Fukuoka M, Masai K, Sugimoto M. Analyzing the Effect of Diverse Gaze and Head Direction on Facial Expression Recognition With Photo-Reflective Sensors Embedded in a Head-Mounted Display. IEEE Transactions on Visualization and Computer Graphics 2023; 29:4124-4139. [PMID: 35653450] [DOI: 10.1109/tvcg.2022.3179766]
Abstract
Embedded photo-reflective sensors are one technique for recognizing the facial expressions of Head-Mounted Display (HMD) users. In this paper, we investigate how gaze and face directions affect facial expression recognition with such embedded photo-reflective sensors. First, we collected a dataset of five facial expressions (Neutral, Happy, Angry, Sad, Surprised) performed while looking in diverse directions by moving 1) the eyes and 2) the head. Using this dataset, we analyzed the effect of gaze and face directions by constructing facial expression classifiers in five ways and evaluating the classification accuracy of each. The results revealed that a single classifier trained on the data for all gaze points achieved the highest classification performance. We then investigated which facial parts were affected by gaze and face direction: gaze directions affected the upper facial parts, while face directions affected the lower facial parts. In addition, after removing the bias of facial expression reproducibility, we investigated the pure effect of gaze and face directions under three conditions. For gaze direction, building a classifier for each direction significantly improved classification accuracy; for face direction, the differences between classifier conditions were slight. Our results imply that multiple classifiers corresponding to multiple gaze and face directions can improve facial expression recognition accuracy, but that collecting data on vertical gaze and head movement is a practical solution for improving recognition accuracy.
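The paper's central comparison, one classifier pooling all gaze directions versus one classifier per gaze direction, can be sketched on synthetic data. Everything below (sensor counts, class counts, the nearest-centroid classifier) is an illustrative stand-in, not the authors' actual sensors or models:

```python
import numpy as np

rng = np.random.default_rng(0)
N_SENSORS, N_EXPR, N_GAZE, N_PER = 16, 5, 9, 30

# Each expression has a mean sensor pattern; each gaze direction
# perturbs it with an additive offset (synthetic stand-in data).
expr_means = 3.0 * rng.standard_normal((N_EXPR, N_SENSORS))
gaze_shift = rng.standard_normal((N_GAZE, N_SENSORS))

def make_split():
    X, ye, yg = [], [], []
    for e in range(N_EXPR):
        for g in range(N_GAZE):
            X.append(expr_means[e] + gaze_shift[g]
                     + rng.standard_normal((N_PER, N_SENSORS)))
            ye += [e] * N_PER
            yg += [g] * N_PER
    return np.vstack(X), np.array(ye), np.array(yg)

Xtr, etr, gtr = make_split()   # training data
Xte, ete, gte = make_split()   # held-out test data

def fit_centroids(X, y):
    """Nearest-centroid 'classifier': one mean vector per expression."""
    return np.stack([X[y == e].mean(axis=0) for e in range(N_EXPR)])

def predict(cent, X):
    d = ((X[:, None, :] - cent[None, :, :]) ** 2).sum(-1)
    return d.argmin(axis=1)

# (a) single classifier trained on data pooled over all gaze directions
acc_pooled = (predict(fit_centroids(Xtr, etr), Xte) == ete).mean()

# (b) one classifier per gaze direction, routed by the known gaze label
correct = 0
for g in range(N_GAZE):
    cent = fit_centroids(Xtr[gtr == g], etr[gtr == g])
    m = gte == g
    correct += (predict(cent, Xte[m]) == ete[m]).sum()
acc_per_gaze = correct / len(ete)
print(f"pooled={acc_pooled:.2f}  per-gaze={acc_per_gaze:.2f}")
```

The per-gaze variant only helps when the gaze-dependent offset is large relative to the expression signal, which mirrors the paper's finding that the benefit depends on direction.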
2. Bravo BSF, de Melo Carvalho R, Penedo L, de Bastos JT, Calomeni Elias M, Cotofana S, Frank K, Moellhoff N, Freitag L, Alfertshofer M. Applied anatomy of the layers and soft tissues of the forehead during minimally-invasive aesthetic procedures. J Cosmet Dermatol 2022; 21:5864-5871. [PMID: 35634970] [DOI: 10.1111/jocd.15131]
Abstract
BACKGROUND: The increasing demand for minimally-invasive aesthetic procedures of the forehead concomitantly leads to higher numbers of adverse events. Adequate application of anatomical knowledge is required to increase the safety and efficacy of the different minimally-invasive aesthetic procedures in this anatomical region.
OBJECTIVE: To describe the layered anatomy of the forehead soft tissues with respect to their thicknesses and how they relate to different minimally-invasive aesthetic treatments.
METHODS: A total of n = 85 healthy study participants (69 female, 16 male) with a mean age of 40.84 ± 10.9 years and a mean body mass index (BMI) of 22.65 ± 2.6 kg/m² were investigated with ultrasound-based imaging to measure the thickness of the forehead soft tissues.
RESULTS: The mean overall soft tissue thickness of the forehead was 4.18 ± 0.7 mm across the study population. Increasing BMI correlated statistically significantly with increasing thickness of all measured forehead soft tissues, with the exception of the frontalis muscle. Males showed statistically significantly thicker forehead soft tissues than females, with the exception of the retrofrontalis fat and the frontalis muscle.
CONCLUSION: On the basis of these findings, basic treatment principles can be derived and improved for the injection of neuromodulators and hyaluronic acid as well as the application of polydioxanone (PDO) threads and micro-focused ultrasound. Precise knowledge and thorough understanding of the layers and soft tissues of the forehead are required to guarantee safe and effective procedures in this aesthetically important facial region.
Collapse
Affiliation(s)
- Lais Penedo
- Dermatology Department, Bravo Private Clinic, Rio de Janeiro, Brazil
- Sebastian Cotofana
- Department of Clinical Anatomy, Mayo Clinic College of Medicine and Science, Rochester, Minnesota, USA
- Konstantin Frank
- Division of Hand, Plastic and Aesthetic Surgery, University Hospital, LMU, Munich, Germany
- Nicholas Moellhoff
- Division of Hand, Plastic and Aesthetic Surgery, University Hospital, LMU, Munich, Germany
- Lysander Freitag
- Department of General Surgery, Community Hospital Havelhöhe, Berlin, Germany
- Michael Alfertshofer
- Division of Hand, Plastic and Aesthetic Surgery, University Hospital, LMU, Munich, Germany
3. Martin-Niedecken AL, Schwarz T, Schättin A. Comparing the Impact of Heart Rate-Based In-Game Adaptations in an Exergame-Based Functional High-Intensity Interval Training on Training Intensity and Experience in Healthy Young Adults. Front Psychol 2021; 12:572877. [PMID: 34234705] [PMCID: PMC8255375] [DOI: 10.3389/fpsyg.2021.572877]
Abstract
Physical inactivity remains one of the biggest societal challenges of the 21st century. The gaming industry and the fitness sector have responded to this alarming fact with game-based or gamified training scenarios, establishing the promising trend of exergaming. Exergames, games played with the (whole) body as physical input, have been promoted as potentially attractive and effective training tools, and researchers and designers are still exploring new approaches to exploit the full potential of this innovative and enjoyable training method. One way to boost the attractiveness and effectiveness of an exergame is to individualize it with game adaptations. A physiological parameter often used to match an exergame's physical challenge and intensity to the player's fitness is heart rate (HR). Researchers and designers therefore often rely on age-based formulas for maximum HR (HRmax) originating from performance diagnostics. Combined with the player's real-time HR measured during an exergame session, the pre-determined HRmax is used to adapt the game's challenge toward a pre-defined HR and physical intensity level (in-exergame adaptation). Although the validity and reliability of these age-based HRmax formulas have been demonstrated in heterogeneous target populations, their use is often criticized because HR is an individual parameter affected by various internal and external factors. So far, no study has investigated whether a formula-based pre-calculated HRmax elicits different training intensities, training experiences, and flow feelings in an exergame than a standardized, individually pre-assessed HRmax. We therefore compared both variants of in-exergame adaptation with the ExerCube, a functional high-intensity interval training exergame, in healthy young adults.
Comparing the two conditions, we found no significant differences in HR parameters, perceived physical and cognitive exertion, overall flow feelings, or physical activity enjoyment. Thus, the formula-based in-exergame adaptation approach was suitable for the study population, and the ExerCube provided equally reliable in-exergame adaptation and comparable play experiences. We discuss our findings in the context of related work on exergame adaptation approaches and draw out implications for future adaptive exergame design and research.
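The abstract refers to age-based HRmax formulas from performance diagnostics without naming one. Two widely used examples are the classic Fox formula (220 − age) and the Tanaka formula (208 − 0.7 × age); a target training HR can then be derived, for instance, with the Karvonen heart-rate-reserve method. A minimal sketch (the formula choice and the 85% intensity for HIIT are illustrative, not taken from the paper):

```python
def hr_max_fox(age):
    """Classic age-based estimate (Fox): HRmax = 220 - age."""
    return 220 - age

def hr_max_tanaka(age):
    """Tanaka et al. estimate: HRmax = 208 - 0.7 * age."""
    return 208 - 0.7 * age

def target_hr(hr_max, hr_rest, intensity):
    """Karvonen heart-rate-reserve method for a target training HR."""
    return hr_rest + intensity * (hr_max - hr_rest)

# Example: a 25-year-old with resting HR 60 aiming at 85% of reserve.
hrmax = hr_max_fox(25)             # 195 bpm
print(target_hr(hrmax, 60, 0.85))  # 174.75 bpm
```

An in-exergame adaptation loop would then raise or lower the game's challenge depending on whether the measured real-time HR sits below or above this target.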
Affiliation(s)
- Tiziana Schwarz
- Motor Control and Learning, Institute of Human Movement Sciences and Sport, Department of Health Sciences and Technology, ETH Zürich, Zurich, Switzerland
- Alexandra Schättin
- Motor Control and Learning, Institute of Human Movement Sciences and Sport, Department of Health Sciences and Technology, ETH Zürich, Zurich, Switzerland
4. On the Use of Movement-Based Interaction with Smart Textiles for Emotion Regulation. Sensors 2021; 21:990. [PMID: 33540608] [PMCID: PMC7867248] [DOI: 10.3390/s21030990]
Abstract
Research in psychology suggests that body movement can directly activate emotional experiences. Movement-based emotion regulation is the most readily available, but often underutilized, strategy for emotion regulation. This research investigates the emotional effects of movement-based interaction and its sensory feedback mechanisms. To this end, we developed a smart clothing prototype, E-motionWear, which detects four movements (elbow flexion/extension, shoulder flexion/extension, open and closed arms, neck flexion/extension) with fabric-based sensors and provides three movement feedback mechanisms (audio, visual, and vibrotactile). An experiment combining qualitative and quantitative approaches was conducted to collect participants' objective and subjective emotional responses. Results indicate no interaction effect between movement and feedback mechanism on the final emotional outcomes. Participants preferred vibrotactile and audio feedback over visual feedback when performing these four upper-body movements. Shoulder flexion/extension and open-closed arm movements were more effective at improving positive emotion than elbow flexion/extension movements. Participants found the E-motionWear prototype comfortable to wear and reported that it brought them new emotional experiences. From these results, a set of guidelines was derived that can help frame the design and use of smart clothing to support users' emotion regulation.
5. Bello H, Zhou B, Lukowicz P. Facial Muscle Activity Recognition with Reconfigurable Differential Stethoscope-Microphones. Sensors 2020; 20:4904. [PMID: 32872633] [PMCID: PMC7506891] [DOI: 10.3390/s20174904]
Abstract
Many human activities and states are related to the actions of the facial muscles: from the expression of emotions, stress, and non-verbal communication, through health-related actions such as coughing and sneezing, to nutrition and drinking. In this work, we describe in detail the design and evaluation of a wearable system for facial muscle activity monitoring based on a reconfigurable differential array of stethoscope-microphones. In our system, six stethoscopes are placed at locations that could easily be integrated into the frame of smart glasses. The paper describes the detailed hardware design and the selection and adaptation of appropriate signal processing and machine learning methods. For the evaluation, we asked eight participants to imitate a set of facial actions, such as expressions of happiness, anger, surprise, sadness, upset, and disgust, and gestures like kissing, winking, sticking the tongue out, and taking a pill. We evaluated a complete dataset of 2640 events with a 66%/33% train/test split. Although we encountered high variability in the volunteers' expressions, our approach achieves a recall of 55%, precision of 56%, and F1-score of 54% in the user-independent scenario (9% chance level). On a user-dependent basis, our worst result has an F1-score of 60% and our best an F1-score of 89%. Recall was ≥60% for classes such as happiness, anger, kissing, sticking the tongue out, and neutral (Null class).
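The recall/precision/F1 figures above are macro averages over the event classes. That scoring can be written in plain Python (the label names in the example are illustrative, not the paper's full class set):

```python
from collections import Counter

def prf1(y_true, y_pred):
    """Macro-averaged precision, recall, and F1 over the label set."""
    labels = sorted(set(y_true) | set(y_pred))
    tp, fp, fn = Counter(), Counter(), Counter()
    for t, p in zip(y_true, y_pred):
        if t == p:
            tp[t] += 1
        else:
            fp[p] += 1   # predicted class p, but it was wrong
            fn[t] += 1   # true class t was missed
    precs, recs, f1s = [], [], []
    for c in labels:
        prec = tp[c] / (tp[c] + fp[c]) if tp[c] + fp[c] else 0.0
        rec = tp[c] / (tp[c] + fn[c]) if tp[c] + fn[c] else 0.0
        f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
        precs.append(prec); recs.append(rec); f1s.append(f1)
    n = len(labels)
    return sum(precs) / n, sum(recs) / n, sum(f1s) / n

y_true = ["happy", "angry", "happy", "null", "kiss", "null"]
y_pred = ["happy", "happy", "happy", "null", "kiss", "angry"]
p, r, f = prf1(y_true, y_pred)
```

Macro averaging weights every class equally, which matters here because a Null class would otherwise dominate the score.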
Affiliation(s)
- Hymalai Bello
- German Research Center for Artificial Intelligence (DFKI), 67663 Kaiserslautern, Germany
- Bo Zhou
- German Research Center for Artificial Intelligence (DFKI), 67663 Kaiserslautern, Germany
- Paul Lukowicz
- German Research Center for Artificial Intelligence (DFKI), 67663 Kaiserslautern, Germany
- Department of Computer Science, University of Kaiserslautern, 67663 Kaiserslautern, Germany
6. Raheel A, Majid M, Alnowami M, Anwar SM. Physiological Sensors Based Emotion Recognition While Experiencing Tactile Enhanced Multimedia. Sensors 2020; 20:4037. [PMID: 32708056] [PMCID: PMC7411620] [DOI: 10.3390/s20144037]
Abstract
Emotion recognition has increased the potential of affective computing by providing instant feedback from users and thereby a better understanding of their behavior. Physiological sensors have been used to recognize human emotions in response to audio and video content, which engage one (auditory) or two (auditory and visual) human senses, respectively. In this study, human emotions were recognized from physiological signals recorded in response to tactile-enhanced multimedia content engaging three human senses (tactile, visual, and auditory). The aim was to give users an enhanced real-world sensation while engaging with multimedia content. To this end, four videos were selected and synchronized with an electric fan and a heater, based on timestamps within the scenes, to generate tactile-enhanced content with cold- and hot-air effects, respectively. Physiological signals, i.e., electroencephalography (EEG), photoplethysmography (PPG), and galvanic skin response (GSR), were recorded using commercially available sensors while participants experienced these tactile-enhanced videos. The acquired signals are pre-processed with a Savitzky-Golay smoothing filter. Frequency-domain features (rational asymmetry, differential asymmetry, and correlation) are extracted from EEG, time-domain features (variance, entropy, kurtosis, and skewness) from GSR, and heart rate and heart rate variability from PPG. A k-nearest neighbor classifier is applied to the extracted features to classify four emotions (happy, relaxed, angry, and sad). Our experimental results show that, among individual modalities, PPG-based features give the highest accuracy of 78.57%, compared with EEG- and GSR-based features. Fusing EEG, GSR, and PPG features further improves the classification accuracy to 79.76% for the four emotions when interacting with tactile-enhanced multimedia.
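The two core computational steps named above, Savitzky-Golay smoothing and k-nearest-neighbor classification, can be sketched in NumPy. In practice one would use scipy.signal.savgol_filter and a library classifier; this minimal reimplementation is only meant to show what each step does (the window/order values and demo signal are illustrative):

```python
import numpy as np

def savgol(signal, window, order):
    """Savitzky-Golay smoothing: a least-squares polynomial fit over a
    sliding window, evaluated at the window centre."""
    half = window // 2
    x = np.arange(-half, half + 1)
    # Design matrix of the local polynomial fit; row 0 of its
    # pseudo-inverse gives the weights for the fitted value at x = 0.
    A = np.vander(x, order + 1, increasing=True)
    coeffs = np.linalg.pinv(A)[0]
    padded = np.pad(signal, half, mode="edge")
    return np.convolve(padded, coeffs[::-1], mode="valid")

def knn_predict(X_train, y_train, X_test, k=3):
    """k-nearest-neighbour majority vote over Euclidean distance."""
    d = ((X_test[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=-1)
    nearest = np.argsort(d, axis=1)[:, :k]
    return np.array([np.bincount(y_train[idx]).argmax() for idx in nearest])

# Smoothing a noisy 1-D signal keeps its shape while damping the noise.
rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 200)
noisy = np.sin(2 * np.pi * 3 * t) + 0.3 * rng.standard_normal(200)
smooth = savgol(noisy, 11, 3)
```

Features such as variance, kurtosis, and skewness would then be computed on the smoothed windows and fed to `knn_predict`.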
Affiliation(s)
- Aasim Raheel
- Department of Computer Engineering, University of Engineering and Technology, Taxila 47050, Pakistan
- Muhammad Majid
- Department of Computer Engineering, University of Engineering and Technology, Taxila 47050, Pakistan
- Majdi Alnowami
- Department of Nuclear Engineering, King Abdulaziz University, Jeddah 21589, Saudi Arabia
- Syed Muhammad Anwar
- Department of Software Engineering, University of Engineering and Technology, Taxila 47050, Pakistan
7. Ismar E, Kurşun Bahadir S, Kalaoglu F, Koncar V. Futuristic Clothes: Electronic Textiles and Wearable Technologies. Global Challenges 2020; 4:1900092. [PMID: 32642074] [PMCID: PMC7330505] [DOI: 10.1002/gch2.201900092]
Abstract
This review summarizes the developments in wearable electronic textiles over the past decade and their importance. Wearable electronic textiles are an emerging interdisciplinary research area that requires new design approaches, bringing together specialists in electronics, information technology, microsystems, and textiles to innovate in the development of wearable electronic products. Wearable electronic textiles play a key role across diverse applications (clothing, communication, information, healthcare monitoring, military, sensors, magnetic shielding, etc.). This review describes applications of wearable electronic textiles, including an investigation of their fabrication techniques. It highlights the basic processes, possible applications, and main materials used to build wearable e-textiles, and combines the fundamentals of e-textiles for readers from different backgrounds. Moreover, the reliability, reusability, and efficiency of wearable electronic textiles are discussed, together with their opportunities and drawbacks.
Affiliation(s)
- Ezgi Ismar
- Nano Science & Nano Engineering, Istanbul Technical University, Istanbul 34467, Turkey
- Senem Kurşun Bahadir
- Department of Mechanical Engineering, Istanbul Technical University, Istanbul 34437, Turkey
- Fatma Kalaoglu
- Department of Textile Engineering, Istanbul Technical University, Istanbul 34437, Turkey
- Vladan Koncar
- GEMTEX, University of Lille, Cité Scientifique, Villeneuve d'Ascq F-59650, France
- École Nationale Supérieure des Arts et Industries Textiles / Génie et Matériaux Textiles laboratory (ENSAIT/GEMTEX), 2 Allée Louis et Victor Champier, Roubaix F-59100, France