1. Wögerbauer EM, von Castell C, Welsch R, Hecht H. Preferred Distance in Human-Drone Interaction. Vision (Basel) 2024; 8:59. [PMID: 39449392; PMCID: PMC11503297; DOI: 10.3390/vision8040059]
Abstract
In two augmented-reality experiments, we transferred the paradigm of interpersonal distance regulation to human-drone interaction. In the first experiment, we used a simple spherical drone model and explored how both hovering height and approach angle affect the preferred distance. Drone height above the ground had a strong effect. The preferred distance to the drone was larger than that typically found toward human actors, in particular when the drone trajectory was very high. In the second experiment, we sought to gain a deeper understanding of the factors that may influence this effect. In addition to the simple spherical drone model used in the first experiment, we also varied its appearance and attachment to the ground. Surprisingly, anthropomorphic features increased preferred distances. We therefore discuss the extent to which social aspects and subjectively perceived danger influence the preferred distance for interaction with drones, and thus need to be considered in the design of human-drone interaction.
Affiliation(s)
- Christoph von Castell
- Department of Psychology, Johannes Gutenberg-University Mainz, 55122 Mainz, Germany
- Robin Welsch
- Department of Computer Science, Aalto University, 02150 Espoo, Finland
- Heiko Hecht
- Department of Psychology, Johannes Gutenberg-University Mainz, 55122 Mainz, Germany
2. Lee MS, Lee GE, Lee SH, Lee JH. Emotional responses of Korean and Chinese women to Hangul phonemes to the gender of an artificial intelligence voice. Front Psychol 2024; 15:1357975. [PMID: 39135868; PMCID: PMC11317464; DOI: 10.3389/fpsyg.2024.1357975]
Abstract
Introduction This study aimed to explore the arousal and valence that people experience in response to Hangul phonemes, based on the gender of an AI speaker, through a comparison of Korean and Chinese cultures. Methods To achieve this, 42 Hangul phonemes, combining three Korean vowels with 14 Korean consonants, were used to explore cultural differences in arousal, valence, and the six foundational emotions based on the gender of an AI speaker. A total of 136 Korean and Chinese women were recruited and randomly assigned to one of two conditions based on voice gender (man or woman). Results and discussion This study revealed significant differences in arousal levels between Korean and Chinese women when exposed to male voices. Specifically, Chinese women exhibited clear differences in emotional perceptions of male and female voices in response to voiced consonants. These results confirm that arousal and valence may differ with articulation types and vowels due to cultural differences, and that voice gender can affect perceived emotions. This principle can serve as evidence for sound symbolism and has practical implications for voice gender and branding in AI applications.
Affiliation(s)
- Min-Sun Lee
- Department of Psychology, Chung-Ang University, Seoul, Republic of Korea
- Gi-Eun Lee
- Institute of Cultural Diversity Content, Chung-Ang University, Seoul, Republic of Korea
- San Ho Lee
- Department of European Language and Cultures, Chung-Ang University, Seoul, Republic of Korea
- Jang-Han Lee
- Department of Psychology, Chung-Ang University, Seoul, Republic of Korea
3. Goshvarpour A. Emerging Trends of Biomedical Signal Processing in Intelligent Emotion Recognition. Brain Sci 2024; 14:628. [PMID: 39061369; PMCID: PMC11274954; DOI: 10.3390/brainsci14070628]
Abstract
The field of biomedical signal processing has experienced significant advancements in recent years, particularly in the realm of emotion recognition [...].
Affiliation(s)
- Ateke Goshvarpour
- Department of Biomedical Engineering, Imam Reza International University, Mashhad 91735-553, Iran
4. Loizou M, Arnab S, Lameras P, Hartley T, Loizides F, Kumar P, Sumilo D. Designing, implementing and testing an intervention of affective intelligent agents in nursing virtual reality teaching simulations - a qualitative study. Front Digit Health 2024; 6:1307817. [PMID: 38698890; PMCID: PMC11063316; DOI: 10.3389/fdgth.2024.1307817]
Abstract
Emotions play an important role in human-computer interaction, but there is limited research on affective and emotional virtual agent design in the area of teaching simulations for healthcare provision. The purpose of this work is twofold: firstly, to describe the process for designing affective intelligent agents that are engaged in automated communications such as person-to-computer conversations, and secondly, to test a bespoke prototype digital intervention which implements such agents. The presented study tests two distinct virtual learning environments, one of which was enhanced with affective virtual patients, with nine third-year nursing students specialising in mental health during their professional practice stage. All (100%) of the participants reported that, when using the enhanced scenario, they experienced a more realistic representation of carer/patient interaction; better recognition of the patients' feelings; recognition and assessment of emotions; a better realisation of how feelings can affect patients' emotional state; and how they could better empathise with the patients.
Affiliation(s)
- Michael Loizou
- Centre for Health Technology, University of Plymouth, Plymouth, United Kingdom
- Sylvester Arnab
- Centre for Postdigital Cultures, Coventry University, Coventry, United Kingdom
- Petros Lameras
- Centre for Postdigital Cultures, Coventry University, Coventry, United Kingdom
- Thomas Hartley
- Department of Computer Science, University of Wolverhampton, Wolverhampton, United Kingdom
- Fernando Loizides
- School of Computer Science and Informatics, Cardiff University, Cardiff, United Kingdom
- Praveen Kumar
- School of Health and Social Wellbeing, University of the West of England, Bristol, United Kingdom
- Dana Sumilo
- Warwick Medical School, University of Warwick, Warwick, United Kingdom
5. Heinisch JS, Kirchhoff J, Busch P, Wendt J, von Stryk O, David K. Physiological data for affective computing in HRI with anthropomorphic service robots: the AFFECT-HRI data set. Sci Data 2024; 11:333. [PMID: 38575624; PMCID: PMC10995145; DOI: 10.1038/s41597-024-03128-z]
Abstract
In human-human and human-robot interaction, the counterpart influences the human's affective state. Contrary to humans, robots inherently cannot respond empathically, meaning that non-beneficial affective reactions cannot be mitigated. Thus, to create responsible and empathetic human-robot interaction (HRI) involving anthropomorphic service robots, the effect of robot behavior on human affect in HRI must be understood. To contribute to this understanding, we provide the new comprehensive data set AFFECT-HRI, including, for the first time, physiological data labeled with human affect (i.e., emotions and mood) gathered from a conducted HRI study. Within the study, 146 participants interacted with an anthropomorphic service robot in a realistic and complex retail scenario. The participants' questionnaire ratings regarding affect, demographics, and socio-technical aspects are provided in the data set. Five different conditions (i.e., neutral, transparency, liability, moral, and immoral) were considered during the study, eliciting different affective reactions and allowing interdisciplinary investigations (e.g., in computer science, law, and psychology). Each condition includes three scenes: a consultation regarding products, a request for sensitive personal information, and a handover.
Affiliation(s)
- Judith S Heinisch
- University of Kassel, Chair for Communication Technology, Department of Electrical Engineering and Computer Science, Wilhelmshöher Allee 73, 34121 Kassel, Germany
- Jérôme Kirchhoff
- Technical University of Darmstadt, Chair for Simulation, Systems Optimization and Robotics, Department of Computer Science, Hochschulstrasse 10, 64289 Darmstadt, Germany
- Philip Busch
- Technical University of Darmstadt, Chair for Civil and Company Law, Department of Law and Economics, Hochschulstrasse 1, 64289 Darmstadt, Germany
- Janine Wendt
- Technical University of Darmstadt, Chair for Civil and Company Law, Department of Law and Economics, Hochschulstrasse 1, 64289 Darmstadt, Germany
- Oskar von Stryk
- Technical University of Darmstadt, Chair for Simulation, Systems Optimization and Robotics, Department of Computer Science, Hochschulstrasse 10, 64289 Darmstadt, Germany
- Klaus David
- University of Kassel, Chair for Communication Technology, Department of Electrical Engineering and Computer Science, Wilhelmshöher Allee 73, 34121 Kassel, Germany
6. Jacobs KA. Digital loneliness - changes of social recognition through AI companions. Front Digit Health 2024; 6:1281037. [PMID: 38504806; PMCID: PMC10949182; DOI: 10.3389/fdgth.2024.1281037]
Abstract
Inherent to the experience of loneliness is a significant change of meaningful relatedness that (usually negatively) affects a person's relationship to self and others. This paper goes beyond a purely subjective-phenomenological description of individual suffering by emphasizing loneliness as a symptomatic expression of distortions of social recognition relations. Where there is loneliness, a recognition relation has changed. Most societies face an increase in loneliness among all groups of their population, and this sheds light on the reproduction conditions of social integration and inclusion. These functions are essential lifeworldly components of social cohesion and wellbeing. This study asks whether "social" AI promotes these societal goals of socially integrating lonely people. The increasing tendency to regard AI Companions (AICs) as reproducers of adequate recognition is critically discussed in this review. My skepticism requires further justification, especially as a large portion of sociopolitical prevention efforts aims to fight the increase in loneliness primarily with digital strategies. I will argue that AICs reproduce rather than sustainably reduce the pathodynamics of loneliness: loneliness simply gets "digitized."
Affiliation(s)
- Kerrin Artemis Jacobs
- Department of Philosophy, Ethics, and Religious Studies, Faculty of Humanities and Human Sciences (Graduate School), University of Hokkaido, Sapporo, Japan
- Center for Human Nature, Artificial Intelligence, and Neuroscience (CHAIN), University of Hokkaido, Sapporo, Japan
7. Fiorini L, D'Onofrio G, Sorrentino A, Cornacchia Loizzo FG, Russo S, Ciccone F, Giuliani F, Sancarlo D, Cavallo F. The Role of Coherent Robot Behavior and Embodiment in Emotion Perception and Recognition During Human-Robot Interaction: Experimental Study. JMIR Hum Factors 2024; 11:e45494. [PMID: 38277201; PMCID: PMC10858416; DOI: 10.2196/45494]
Abstract
BACKGROUND Social robots are becoming increasingly important as companions in our daily lives. Consequently, humans expect to interact with them using the same mental models applied to human-human interactions, including the use of cospeech gestures. Research efforts have been devoted to understanding users' needs and developing robot behavioral models that can perceive the user state and properly plan a reaction. Despite the efforts made, some challenges regarding the effect of robot embodiment and behavior on the perception of emotions remain open. OBJECTIVE The aim of this study is twofold. First, it aims to assess the role of the robot's cospeech gestures and embodiment in the user's perceived emotions in terms of valence (stimulus pleasantness), arousal (intensity of evoked emotion), and dominance (degree of control exerted by the stimulus). Second, it aims to evaluate the robot's accuracy in identifying positive, negative, and neutral emotions displayed by interacting humans using 3 supervised machine learning algorithms: support vector machine, random forest, and K-nearest neighbor. METHODS The Pepper robot was used to elicit the 3 emotions in humans using a set of 60 images retrieved from a standardized database. In particular, 2 experimental conditions for emotion elicitation were performed with the Pepper robot: with static behavior or with coherent (COH) cospeech behavior. Furthermore, to evaluate the role of robot embodiment, a third elicitation was performed by asking the participant to interact with a PC, where a graphical interface showed the same images. Each participant was requested to undergo only 1 of the 3 experimental conditions. RESULTS A total of 60 participants were recruited for this study, 20 for each experimental condition, for a total of 3600 interactions. The results showed significant differences (P<.05) in valence, arousal, and dominance when stimulated with the Pepper robot behaving COH with respect to the PC condition, thus underlining the importance of the robot's nonverbal communication and embodiment. A higher valence score was obtained for the elicitation of the robot (COH and robot with static behavior) with respect to the PC. For emotion recognition, the K-nearest neighbor classifiers achieved the best accuracy results. In particular, the COH modality achieved the highest level of accuracy (0.97) when compared with the static behavior and PC elicitations (0.88 and 0.94, respectively). CONCLUSIONS The results suggest that the use of multimodal communication channels, such as cospeech and visual channels, as in the COH modality, may improve the recognition accuracy of the user's emotional state and can reinforce the perceived emotion. Future studies should investigate the effect of age, culture, and cognitive profile on emotion perception and recognition, going beyond the limitations of this work.
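As a rough illustration of the classifier comparison summarised above (not the authors' code), the following sketch trains the three named models on synthetic stand-in features and compares cross-validated accuracy:

```python
# Minimal sketch: comparing the three supervised classifiers named in the
# study on a 3-class emotion-recognition task. Feature vectors and labels
# are synthetic stand-ins for the study's interaction data.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 12))       # 600 interactions, 12 features each
y = rng.integers(0, 3, size=600)     # 0=negative, 1=neutral, 2=positive

for name, clf in [("SVM", SVC()),
                  ("Random Forest", RandomForestClassifier()),
                  ("KNN", KNeighborsClassifier(n_neighbors=5))]:
    acc = cross_val_score(clf, X, y, cv=5, scoring="accuracy").mean()
    print(f"{name}: mean CV accuracy = {acc:.2f}")
```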
Affiliation(s)
- Laura Fiorini
- Department of Industrial Engineering, University of Florence, Firenze, Italy
- The BioRobotics Institute, Scuola Superiore Sant'Anna, Pontedera (Pisa), Italy
- Grazia D'Onofrio
- Clinical Psychology Service, Health Department, Fondazione IRCCS Casa Sollievo della Sofferenza, San Giovanni Rotondo (Foggia), Italy
- Sergio Russo
- Innovation & Research Unit, Fondazione IRCCS Casa Sollievo della Sofferenza, San Giovanni Rotondo (Foggia), Italy
- Filomena Ciccone
- Clinical Psychology Service, Health Department, Fondazione IRCCS Casa Sollievo della Sofferenza, San Giovanni Rotondo (Foggia), Italy
- Francesco Giuliani
- Innovation & Research Unit, Fondazione IRCCS Casa Sollievo della Sofferenza, San Giovanni Rotondo (Foggia), Italy
- Daniele Sancarlo
- Complex Unit of Geriatrics, Department of Medical Sciences, Fondazione IRCCS Casa Sollievo della Sofferenza, San Giovanni Rotondo (Foggia), Italy
- Filippo Cavallo
- Department of Industrial Engineering, University of Florence, Firenze, Italy
- The BioRobotics Institute, Scuola Superiore Sant'Anna, Pontedera (Pisa), Italy
8. Li N, Ross R. Invoking and identifying task-oriented interlocutor confusion in human-robot interaction. Front Robot AI 2023; 10:1244381. [PMID: 38054199; PMCID: PMC10694506; DOI: 10.3389/frobt.2023.1244381]
Abstract
Successful conversational interaction with a social robot requires not only an assessment of a user's contribution to an interaction, but also awareness of their emotional and attitudinal states as the interaction unfolds. To this end, our research aims to systematically trigger and then interpret human behaviors to track different states of potential user confusion in interaction, so that systems can be primed to adjust their policies in light of users entering confusion states. In this paper, we present a detailed human-robot interaction study to prompt, investigate, and eventually detect confusion states in users. The study itself employs a Wizard-of-Oz (WoZ) style design with a Pepper robot to prompt confusion states for task-oriented dialogues in a well-defined manner. The data collected from 81 participants includes audio and visual data, from both the robot's perspective and the environment, as well as participant survey data. From these data, we evaluated the correlations of induced confusion conditions with multimodal data, including eye gaze estimation, head pose estimation, facial emotion detection, silence duration time, and user speech analysis, including emotion and pitch analysis. Analysis shows significant differences in participants' behaviors in states of confusion based on these signals, as well as a strong correlation between confusion conditions and participants' own self-reported confusion scores. The paper establishes strong correlations between confusion levels and these observable features, and lays the ground for a more complete social and affect-oriented strategy for task-oriented human-robot interaction. The contributions of this paper include the methodology applied, the dataset, and our systematic analysis.
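A minimal sketch of the kind of feature-to-self-report correlation analysis the abstract describes, using hypothetical per-participant values in place of the study's multimodal data:

```python
# Illustrative sketch (assumed data layout, not the released dataset): testing
# whether an observable feature such as silence duration correlates with
# self-reported confusion scores.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
confusion_score = rng.uniform(1, 7, size=81)   # per-participant self-report
silence_duration = 0.8 * confusion_score + rng.normal(scale=0.5, size=81)

r, p = pearsonr(silence_duration, confusion_score)
print(f"Pearson r = {r:.2f}, p = {p:.3g}")     # small p => feature tracks confusion
```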
Affiliation(s)
- Na Li
- School of Computer Science, Technological University Dublin, Dublin, Ireland
9. Staffa M, D'Errico L, Sansalone S, Alimardani M. Classifying human emotions in HRI: applying global optimization model to EEG brain signals. Front Neurorobot 2023; 17:1191127. [PMID: 37881515; PMCID: PMC10595007; DOI: 10.3389/fnbot.2023.1191127]
Abstract
Significant efforts have been made in the past decade to humanize both the form and function of social robots to increase their acceptance among humans. To this end, social robots have recently been combined with brain-computer interface (BCI) systems in an attempt to give them an understanding of human mental states, particularly emotions. However, emotion recognition using BCIs poses several challenges, such as the subjectivity of emotions, contextual dependency, and a lack of reliable neuro-metrics for real-time processing of emotions. Furthermore, the use of BCI systems introduces its own set of limitations, such as the bias-variance trade-off, dimensionality, and noise in the input data space. In this study, we sought to address some of these challenges by detecting human emotional states from EEG brain activity during human-robot interaction (HRI). EEG signals were collected from 10 participants who interacted with a Pepper robot that demonstrated either a positive or a negative personality. Using emotion valence and arousal measures derived from frontal brain asymmetry (FBA), several machine learning models were trained to classify humans' mental states in response to the robot's personality. To improve classification accuracy, all proposed classifiers were subjected to a Global Optimization Model (GOM) based on feature selection and hyperparameter optimization techniques. The results showed that it is possible to classify a user's emotional responses to the robot's behavior from the EEG signals with an accuracy of up to 92%. The outcome of the current study contributes to the first level of the Theory of Mind (ToM) in Human-Robot Interaction, enabling robots to comprehend users' emotional responses and attribute mental states to them. Our work advances the field of social and assistive robotics by paving the way for the development of more empathetic and responsive HRI in the future.
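The valence measure and the optimization step can be sketched as follows; the F3/F4 channel pair, the inverse alpha-activity convention, and the use of scikit-learn's grid search as a stand-in for the paper's Global Optimization Model are assumptions for illustration:

```python
# Sketch of a frontal-brain-asymmetry (FBA) valence feature plus a
# feature-selection + hyperparameter search pipeline. Synthetic features
# stand in for the study's EEG data.
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV

def fba_valence(alpha_power_f3: float, alpha_power_f4: float) -> float:
    """log(right) - log(left) frontal alpha power; higher => more positive
    valence under the common inverse alpha-power/activity convention."""
    return np.log(alpha_power_f4) - np.log(alpha_power_f3)

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 20))       # EEG-derived features (synthetic stand-in)
y = rng.integers(0, 2, size=200)     # positive vs. negative response

pipe = Pipeline([("select", SelectKBest(f_classif)), ("svm", SVC())])
grid = GridSearchCV(pipe, {"select__k": [5, 10, 20], "svm__C": [0.1, 1, 10]}, cv=5)
grid.fit(X, y)
print(grid.best_params_, f"accuracy={grid.best_score_:.2f}")
```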
Affiliation(s)
- Mariacarla Staffa
- Department of Science and Technology, University of Naples Parthenope, Naples, Italy
- Lorenzo D'Errico
- Department of Electrical Engineering and Information Technologies, University of Naples Federico II, Naples, Italy
- Simone Sansalone
- Department of Physics, University of Naples Federico II, Naples, Italy
- Maryam Alimardani
- Department of Cognitive Science and Artificial Intelligence, Tilburg University, Tilburg, Netherlands
10. Abdollahi H, Mahoor MH, Zandie R, Siewierski J, Qualls SH. Artificial Emotional Intelligence in Socially Assistive Robots for Older Adults: A Pilot Study. IEEE Trans Affect Comput 2023; 14:2020-2032. [PMID: 37840968; PMCID: PMC10569155; DOI: 10.1109/taffc.2022.3143803]
Abstract
This paper presents our recent research on integrating artificial emotional intelligence in a social robot (Ryan) and studies the robot's effectiveness in engaging older adults. Ryan is a socially assistive robot designed to provide companionship through conversation for older adults with depression and dementia. We used two versions of Ryan for our study, empathic and non-empathic. The empathic Ryan utilizes a multimodal emotion recognition algorithm and a multimodal emotion expression system. Using different input modalities for emotion, i.e., facial expression and speech sentiment, the empathic Ryan detects the user's emotional state and utilizes an affective dialogue manager to generate a response. The non-empathic Ryan, on the other hand, lacks facial expression and uses scripted dialogues that do not factor in the user's emotional state. We studied these two versions of Ryan with 10 older adults living in a senior care facility. The statistically significant improvement in the users' reported face-scale mood measurement indicates an overall positive effect from the interaction with both the empathic and non-empathic versions of Ryan. However, the number-of-spoken-words measure and the exit survey analysis suggest that the users perceive the empathic Ryan as more engaging and likable.
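A hedged sketch of the multimodal late-fusion idea the abstract implies; the weights, label set, and coupling to the dialogue manager are illustrative assumptions, not Ryan's actual design:

```python
# Combining per-modality emotion posteriors (facial expression and speech
# sentiment) into a single affective-state estimate by weighted averaging.
import numpy as np

LABELS = ["negative", "neutral", "positive"]

def fuse(face_probs: np.ndarray, speech_probs: np.ndarray, w_face: float = 0.6) -> str:
    """Weighted average of modality posteriors, then argmax."""
    fused = w_face * face_probs + (1.0 - w_face) * speech_probs
    return LABELS[int(np.argmax(fused))]

face = np.array([0.1, 0.3, 0.6])     # e.g. from a facial-expression classifier
speech = np.array([0.2, 0.5, 0.3])   # e.g. from a speech-sentiment classifier
print(fuse(face, speech))            # -> "positive"
```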
Affiliation(s)
- Hojjat Abdollahi
- Department of Electrical and Computer Engineering, University of Denver
- DreamFace Technologies, LLC
- Mohammad H Mahoor
- Department of Electrical and Computer Engineering, University of Denver
- DreamFace Technologies, LLC
- Rohola Zandie
- Department of Electrical and Computer Engineering, University of Denver
- DreamFace Technologies, LLC
- Sara H Qualls
- Department of Psychology and Gerontology Center, University of Colorado, Colorado Springs
11. Lambiase PD, Rossi A, Rossi S. A Two-Tier GAN Architecture for Conditioned Expressions Synthesis on Categorical Emotions. Int J Soc Robot 2023. [DOI: 10.1007/s12369-023-00973-7]
Abstract
Emotions are an effective communication mode during human–human and human–robot interactions. However, while humans can easily understand other people's emotions and are able to show emotions with natural facial expressions, robot-simulated emotions still represent an open challenge, also due to a lack of naturalness and variety of possible expressions. In this direction, we present a two-tier Generative Adversarial Network (GAN) architecture that generates facial expressions starting from categorical emotions (e.g., joy, sadness, etc.) to obtain a variety of synthesised expressions for each emotion. The proposed approach combines the key features of Conditional Generative Adversarial Networks (CGAN) and GANimation, overcoming their limits by allowing fine modelling of facial expressions and generating a wide range of expressions for each class (i.e., discrete emotion). The architecture is composed of two modules: one for generating a synthetic Action Unit (AU, i.e., a coding mechanism representing facial muscles and their activation) vector conditioned on a given emotion, and one for applying an AU vector to a given image. The overall model is capable of modifying an image of a human face by modelling the facial expression to show a specific discrete emotion. Qualitative and quantitative measurements have been performed to evaluate the ability of the network to generate a variety of expressions that are consistent with the conditioned emotion. Moreover, we also collected people's responses about the quality and the legibility of the produced expressions by showing them applied to images and a social robot.
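A minimal sketch of the first tier, assuming a CGAN-style generator conditioned by concatenation; the layer sizes and the 17-AU output are illustrative, not the paper's architecture:

```python
# First-tier generator: noise + one-hot emotion -> synthetic AU vector.
import torch
import torch.nn as nn

class ConditionalAUGenerator(nn.Module):
    def __init__(self, noise_dim: int = 32, n_emotions: int = 6, n_aus: int = 17):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(noise_dim + n_emotions, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, n_aus), nn.Sigmoid(),   # AU activations in [0, 1]
        )

    def forward(self, z: torch.Tensor, emotion: torch.Tensor) -> torch.Tensor:
        # Conditioning by concatenation, as in a classic CGAN generator.
        return self.net(torch.cat([z, emotion], dim=1))

gen = ConditionalAUGenerator()
z = torch.randn(4, 32)                                        # 4 noise samples
joy = nn.functional.one_hot(torch.tensor([0, 0, 0, 0]), 6).float()
aus = gen(z, joy)                                             # 4 AU vectors for one emotion
print(aus.shape)                                              # torch.Size([4, 17])
```

Sampling several noise vectors for one emotion yields distinct AU vectors, which is the per-class variety the paper targets; a second, GANimation-style tier would then render each AU vector onto a face image.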
12. Systematic Review of Affective Computing Techniques for Infant Robot Interaction. Int J Soc Robot 2023. [DOI: 10.1007/s12369-023-00985-3]
Abstract
Research studies on social robotics and human-robot interaction have gained insights into factors that influence people's perceptions of, and behaviors towards, robots. However, adults' perceptions of robots may differ significantly from those of infants. Consequently, extending this knowledge to infants' attitudes toward robots is a growing field of research. Indeed, infant-robot interaction (IRI) is emerging as a critical and necessary area of research as robots are increasingly used in social environments, such as caring for infants with all types of disabilities, companionship, and education. Although studies have been conducted on the ability of robots to positively engage infants, little is known about the infant's affective state when interacting with a robot. In this systematic review, technologies for infant affective state recognition relevant to IRI applications are presented and surveyed. Indeed, adapting techniques currently employed for infant emotion recognition to the field of IRI proves to be a complex task, since it requires a timely response without interfering with the infant's behavior. These aspects have a crucial impact on the selection of the emotion recognition techniques and the related metrics to be used for this purpose. Therefore, this review is intended to shed light on the advantages and the current research challenges of infant affective state recognition approaches in the IRI field, to elucidate a roadmap for their use in forthcoming studies, and to potentially support the future development of emotion-aware robots.
13. Long-Term Exercise Assistance: Group and One-on-One Interactions between a Social Robot and Seniors. Robotics 2023. [DOI: 10.3390/robotics12010009]
Abstract
For older adults, regular exercise can provide both physical and mental benefits, increase their independence, and reduce the risks of diseases associated with aging. However, only a small portion of older adults regularly engage in physical activity. Therefore, it is important to promote exercise among older adults to help maintain overall health. In this paper, we present the first exploratory long-term human–robot interaction (HRI) study, conducted at a local long-term care facility, to investigate the benefits of one-on-one and group exercise interactions between an autonomous socially assistive robot and older adults. To provide targeted facilitation, our robot utilizes a unique emotion model that can adapt its assistive behaviors to users' affect and track their progress towards exercise goals through repeated sessions using the Goal Attainment Scale (GAS), while also monitoring heart rate to prevent overexertion. Results of the study show that users had positive valence and high engagement towards the robot and were able to maintain their exercise performance throughout the study. Questionnaire results showed high robot acceptance for both types of interactions. However, users in the one-on-one sessions perceived the robot as more sociable and intelligent, and had a more positive perception of the robot's appearance and movements.
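The Goal Attainment Scale mentioned above can be computed with the standard Kiresuk-Sherman T-score; the goal weights and scores below are illustrative, not the study's data:

```python
# GAS T-score: 50 means goals met exactly as expected, >50 above expectation.
import math

def gas_t_score(attainment: list[int], weights: list[float], rho: float = 0.3) -> float:
    """attainment: per-goal scores in {-2,-1,0,+1,+2}; weights: goal importance.
    rho is the assumed inter-goal correlation (0.3 is the conventional value)."""
    num = 10.0 * sum(w * x for w, x in zip(weights, attainment))
    den = math.sqrt((1 - rho) * sum(w * w for w in weights)
                    + rho * sum(weights) ** 2)
    return 50.0 + num / den

# Two exercise goals, the first weighted more heavily; one met as expected (0),
# one exceeded (+1):
print(round(gas_t_score([0, 1], [2.0, 1.0]), 1))   # -> 54.0, above-expected progress
```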
14. Sun YC, Effati M, Naguib HE, Nejat G. SoftSAR: The New Softer Side of Socially Assistive Robots - Soft Robotics with Social Human-Robot Interaction Skills. Sensors (Basel) 2022; 23:432. [PMID: 36617030; PMCID: PMC9824785; DOI: 10.3390/s23010432]
Abstract
When we think of "soft" in terms of socially assistive robots (SARs), it is mainly in reference to the soft outer shells of these robots, ranging from robotic teddy bears to furry robot pets. However, soft robotics is a promising field that has not yet been leveraged by SAR design. Soft robotics is the incorporation of smart materials to achieve biomimetic motions, active deformations, and responsive sensing. By utilizing these distinctive characteristics, a new type of SAR can be developed that has the potential to be safer to interact with, more flexible, and able to use novel interaction modes (colors/shapes) to engage in a heightened human-robot interaction. In this perspective article, we coin the term SoftSAR for this new collaborative research area. We provide extensive discussions on just how soft robotics can be utilized to positively impact SARs, from their actuation mechanisms to their sensory designs, and how valuable it will be in informing future SAR design and applications. With extensive discussions on the fundamental mechanisms of soft robotic technologies, we outline a number of key SAR research areas that can benefit from unique soft robotic mechanisms, resulting in the creation of the new field of SoftSAR.
Affiliation(s)
- Yu-Chen Sun
- Autonomous Systems and Biomechatronics Laboratory (ASBLab), Department of Mechanical & Industrial Engineering, University of Toronto, Toronto, ON M5S 3G8, Canada
- Toronto Smart Materials and Structures (TSMART), Department of Mechanical & Industrial Engineering, University of Toronto, Toronto, ON M5S 3G8, Canada
- Meysam Effati
- Autonomous Systems and Biomechatronics Laboratory (ASBLab), Department of Mechanical & Industrial Engineering, University of Toronto, Toronto, ON M5S 3G8, Canada
- Hani E. Naguib
- Toronto Smart Materials and Structures (TSMART), Department of Mechanical & Industrial Engineering, University of Toronto, Toronto, ON M5S 3G8, Canada
- Institute of Biomedical Engineering, University of Toronto, Toronto, ON M5S 3G8, Canada
- Toronto Institute of Advanced Manufacturing (TIAM), University of Toronto, Toronto, ON M5S 3G8, Canada
- Toronto Rehabilitation Institute, Toronto, ON M5G 2A2, Canada
- Goldie Nejat
- Autonomous Systems and Biomechatronics Laboratory (ASBLab), Department of Mechanical & Industrial Engineering, University of Toronto, Toronto, ON M5S 3G8, Canada
- Toronto Institute of Advanced Manufacturing (TIAM), University of Toronto, Toronto, ON M5S 3G8, Canada
- Toronto Rehabilitation Institute, Toronto, ON M5G 2A2, Canada
- Rotman Research Institute, Baycrest Health Sciences, North York, ON M6A 2E1, Canada
15. Xue J, Huang Y, Li X, Li J, Zhang P, Kang Z. Emotional Influence of Pupillary Changes of Robots with Different Human-Likeness Levels on Human. Int J Soc Robot 2022. [DOI: 10.1007/s12369-022-00903-z]
16. Mukherjee D, Gupta K, Najjaran H. A Critical Analysis of Industrial Human-Robot Communication and Its Quest for Naturalness Through the Lens of Complexity Theory. Front Robot AI 2022; 9:870477. [PMID: 35899077; PMCID: PMC9309351; DOI: 10.3389/frobt.2022.870477]
Abstract
Human-robot communication is one of the actively researched fields aiming to enable efficient and seamless collaboration between a human and an intelligent industrial robotic system. The field finds its roots in human communication, with the aim of achieving the "naturalness" inherent in the latter. Industrial human-robot communication currently pursues communication with simplistic commands and gestures, which is not representative of an uncontrolled real-world industrial environment. In addition, naturalness in communication is a consequence of its dynamism, typically ignored as a design criterion in industrial human-robot communication. Complexity Theory-based natural communication models allow for a more accurate representation of human communication which, when adapted, could also benefit the field of human-robot communication. This paper presents a perspective by reviewing the state of human-robot communication in industrial settings and then offering a critical analysis of it through the lens of Complexity Theory. Furthermore, the work identifies research gaps in this field, the fulfillment of which would propel it towards a truly natural form of communication. Finally, the work briefly discusses a general framework that leverages the experiential learning of data-based techniques and the naturalness of human knowledge.
Affiliation(s)
- Debasmita Mukherjee
- Advanced Control and Intelligent Systems Laboratory, School of Engineering, The University of British Columbia, Kelowna, BC, Canada
- Kashish Gupta
- Advanced Control and Intelligent Systems Laboratory, Faculty of Engineering, University of Victoria, Victoria, BC, Canada
- Homayoun Najjaran
- Advanced Control and Intelligent Systems Laboratory, Faculty of Engineering, University of Victoria, Victoria, BC, Canada
17. Avram O, Baraldo S, Valente A. Generalized Behavior Framework for Mobile Robots Teaming With Humans in Harsh Environments. Front Robot AI 2022; 9:898366. [PMID: 35845254; PMCID: PMC9277353; DOI: 10.3389/frobt.2022.898366]
Abstract
Industrial contexts typically characterized by highly unstructured environments, where task sequences are difficult to hard-code and unforeseen events occur daily (e.g., oil and gas, energy generation, aeronautics), cannot completely rely upon automation to substitute for human dexterity and judgment. Robots operating in these conditions share the requirement of being able to deploy appropriate behaviours in highly dynamic and unpredictable environments, while aiming to achieve more natural human-robot interaction and broad acceptability in providing useful and efficient services. The goal of this paper is to introduce a deliberative framework able to acquire, reuse and instantiate a collection of behaviours that extend the autonomy periods of mobile robotic platforms, with a focus on maintenance, repair and overhaul applications. Behavior trees are employed to design the robotic system's high-level deliberative intelligence, which integrates: social behaviors, aiming to capture the human's emotional state and intention; the ability to either perform or support various process tasks; and seamless planning and execution of human-robot shared work plans. In particular, the modularity, reactiveness and deliberation capacity that characterize the behaviour tree formalism are leveraged to interpret the human's health and cognitive load in order to support her/him, and to complete a shared mission by collaboration or complete take-over. By enabling mobile robotic platforms to take over risky jobs that the human cannot, should not, or does not want to perform, the proposed framework has high potential to significantly improve safety, productivity and efficiency in harsh working environments.
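A toy sketch of the behavior-tree formalism the framework builds on, simplified to SUCCESS/FAILURE (real implementations also use a RUNNING status); all node and leaf names are hypothetical:

```python
# Sequence succeeds only if every child succeeds; Fallback (selector) succeeds
# on the first child that succeeds.
from typing import Callable, List

SUCCESS, FAILURE = "SUCCESS", "FAILURE"

def sequence(children: List[Callable[[], str]]) -> Callable[[], str]:
    def tick() -> str:
        for child in children:
            if child() == FAILURE:
                return FAILURE
        return SUCCESS
    return tick

def fallback(children: List[Callable[[], str]]) -> Callable[[], str]:
    def tick() -> str:
        for child in children:
            if child() == SUCCESS:
                return SUCCESS
        return FAILURE
    return tick

def human_ok() -> str:            # condition leaf (stub for perception)
    return FAILURE                # e.g. high cognitive load detected

def support_human() -> str:       # action leaf
    print("supporting human")
    return SUCCESS

def do_task() -> str:             # action leaf
    print("executing shared work plan")
    return SUCCESS

# "Make sure the human is fine (supporting if needed), then do the task."
root = sequence([fallback([human_ok, support_human]), do_task])
print(root())
```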
18. Quiroz M, Patiño R, Diaz-Amado J, Cardinale Y. Group Emotion Detection Based on Social Robot Perception. Sensors (Basel) 2022; 22:3749. [PMID: 35632160; PMCID: PMC9145339; DOI: 10.3390/s22103749]
Abstract
Social robotics is an emerging area that is becoming present in social spaces through the introduction of autonomous social robots. Social robots offer services, perform tasks, and interact with people in such social environments, demanding more efficient and complex Human-Robot Interaction (HRI) designs. A strategy to improve HRI is to provide robots with the capacity to detect the emotions of the people around them, in order to plan a trajectory, modify their behaviour, and generate an appropriate interaction based on the analysed information. However, in social environments in which it is common to find groups of persons, new approaches are needed to make robots able to recognise groups of people and the emotion of the groups, which can also be associated with the scene in which the group is participating. Some existing studies focus on detecting group cohesion and recognising group emotions; nevertheless, these works do not perform the recognition tasks from a robocentric perspective, considering the sensory capacity of robots. In this context, a system is presented that recognises scenes in terms of groups of people and then detects the global (prevailing) emotion in a scene. The approach proposed to visualise and recognise emotions in typical HRI is based on the face size of people recognised by the robot during its navigation (face sizes decrease when the robot moves away from a group of people). On each frame of the visual sensor's video stream, individual emotions are recognised using the Visual Geometry Group (VGG) neural network pre-trained to recognise faces (VGGFace); then, to detect the emotion of the frame, the individual emotions are aggregated with a fusion method, and, to detect the global (prevalent) emotion in the scene (group of people), the emotions of its constituent frames are aggregated in turn. Additionally, this work proposes a strategy to create datasets with images/videos in order to validate the estimation of emotions in scenes and personal emotions. Both datasets are generated in a simulated environment based on the Robot Operating System (ROS) from videos captured by robots through their sensory capabilities. Tests are performed in two simulated environments in ROS/Gazebo: a museum and a cafeteria. Results show that the accuracy of the detection of individual emotions is 99.79%, and that the detection of group emotion (scene emotion) in each frame reaches 90.84% and 89.78% in the cafeteria and the museum scenarios, respectively.
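A sketch of the two-level aggregation described above, with an area-weighted average standing in for the paper's fusion method; labels and weighting are illustrative:

```python
# Level 1: fuse per-face emotion posteriors into a frame emotion (weighted by
# detected face area, echoing the robocentric face-size cue).
# Level 2: aggregate frame emotions into the scene emotion by majority vote.
from collections import Counter
import numpy as np

LABELS = ["negative", "neutral", "positive"]

def frame_emotion(face_probs: list, face_areas: list) -> str:
    """Area-weighted average of per-face emotion posteriors."""
    fused = np.average(np.stack(face_probs), axis=0, weights=np.asarray(face_areas))
    return LABELS[int(np.argmax(fused))]

def scene_emotion(frame_labels: list) -> str:
    """Prevailing (majority) emotion over the frames of a scene."""
    return Counter(frame_labels).most_common(1)[0][0]

frames = [
    frame_emotion([np.array([0.7, 0.2, 0.1]), np.array([0.2, 0.2, 0.6])], [900.0, 400.0]),
    frame_emotion([np.array([0.6, 0.3, 0.1])], [1200.0]),
]
print(scene_emotion(frames))   # -> "negative"
```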
Affiliation(s)
- Marco Quiroz
- Electrical and Electronics Engineering Department, School of Electronics and Telecommunications Engineering, Universidad Católica San Pablo, Arequipa 04001, Peru
- Raquel Patiño
- Electrical and Electronics Engineering Department, School of Electronics and Telecommunications Engineering, Universidad Católica San Pablo, Arequipa 04001, Peru
- José Diaz-Amado
- Electrical and Electronics Engineering Department, School of Electronics and Telecommunications Engineering, Universidad Católica San Pablo, Arequipa 04001, Peru
- Instituto Federal da Bahia, Vitoria da Conquista 45078-300, Brazil
- Yudith Cardinale
- Electrical and Electronics Engineering Department, School of Electronics and Telecommunications Engineering, Universidad Católica San Pablo, Arequipa 04001, Peru
- Higher School of Engineering, Science and Technology, Universidad Internacional de Valencia, 46002 Valencia, Spain
19. Scarinzi A, Cañamero L. Toward Affective Interactions: E-Motions and Embodied Artificial Cognitive Systems. Front Psychol 2022; 13:768416. [PMID: 35496207; PMCID: PMC9043349; DOI: 10.3389/fpsyg.2022.768416]
Affiliation(s)
- Alfonsina Scarinzi
- CY AS Institute for Advanced Studies, CY Cergy Paris Université, Cergy, France
- Zentrale Einrichtung für Sprachen und Schlüsselqualifikationen (ZESS), University of Göttingen, Göttingen, Germany
20. Churamani N, Barros P, Gunes H, Wermter S. Affect-Driven Learning of Robot Behaviour for Collaborative Human-Robot Interactions. Front Robot AI 2022; 9:717193. [PMID: 35265672; PMCID: PMC8898942; DOI: 10.3389/frobt.2022.717193]
Abstract
Collaborative interactions require social robots to share the users’ perspective on the interactions and adapt to the dynamics of their affective behaviour. Yet, current approaches for affective behaviour generation in robots focus on instantaneous perception to generate a one-to-one mapping between observed human expressions and static robot actions. In this paper, we propose a novel framework for affect-driven behaviour generation in social robots. The framework consists of (i) a hybrid neural model for evaluating facial expressions and speech of the users, forming intrinsic affective representations in the robot, (ii) an Affective Core, that employs self-organising neural models to embed behavioural traits like patience and emotional actuation that modulate the robot’s affective appraisal, and (iii) a Reinforcement Learning model that uses the robot’s appraisal to learn interaction behaviour. We investigate the effect of modelling different affective core dispositions on the affective appraisal and use this affective appraisal as the motivation to generate robot behaviours. For evaluation, we conduct a user study (n = 31) where the NICO robot acts as a proposer in the Ultimatum Game. The effect of the robot’s affective core on its negotiation strategy is witnessed by participants, who rank a patient robot with high emotional actuation higher on persistence, while an impatient robot with low emotional actuation is rated higher on its generosity and altruistic behaviour.
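The affect-driven learning loop can be caricatured as a bandit-style Q-learning update whose reward is the robot's affective appraisal rather than a task score; everything below is a conceptual stand-in, not the paper's model:

```python
# A patient affective core dampens negative user affect in the appraisal, so
# the learned policy favours actions that elicit positive user reactions.
import numpy as np

rng = np.random.default_rng(3)
n_actions = 3                  # e.g. fair offer / moderate offer / low offer
q = np.zeros(n_actions)
alpha, eps = 0.1, 0.2

def appraisal(user_valence: float, patience: float) -> float:
    """Intrinsic appraisal: a patient disposition dampens negative affect."""
    return user_valence * (1.0 - patience) if user_valence < 0 else user_valence

for _ in range(2000):
    a = int(rng.integers(n_actions)) if rng.random() < eps else int(np.argmax(q))
    user_valence = rng.normal(loc=[0.4, 0.1, -0.5][a], scale=0.3)  # user's reaction
    r = appraisal(user_valence, patience=0.8)                      # affect-driven reward
    q[a] += alpha * (r - q[a])                                     # bandit Q-update

print(int(np.argmax(q)))   # converges to the action eliciting the most positive affect
```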
Affiliation(s)
- Nikhil Churamani
- Department of Computer Science and Technology, University of Cambridge, Cambridge, United Kingdom
- Pablo Barros
- Cognitive Architecture for Collaborative Technologies (CONTACT) Unit, Istituto Italiano di Tecnologia, Genova, Italy
- Hatice Gunes
- Department of Computer Science and Technology, University of Cambridge, Cambridge, United Kingdom
- Stefan Wermter
- Knowledge Technology, Department of Informatics, University of Hamburg, Hamburg, Germany
21. Advanced Applications of Industrial Robotics: New Trends and Possibilities. Appl Sci (Basel) 2021. [DOI: 10.3390/app12010135]
Abstract
This review is dedicated to advanced applications of robotic technologies in the industrial field. Robotic solutions in areas with non-intensive applications are presented, and their implementations are analysed. We also provide an overview of survey publications and technical reports, classified by application criteria, trace the development of the structure of existing solutions, and identify recent research gaps. The analysis reveals the background to the existing obstacles and problems. These issues relate to the areas of psychology, human nature, special artificial intelligence (AI) implementation, and the robot-oriented object design paradigm. The analysis of robot applications shows that emerging applications in robotics face technical and psychological obstacles. The results of this review reveal four directions of required advancement in robotics: the development of intelligent companions; improved implementation of AI-based solutions; robot-oriented design of objects; and psychological solutions for robot–human collaboration.
22. Uluer P, Kose H, Gumuslu E, Barkana DE. Experience with an Affective Robot Assistant for Children with Hearing Disabilities. Int J Soc Robot 2021; 15:643-660. [PMID: 34804256; PMCID: PMC8594648; DOI: 10.1007/s12369-021-00830-5]
Abstract
This study presents an assistive robotic system enhanced with emotion recognition capabilities for children with hearing disabilities. The system is designed and developed for the audiometry tests and rehabilitation of children in a clinical setting and includes a social humanoid robot (Pepper), an interactive interface, gamified audiometry tests, a sensory setup, and a machine/deep learning based emotion recognition module. Three scenarios involving a conventional setup, a tablet setup, and a setup with the robot+tablet are evaluated with 16 children with cochlear implants or hearing aids. Several machine learning techniques and deep learning models are used for the classification of the three test setups and for the classification of the children's emotions (pleasant, neutral, unpleasant) from the physiological signals recorded by an E4 wristband. The results show that the signals collected during the tests can be separated successfully and that the positive and negative emotions of children can be better distinguished when they interact with the robot than in the other two setups. In addition, the children's objective and subjective evaluations as well as their impressions of the robot and its emotional behaviors are analyzed and discussed extensively.
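A sketch of window-based feature extraction from wristband signals ahead of such classifiers; the 4 Hz rate matches the E4's EDA channel, while the window length and feature set are common choices rather than the paper's:

```python
# Split a 1-D physiological signal into non-overlapping windows and compute
# simple statistics per window, yielding one feature row per window.
import numpy as np

def window_features(signal: np.ndarray, fs: float, win_s: float = 5.0) -> np.ndarray:
    n = int(fs * win_s)
    feats = []
    for start in range(0, len(signal) - n + 1, n):
        w = signal[start:start + n]
        slope = np.polyfit(np.arange(n), w, 1)[0]   # linear trend within the window
        feats.append([w.mean(), w.std(), slope])
    return np.asarray(feats)

fs = 4.0                                            # E4 EDA sampling rate (4 Hz)
eda = np.cumsum(np.random.default_rng(4).normal(size=600)) * 0.01 + 2.0
X = window_features(eda, fs)                        # one row per 5-s window
print(X.shape)                                      # (30, 3) -> fed to classifiers
```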
Affiliation(s)
- Pinar Uluer
- Department of Computer Engineering, Galatasaray University, Istanbul, Turkey
- Department of AI and Data Engineering, Istanbul Technical University, Istanbul, Turkey
- Hatice Kose
- Department of AI and Data Engineering, Istanbul Technical University, Istanbul, Turkey
- Elif Gumuslu
- Department of Electrical and Electronics Engineering, Yeditepe University, Istanbul, Turkey
- Duygun Erol Barkana
- Department of Electrical and Electronics Engineering, Yeditepe University, Istanbul, Turkey
23. Savela N, Turja T, Latikka R, Oksanen A. Media effects on the perceptions of robots. Hum Behav Emerg Technol 2021. [DOI: 10.1002/hbe2.296]
Affiliation(s)
- Nina Savela
- Faculty of Social Sciences, Tampere University, Tampere, Finland
- Tuuli Turja
- Faculty of Social Sciences, Tampere University, Tampere, Finland
- Rita Latikka
- Faculty of Social Sciences, Tampere University, Tampere, Finland
- Atte Oksanen
- Faculty of Social Sciences, Tampere University, Tampere, Finland
24. Honig S, Oron-Gilad T. Expect the Unexpected: Leveraging the Human-Robot Ecosystem to Handle Unexpected Robot Failures. Front Robot AI 2021; 8:656385. [PMID: 34381819; PMCID: PMC8352555; DOI: 10.3389/frobt.2021.656385]
Abstract
Unexpected robot failures are inevitable. We propose to leverage socio-technical relations within the human-robot ecosystem to support adaptable strategies for handling unexpected failures. The Theory of Graceful Extensibility is used to understand how characteristics of the ecosystem can influence its ability to respond to unexpected events. By expanding our perspective from Human-Robot Interaction to the Human-Robot Ecosystem, adaptable failure-handling strategies are identified, alongside the technical, social and organizational arrangements that are needed to support them. We argue that the robotics and HRI communities should pursue more holistic approaches to failure-handling, recognizing the need to embrace the unexpected and to consider socio-technical relations within the human-robot ecosystem when designing failure-handling strategies.
Affiliation(s)
- Shanee Honig
- Department of Industrial Engineering and Management, Mobile Robotics Laboratory and HRI Laboratory, Ben-Gurion University of the Negev, Be'er Sheva, Israel
- Tal Oron-Gilad
- Department of Industrial Engineering and Management, Mobile Robotics Laboratory and HRI Laboratory, Ben-Gurion University of the Negev, Be'er Sheva, Israel
25. Emotion-Driven Analysis and Control of Human-Robot Interactions in Collaborative Applications. Sensors (Basel) 2021; 21:4626. [PMID: 34300366; PMCID: PMC8309492; DOI: 10.3390/s21144626]
Abstract
The utilization of robotic systems has been increasing in the last decade. This increase has been driven by advances in the computational capabilities, communication systems, and information systems of manufacturing, as reflected in the concept of Industry 4.0. Furthermore, robotic systems are continuously required to address new challenges in the industrial and manufacturing domain, such as keeping humans in the loop. Briefly, the keeping-humans-in-the-loop concept focuses on closing the gap between humans and machines by introducing a safe and trustworthy environment for human workers to work side by side with robots and machines. It aims at increasing the engagement of the human as the automation level increases, rather than replacing the human, which can be nearly impossible in some applications. Consequently, collaborative robots (cobots) have been created to allow physical interaction with the human worker. However, these cobots still lack the ability to recognize the human emotional state. In this regard, this paper presents an approach for adapting cobot parameters to the emotional state of the human worker. The approach utilizes electroencephalography (EEG) to digitize and understand the human emotional state. Afterwards, the parameters of the cobot are instantly adjusted to keep the human emotional state in a desirable range, which increases the confidence and trust between the human and the cobot. In addition, the paper includes a review of technologies and methods for emotional sensing and recognition. Finally, this approach is tested on an ABB YuMi cobot with a commercially available EEG headset.
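The closed loop described above can be sketched as follows; the thresholds, gain, and speed-override mapping are hypothetical, and real cobots expose vendor-specific APIs for speed overrides:

```python
# One control tick per emotion estimate: an EEG-derived stress index in [0, 1]
# is mapped to a cobot speed override to keep the worker comfortable.
import random

def eeg_stress_index() -> float:
    """Stand-in for an EEG pipeline returning a stress estimate in [0, 1]."""
    return random.random()

def speed_override(stress: float, low: float = 0.3, high: float = 0.7) -> float:
    """Full speed while stress is low, linear slow-down in between, creep when high."""
    if stress <= low:
        return 1.0
    if stress >= high:
        return 0.2
    return 1.0 - 0.8 * (stress - low) / (high - low)

for _ in range(3):
    s = eeg_stress_index()
    print(f"stress={s:.2f} -> speed override {speed_override(s):.0%}")
```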