1. Valori I, Fan Y, Jung MM, Fairhurst MT. Propensity to trust shapes perceptions of comforting touch between trustworthy human and robot partners. Sci Rep 2024; 14:6747. PMID: 38514732; PMCID: PMC10957953; DOI: 10.1038/s41598-024-57582-1.

Abstract
Touching a friend to comfort or be comforted is a common prosocial behaviour, firmly grounded in mutual trust. Emphasising the interactive nature of trust and touch, we suggest that vulnerability, reciprocity and individual differences shape trust and perceptions of touch. We further investigate whether these elements also apply to companion robots. Participants (n = 152) were exposed to four comics depicting human-human or human-robot exchanges. Across conditions, one character was sad, the other initiated touch to comfort them, and the touchee reciprocated the touch. Participants first rated the trustworthiness of one character (human or robot, in a vulnerable or comforting role), then evaluated the two touch phases (initiation and reciprocity) in terms of interaction realism, touch appropriateness and pleasantness, and the affective state (valence and arousal) attributed to the characters. Results support an interactive account of trust and touch: humans were judged equally trustworthy whether comforting or showing vulnerability, and reciprocity of touch buffered sadness. Although these phenomena seem unique to humans, propensity to trust technology reduces the gap between how humans and robots are perceived. Two distinct trust systems emerge: one for human interactions and another for social technologies, both necessitating trust as a fundamental prerequisite for meaningful physical contact.
Affiliation(s)
- Irene Valori: Chair of Acoustics and Haptics, Technische Universität Dresden, Dresden, Germany; Centre for Tactile Internet with Human-in-the-Loop (CeTI), Technische Universität Dresden, Dresden, Germany.
- Yichen Fan: Chair of Industrial Design Engineering, Technische Universität Dresden, Dresden, Germany; 6G-Life, Dresden, Germany.
- Merel M Jung: Department of Cognitive Science and Artificial Intelligence, Tilburg University, Tilburg, The Netherlands.
- Merle T Fairhurst: Chair of Acoustics and Haptics, Technische Universität Dresden, Dresden, Germany; Centre for Tactile Internet with Human-in-the-Loop (CeTI), Technische Universität Dresden, Dresden, Germany; 6G-Life, Dresden, Germany.
2. Abdulazeem N, Hu Y. Human Factors Considerations for Quantifiable Human States in Physical Human-Robot Interaction: A Literature Review. Sensors (Basel, Switzerland) 2023; 23:7381. PMID: 37687837; PMCID: PMC10490212; DOI: 10.3390/s23177381.

Abstract
As the global population rapidly ages with longer life expectancy and declining birth rates, the need for healthcare services and caregivers for older adults is increasing. Current research envisions addressing this shortage by introducing domestic service robots to assist with daily activities. The successful integration of robots as domestic service providers in our lives requires them to possess efficient manipulation capabilities, provide effective physical assistance, and have adaptive control frameworks that enable them to develop social understanding during human-robot interaction. In this context, human factors, especially quantifiable ones, represent a necessary component. The objective of this paper is to conduct an unbiased review of the human factors studied in research involving physical interaction and strong manipulation capabilities. We identified the prevalent human factors in physical human-robot interaction (pHRI), noted the factors typically addressed together, and determined the frequently utilized assessment approaches. Additionally, we gathered and categorized proposed quantification approaches based on the measurable data for each human factor. We also formed a map of the common contexts and applications addressed in pHRI for a comprehensive understanding and easier navigation of the field. We found that most studies in direct pHRI (where there is direct physical contact) focus on social behaviors, with belief being the most commonly addressed type of human factor. Task collaboration is moderately investigated, while physical assistance is rarely studied. In contrast, indirect pHRI studies (where the physical contact is mediated via a third item) often involve industrial settings, with physical ergonomics being the most frequently investigated human factor. More research is needed on the human factors in direct and indirect physical assistance applications, including studies that combine physical social behaviors with physical assistance tasks. We also found that while the predominant quantification method in most studies is the questionnaire, there is a recent trend toward quantification approaches based on measurable data.
Affiliation(s)
- Yue Hu: Active & Interactive Robotics Lab, Department of Mechanical and Mechatronics Engineering, University of Waterloo, 200 University Ave. W., Waterloo, ON N2L 3G1, Canada.
3. Pilacinski A, Pinto A, Oliveira S, Araújo E, Carvalho C, Silva PA, Matias R, Menezes P, Sousa S. The robot eyes don't have it. The presence of eyes on collaborative robots yields marginally higher user trust but lower performance. Heliyon 2023; 9:e18164. PMID: 37520993; PMCID: PMC10382291; DOI: 10.1016/j.heliyon.2023.e18164.

Abstract
Eye gaze is a prominent feature of human social life, but little is known about whether fitting eyes on machines makes humans trust them more. In this study we compared subjective and objective markers of human trust when collaborating with eyed and non-eyed robots of the same type. We used virtual reality scenes in which we manipulated distance and the presence of eyes on a robot's display during simple collaboration tasks. We found that while collaboration with eyed cobots resulted in slightly higher subjective trust ratings, objective markers such as pupil size and task completion time indicated that collaborating with eyed robots was in fact less comfortable. These findings are in line with recent suggestions that anthropomorphism may actually be a detrimental feature of collaborative robots, and they highlight the complex relationship between objective and subjective markers of trust when collaborating with artificial agents.
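The dissociation the abstract describes (subjective trust up, objective comfort down) is the kind of pattern a within-subjects paired comparison would surface. A minimal sketch of such an analysis is below; all numbers, effect sizes, and variable names are fabricated for illustration and are not the study's data.

```python
# Hypothetical paired comparison: the same participants rate trust in, and
# complete a task with, an eyed and a non-eyed cobot. Synthetic data only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 30  # hypothetical number of participants

# Subjective trust (1-7 scale): assume a small positive shift for the eyed robot.
base_trust = rng.normal(5.0, 0.8, n)
trust_no_eyes = np.clip(base_trust, 1, 7)
trust_eyed = np.clip(base_trust + 0.4 + rng.normal(0, 0.2, n), 1, 7)

# Objective marker, task completion time (s): assume the eyed robot is slower.
base_time = rng.normal(12.5, 2.0, n)
time_no_eyes = base_time
time_eyed = base_time + 1.5 + rng.normal(0, 0.5, n)

# Paired t-tests on each marker separately.
t_trust, p_trust = stats.ttest_rel(trust_eyed, trust_no_eyes)
t_time, p_time = stats.ttest_rel(time_eyed, time_no_eyes)

print(f"trust (eyed - no eyes): {np.mean(trust_eyed - trust_no_eyes):+.2f}, p={p_trust:.4f}")
print(f"time  (eyed - no eyes): {np.mean(time_eyed - time_no_eyes):+.2f} s, p={p_time:.4f}")
```

With both differences significant but pointing in opposite directions (higher trust, worse performance), the subjective and objective markers dissociate, mirroring the pattern reported in the abstract.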
Affiliation(s)
- Artur Pilacinski: Medical Faculty, Ruhr University Bochum, Bochum, Germany; CINEICC - Center for Research in Neuropsychology and Cognitive Behavioral Intervention, University of Coimbra, Coimbra, Portugal; Faculty of Psychology and Educational Sciences, University of Coimbra, Coimbra, Portugal.
- Ana Pinto: Faculty of Psychology and Educational Sciences, University of Coimbra, Coimbra, Portugal; Faculty of Sciences and Technology, University of Coimbra, Coimbra, Portugal; CeBER - Centre for Business and Economics Research, University of Coimbra, Coimbra, Portugal.
- Soraia Oliveira: Faculty of Psychology and Educational Sciences, University of Coimbra, Coimbra, Portugal.
- Eduardo Araújo: Faculty of Sciences and Technology, University of Coimbra, Coimbra, Portugal; Department of Informatics Engineering, University of Coimbra, Coimbra, Portugal.
- Carla Carvalho: CINEICC - Center for Research in Neuropsychology and Cognitive Behavioral Intervention, University of Coimbra, Coimbra, Portugal; Faculty of Psychology and Educational Sciences, University of Coimbra, Coimbra, Portugal.
- Paula Alexandra Silva: Faculty of Sciences and Technology, University of Coimbra, Coimbra, Portugal; Department of Informatics Engineering, University of Coimbra, Coimbra, Portugal; CISUC - Centre for Informatics and Systems of the University of Coimbra, Coimbra, Portugal.
- Ricardo Matias: Faculty of Sciences and Technology, University of Coimbra, Coimbra, Portugal; Electrical and Computer Engineering Department, University of Coimbra, Coimbra, Portugal.
- Paulo Menezes: Faculty of Sciences and Technology, University of Coimbra, Coimbra, Portugal; Electrical and Computer Engineering Department, University of Coimbra, Coimbra, Portugal.
- Sonia Sousa: University of Trás-os-Montes e Alto Douro, Vila Real, Portugal; School of Digital Technologies, Tallinn University, Tallinn, Estonia.
4. Improving HRI with Force Sensing. Machines 2021. DOI: 10.3390/machines10010015.

Abstract
Human-robot interaction (HRI) will be an important field of research in a future society where robots and humans live together. While most HRI studies focus on appearance and dialogue, touch communication has received little attention despite the central role it plays in human-human communication. This paper investigates how and where humans touch an inorganic, non-zoomorphic robot arm. Based on these results, we install touch sensors on the robot arm and conduct experiments to collect data on users' impressions of the robot when touching it. Our results suggest two main things. First, touch gestures can be collected with two sensors, and the collected data can be classified into gestures using machine learning. Second, touch-based communication between humans and robots can improve the user's impression of the robot.
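The abstract describes classifying touch gestures from sensor data with machine learning but gives no implementation details. The sketch below is one plausible minimal pipeline, not the paper's method: it fabricates synthetic force traces for three hypothetical gesture classes (tap, stroke, pat), extracts simple summary features, and trains an off-the-shelf classifier.

```python
# Hypothetical touch-gesture classification sketch. The gesture shapes,
# feature set, and data are all invented for illustration.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def synth_gesture(kind, n=100):
    """Simulate a 1 s force trace (100 samples) for one touch gesture."""
    t = np.linspace(0, 1, n)
    if kind == "tap":        # short, sharp pulse
        sig = np.exp(-((t - 0.5) ** 2) / 0.001)
    elif kind == "stroke":   # slow rise-and-fall contact
        sig = np.sin(np.pi * t)
    else:                    # "pat": repeated short pulses
        sig = (np.sin(6 * np.pi * t) > 0.8).astype(float)
    return sig + 0.05 * rng.standard_normal(n)  # sensor noise

def features(trace):
    """Summary features: peak force, mean force, count of upward 0.5-crossings."""
    crossings = int((np.diff(np.sign(trace - 0.5)) > 0).sum())
    return [trace.max(), trace.mean(), crossings]

labels = ["tap", "stroke", "pat"]
X = np.array([features(synth_gesture(k)) for k in labels for _ in range(40)])
y = np.array([k for k in labels for _ in range(40)])

Xtr, Xte, ytr, yte = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)
clf = RandomForestClassifier(random_state=0).fit(Xtr, ytr)
acc = clf.score(Xte, yte)
print(f"gesture classification accuracy: {acc:.2f}")
```

Because the three synthetic classes differ sharply in mean force and pulse count, even these three features separate them well; real capacitive or force-sensor data would need richer features (duration, contact area, spectral content) and per-user variability handling.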