1. Wilt H, Wu Y, Trotter A, Adank P. Automatic imitation is modulated by stimulus clarity but not by animacy. Atten Percept Psychophys 2024; 86:2078-2092. PMID: 39085716; PMCID: PMC11411005; DOI: 10.3758/s13414-024-02935-1.
Abstract
Observing actions evokes an automatic imitative response that activates the mechanisms required to execute those actions. Automatic imitation is measured with the Stimulus Response Compatibility (SRC) task, which presents participants with compatible and incompatible prompt-distractor pairs. Automatic imitation, or the compatibility effect, is the difference in response times (RTs) between incompatible and compatible trials. Past results suggest that an action's animacy affects automatic imitation: human-produced actions evoke larger effects than computer-generated actions. However, animacy effects appear to occur mostly when the non-human stimuli are less complex or less clear. Theoretical accounts make conflicting predictions regarding both stimulus manipulations. We conducted two SRC experiments that presented participants with an animacy manipulation (human and computer-generated stimuli, Experiment 1) and a clarity manipulation (stimuli with varying visual clarity produced by Gaussian blurring, Experiments 1 and 2) to tease apart the effects of these manipulations. Participants in Experiment 1 responded more slowly on incompatible than on compatible trials, showing a compatibility effect. Experiment 1 found a null effect of animacy, but stimuli with lower visual clarity evoked smaller compatibility effects. Experiment 2 varied clarity in five steps and found decreasing compatibility effects for stimuli with lower clarity. Clarity, but not animacy, therefore affected automatic imitation; theoretical implications and future directions are considered.
Affiliation(s)
- Hannah Wilt
- Department of Speech, Hearing and Phonetic Sciences, University College London, London, WC1N 1PF, UK
- Yuchunzi Wu
- Department of Neural and Cognitive Sciences, New York University Shanghai, Shanghai, China
- NYU-ECNU Institute of Brain and Cognitive Sciences at New York University Shanghai, Shanghai, China
- Antony Trotter
- Institute of Psychiatry, Psychology & Neuroscience, King's College London, London, UK
- Patti Adank
- Department of Speech, Hearing and Phonetic Sciences, University College London, London, WC1N 1PF, UK
2. Zheng X, Han Y, Liang J. Anthropomorphic motion planning for multi-degree-of-freedom arms. Front Bioeng Biotechnol 2024; 12:1388609. PMID: 38863490; PMCID: PMC11165200; DOI: 10.3389/fbioe.2024.1388609.
Abstract
With the development of technology, the humanoid robot is no longer a concept but a practical partner with the potential to assist people in industry, healthcare and other daily scenarios. The basis for the success of humanoid robots is not only their appearance but, more importantly, their anthropomorphic behaviors, which are crucial for human-robot interaction. Conventionally, robots are designed to follow meticulously calculated and planned trajectories, which typically rely on predefined algorithms and models, leaving them unable to adapt to unknown environments. Especially given the increasing demand for personalized and customized services, predefined motion planning cannot be adjusted in time to match individual behavior. To solve this problem, anthropomorphic motion planning has become the focus of recent research, as advances in biomechanics, neurophysiology and exercise physiology have deepened our understanding of how the body generates and controls movement. However, there is still no consensus on the criteria by which anthropomorphic motion is judged, or on how to generate it. Although some articles survey anthropomorphic motion planning methods such as sampling-based, optimization-based and mimicry-based approaches, these surveys distinguish the methods only by the nature of the planning algorithms and have not yet systematically discussed the basis for extracting upper-limb motion characteristics. To better address the problem of anthropomorphic motion planning, the key milestones and most recent literature are collated and summarized, and three crucial topics for achieving anthropomorphic motion are proposed: motion redundancy, motion variation and motion coordination. These three characteristics are interrelated and interdependent, which poses a challenge for any anthropomorphic motion planning system. To provide insights for research on anthropomorphic motion planning and to improve anthropomorphic motion ability, this article proposes a new physiology-based taxonomy and a more complete system of anthropomorphic motion planning, giving a detailed overview of existing methods and their contributions.
Affiliation(s)
- Xiongfei Zheng
- State Key Laboratory of Intelligent Manufacturing Equipment and Technology, Huazhong University of Science and Technology, Wuhan, China
- Yunyun Han
- Department of Neurobiology, School of Basic Medicine, Tongji Medical College, Huazhong University of Science and Technology, Wuhan, China
- Jiejunyi Liang
- State Key Laboratory of Intelligent Manufacturing Equipment and Technology, Huazhong University of Science and Technology, Wuhan, China
3. Meneses A, Mahzoon H, Yoshikawa Y, Ishiguro H. Multiple Groups of Agents for Increased Movement Interference and Synchronization. Sensors (Basel) 2022; 22:5465. PMID: 35891144; PMCID: PMC9317759; DOI: 10.3390/s22145465.
Abstract
We examined the influence of groups of agents and of avatar type on movement interference, and additionally studied participants' synchronization with the agents. We conducted experiments with human subjects to examine the influence of one, two, or three agents; of human versus robot avatars; and of agents moving biologically or linearly. The main effect on movement interference was the number of agents: three agents influenced movement interference significantly more than one agent. These results suggest that the number of agents influences movement interference more than other avatar characteristics do. For synchronization, a main effect of agent type was revealed: participants maintained more synchronization with the human agent than with the robotic agent. In this experiment, we introduced synchronization as an additional paradigm alongside interference, and found that a group of agents can influence this behavioral level as well.
Affiliation(s)
- Alexis Meneses
- Graduate School of Engineering Science, Osaka University, Toyonaka 560-8531, Japan
- Hamed Mahzoon
- Institute for Open and Transdisciplinary Research Initiatives (OTRI), Osaka University, Suita 565-0871, Japan
- Yuichiro Yoshikawa
- Graduate School of Engineering Science, Osaka University, Toyonaka 560-8531, Japan
- Hiroshi Ishiguro
- Graduate School of Engineering Science, Osaka University, Toyonaka 560-8531, Japan
4. Kammler-Sucker KI, Löffler A, Kleinböhl D, Flor H. Exploring Virtual Doppelgangers as Movement Models to Enhance Voluntary Imitation. IEEE Trans Neural Syst Rehabil Eng 2021; 29:2173-2182. PMID: 34653005; DOI: 10.1109/tnsre.2021.3120795.
Abstract
Virtual reality (VR) setups offer the possibility to investigate interactions between model and observer characteristics in imitation behavior, such as in the chameleon effect of automatic mimicry. We tested the hypothesis that perceived affiliative characteristics of a virtual model, such as similarity to the observer and likability, facilitate observers' engagement in voluntary motor imitation. In a within-subjects design, participants were exposed to four virtual characters of different degrees of realism and observer similarity (avatar numbers AN = 1-4), ranging from an abstract stickperson to a personalized doppelganger avatar designed from 3D scans of the observer. The characters performed different trunk movements, which participants were asked to imitate. We defined functional ranges of motion (ROM) for spinal extension (bending backward, BB), lateral flexion (bending sideward, BS) and rotation in the horizontal plane (RH), based on shoulder marker trajectories, as behavioral indicators of imitation. Participants' ratings of avatar appearance, characteristics and embodiment/enfacement were recorded with an Autonomous Avatar Questionnaire (AAQ) and factorized into three sum scales based on our explorative analysis. Linear mixed-effects models revealed that, for lateral flexion (BS), a facilitating influence of avatar type on ROM was mediated by perceived identificatory avatar properties such as likability, avatar-observer similarity and other affiliative characteristics (AAQ1). This suggests that maximizing model-observer similarity with a virtual doppelganger may be useful in observational modeling, which could be used to modify maladaptive motor behaviors in patients with chronic back pain.
5. Gulletta G, Silva ECE, Erlhagen W, Meulenbroek R, Costa MFP, Bicho E. A Human-like Upper-limb Motion Planner: Generating naturalistic movements for humanoid robots. Int J Adv Robot Syst 2021. DOI: 10.1177/1729881421998585.
Abstract
As robots become part of our daily lives, they must be able to cooperate with humans in a natural and efficient manner to be socially accepted. Human-like morphology and motion are often considered key features for intuitive human–robot interaction because they allow human peers to easily predict the final intention of a robotic movement. Here, we present a novel motion planning algorithm, the Human-like Upper-limb Motion Planner, for the upper limb of anthropomorphic robots, which generates collision-free trajectories with human-like characteristics. Mainly inspired by established theories of human motor control, the planning process takes into account a task-dependent hierarchy of spatial and postural constraints modelled as cost functions. For experimental validation, we generate arm-hand trajectories in a series of tasks, including simple point-to-point reaching movements and sequential object-manipulation paradigms. A major contribution to the current literature is the specific focus on the kinematics of naturalistic arm movements during obstacle avoidance. To evaluate human-likeness, we observe kinematic regularities and adopt smoothness measures that are applied in human motor control studies to distinguish between well-coordinated and impaired movements. The results of this study show that the proposed algorithm can plan arm-hand movements with human-like kinematic features at a computational cost that allows fluent and efficient human–robot interaction.
Affiliation(s)
- Gianpaolo Gulletta
- Centre Algoritmi, Department of Industrial Electronics, University of Minho, Braga, Portugal
- Wolfram Erlhagen
- Centre of Mathematics, Department of Mathematics and Applications, University of Minho, Braga, Portugal
- Ruud Meulenbroek
- Donders Institute for Brain, Cognition and Behaviour, Radboud University, Nijmegen, the Netherlands
- Estela Bicho
- Centre Algoritmi, Department of Industrial Electronics, University of Minho, Braga, Portugal
6. Sueur C, Forin-Wiart MA, Pelé M. Are They Really Trying to Save Their Buddy? The Anthropomorphism of Animal Epimeletic Behaviours. Animals (Basel) 2020; 10:2323. PMID: 33297457; PMCID: PMC7762333; DOI: 10.3390/ani10122323.
Abstract
Simple Summary
Anthropomorphism, defined as attributing human traits to animals and other entities, seems to have appeared during evolution to improve an individual's understanding of other species (or indeed of the world in general). Yet anthropomorphism can have beneficial or harmful consequences, especially for animals, and there has been little interest in monitoring the potential dangers of this approach. Few studies have focused on the factors affecting how we attribute intentions or beliefs to animals, and more quantitative studies are needed to identify how and why humans attribute mental states and cognitive abilities to other animals. In this study, participants answered questions about three videos in which an individual (a sparrow, an elephant and a macaque, respectively) displayed behaviours towards an inanimate conspecific that suddenly regained consciousness at the end of the footage. A fourth video showed a robot dog being kicked by an engineer to demonstrate its stability. The questions were designed to measure how far participants attribute humanlike intentions, beliefs or mental states to non-human animals and robots. Men and older participants were less likely to attribute humanlike mental states to animals. Similarly, people who work with animals or have at least one pet at home demonstrated less naïve anthropomorphism. Conversely, members of animal protection associations showed more biophilia (affinity for other living organisms), attributed more intentions and mental states to animals, and were further from biological reality (current scientific knowledge of each species) than non-members. Understanding these factors can lead to better relationships with animals and encourage human-robot interactions.

Abstract
Anthropomorphism is a natural tendency in humans, but it is also influenced by many characteristics of the observer (the human) and of the observed entity (here, the animal species). This study asked participants to complete an online questionnaire about three videos showing epimeletic behaviours in three animal species. In the videos, an individual (a sparrow, an elephant and a macaque, respectively) displayed behaviours towards an inanimate conspecific that suddenly regained consciousness at the end of the footage. A fourth video showed a robot dog being kicked by an engineer to demonstrate its stability. Each video was followed by a series of questions designed to evaluate the participants' degree of anthropomorphism, from mentaphobia (no attribution of intentions and beliefs, whatever the animal species) to full anthropomorphism (attributing intentions and beliefs to animals to the same extent as to humans), and to measure how far participants correctly assessed each situation in terms of biological reality (current scientific knowledge of each species). There was a negative correlation (about 61%) between the mental states humans attributed to animals and the animals' real capabilities. The heterogeneity of responses showed that humans display different forms of anthropomorphism, from rejecting all emotional or intentional states in animals to crediting animals with the same intentions as humans. However, the scores participants attributed to animals differed according to the species shown in the video and to human socio-demographic characteristics. Understanding these factors can lead to better relationships with animals and encourage a positive view of human-robot interactions. Indeed, reflective or critical anthropomorphism can increase our humanity.
Affiliation(s)
- Cédric Sueur
- Université de Strasbourg, CNRS, IPHC UMR 7178, F-67000 Strasbourg, France
- Centre Européen d'Enseignement et de Recherche en Éthique, F-67000 Strasbourg, France
- Institut Universitaire de France, 75006 Paris, France
- Correspondence: ; Tel.: +33(0)3-8810-7453
- Marie Pelé
- Anthropo-Lab, ETHICS EA7446, Lille Catholic University, 59000 Lille, France
7.
Abstract
In the last decade, the objectives outlined by the needs of personal robotics have led to the rise of new biologically inspired techniques for arm motion planning. This paper presents a literature review of the most recent research on the generation of human-like arm movements in humanoid and manipulation robotic systems. Search methods and inclusion criteria are described. The studies are analyzed with respect to publication source, experimental settings, type of movements, technical approach, and the human motor principles used to inspire and assess human-likeness. The results show a strong focus on the generation of single-arm reaching movements and on biomimetic-based methods, but little attention to manipulation, obstacle-avoidance mechanisms, and dual-arm motion generation. As a consequence, human-like arm motion generation may not fully respect key behavioral and neurological features of humans and may remain restricted to specific tasks of human-robot interaction. Limitations and challenges are discussed to provide meaningful directions for future investigations.
8.
Abstract
As the field of social robotics grows and expands across areas of research and application in which robots can offer assistance and companionship to humans, this paper offers a different perspective on a role that social robots can also play: informing us about the flexibility of human mechanisms of social cognition. The paper focuses on studies in which robots have been used as a new type of "stimulus" in psychological experiments to examine whether interaction with a robot activates mechanisms of social cognition similar to those elicited in interaction with another human. Analysing studies that directly compare a robot and a human agent, the paper examines whether, for robot agents, the brain re-uses the mechanisms developed for interaction with other humans in terms of perception, action representation, attention and higher-order social cognition. Based on this analysis, the paper concludes that human socio-cognitive mechanisms in adult brains are sufficiently flexible to be re-used for robotic agents, at least for those that bear some resemblance to humans.
9. Förster F, Dautenhahn K, Nehaniv CL. Toward Scalable Measures of Quality of Interaction. ACM Trans Hum-Robot Interact 2020. DOI: 10.1145/3344277.
Abstract
Motor resonance, the activation of an observer's motor control system by another actor's movements, has been claimed to be an indicator of quality of interaction. Motor interference, one consequence of the presence of resonance, can be detected by analyzing an actor's spatial movements, and it has therefore been used as an indicator of the presence of motor resonance. Unfortunately, the experimental paradigm in which motor interference has been shown to be detectable is ecologically implausible, both in the types of movements employed and in the number of repetitions required. In the presented experiment, we tested whether some of these experimental constraints can be relaxed or modified toward more naturalistic behavior without losing the ability to detect the interference effect. In the literature, spatial variance has been quantified analytically in many different ways; by implementing these variants, this study found them to be nonequivalent. Back-and-forth transitive movements were tested for motor interference; the effect was found to be more robust than with left-right movements, although the direction of interference was opposite to that reported in the literature. We conclude that motor interference, when measured by spatial variation, lacks promise for embedding in naturalistic interaction scenarios because the effect sizes were small.
10. Seeing minds in others: Mind perception modulates low-level social-cognitive performance and relates to ventromedial prefrontal structures. Cogn Affect Behav Neurosci 2019; 18:837-856. PMID: 29992485; DOI: 10.3758/s13415-018-0608-2.
Abstract
In social interactions, we rely on nonverbal cues like gaze direction to understand the behavior of others. How we react to these cues is affected by whether they are believed to originate from an entity with a mind, capable of having internal states (i.e., mind perception). While prior work has established a set of neural regions linked to social-cognitive processes like mind perception, the degree to which activation within this network relates to performance in subsequent social-cognitive tasks remains unclear. In the current study, participants performed a mind perception task (i.e., judging the likelihood that faces, varying in physical human-likeness, have internal states) while event-related fMRI was collected. Afterwards, participants performed a social attention task outside the scanner, during which they were cued by the gaze of the same faces that they previously judged within the mind perception task. Parametric analyses of the fMRI data revealed that activity within ventromedial prefrontal cortex (vmPFC) was related to both mind ratings inside the scanner and gaze-cueing performance outside the scanner. In addition, other social brain regions were related to gaze-cueing performance, including frontal areas like the left insula, dorsolateral prefrontal cortex, and inferior frontal gyrus, as well as temporal areas like the left temporo-parietal junction and bilateral temporal gyri. The findings suggest that functions subserved by the vmPFC are relevant to both mind perception and social attention, implicating a role of vmPFC in the top-down modulation of low-level social-cognitive processes.
11. Gandolfo M, Era V, Tieri G, Sacheli LM, Candidi M. Interactor's body shape does not affect visuo-motor interference effects during motor coordination. Acta Psychol (Amst) 2019; 196:42-50. PMID: 30986565; DOI: 10.1016/j.actpsy.2019.04.003.
Abstract
The biological tuning of the Action Observation Network is highly debated. A current open question is whether the morphological appearance (body shape) and/or the biological motion of an observed agent triggers action simulation processes. Motor simulation of a partner's action is critical for motor interactions, in which two partners coordinate their actions in space and time: it supports interpersonal alignment and facilitates online coordination. However, motor simulation also leads to visuo-motor interference effects when people are required to coordinate with complementary actions, i.e. movements incongruent with the observed ones. The movement kinematics of interacting partners allow us to capture their automatic tendency to simulate and imitate the partner's complementary movements. In an online reach-to-grasp task, we investigated whether visuo-motor interference effects, visible in the kinematics of complementary movements, are modulated by the visual presence of the interactor's body shape. We asked participants to interact with (1) a humanoid agent with a human-like body shape and real human, biological movement kinematics, or (2) a non-humanoid agent that did not resemble the human body shape but moved with the same real kinematics. Combining inferential and Bayesian statistics, the results show no effect of the interactor's body shape on visuo-motor interference in reaching and grasping kinematics during online motor coordination. We discuss these results and propose that the kinematics of the observed movements during motor interactions may be the key factor allowing visuo-motor interference to take place, independently of the morphological appearance of the partner. This is particularly relevant in a technological society that constantly asks humans to interact with artificial agents.
12. Does watching Han Solo or C-3PO similarly influence our language processing? Psychol Res 2019; 84:1572-1585. PMID: 30931488; DOI: 10.1007/s00426-019-01169-3.
Abstract
Several studies have demonstrated that perceiving an action influences the subsequent processing of action verbs. However, which characteristics of the perceived action are truly determinant for this influence is still unknown. The current study investigated the role of the agent executing an action in this action-language relationship. Participants performed a semantic decision task after seeing a video of a human or a robot performing an action. The results of the first study showed that perceiving either a human being or a robot executing an action facilitates subsequent language processing, suggesting that the humanness of the agent (in the sense of belonging to the human race, not of a personal quality) is not crucial in the link between action and language. However, this experiment was conducted with Japanese participants, who are very familiar with robots; an alternative explanation could therefore be that unfamiliarity with the agent perturbs the action-language relationship. To assess this hypothesis, we carried out two additional experiments with French participants. The results of the second study showed that, unlike the observation of a human agent, the observation of a robot did not influence language processing. Finally, the results of the third study showed that, after a familiarization phase, French participants too were influenced by the observation of a robot. Overall, these studies indicate that, more than the humanness of the agent, it is our familiarity with the agent that is crucial in the action-language relationship.
13. Hortensius R, Hekele F, Cross ES. The Perception of Emotion in Artificial Agents. IEEE Trans Cogn Dev Syst 2018. DOI: 10.1109/tcds.2018.2826921.
14. More than just co-workers: Presence of humanoid robot co-worker influences human performance. PLoS One 2018; 13:e0206698. PMID: 30408062; PMCID: PMC6224070; DOI: 10.1371/journal.pone.0206698.
Abstract
Does the presence of a robot co-worker influence the performance of humans around it? Studies of motor contagions during human-robot interactions have examined either how observing a robot affects a human's movement velocity or how it affects movement variance, but never both together. Performance, however, has to be measured considering both task speed (or frequency) and task accuracy. Here we examine an empirical repetitive industrial task in which a human participant and a humanoid robot work near each other. We systematically varied the robot's behavior and observed whether and how the performance of a human participant was affected by the robot's presence. To investigate the effect of physical form, we added conditions in which the robot co-worker's torso and head were covered and only the moving arm was visible to the human participants. Finally, we compared these behaviors with a human co-worker and examined how the observed behavioral effects scale with experience of robots. Our results show that human task frequency, but not task accuracy, is affected by the observation of a humanoid robot co-worker, provided the robot's head and torso are visible.
15. Adaptive changes in automatic motor responses based on acquired visuomotor correspondence. Exp Brain Res 2018; 237:147-159. DOI: 10.1007/s00221-018-5409-x.
16. Hortensius R, Cross ES. From automata to animate beings: the scope and limits of attributing socialness to artificial agents. Ann N Y Acad Sci 2018; 1426:93-110. PMID: 29749634; DOI: 10.1111/nyas.13727.
Abstract
Understanding the mechanisms and consequences of attributing socialness to artificial agents has important implications for how we can use technology to lead more productive and fulfilling lives. Here, we integrate recent findings on the factors that shape behavioral and brain mechanisms that support social interactions between humans and artificial agents. We review how visual features of an agent, as well as knowledge factors within the human observer, shape attributions across dimensions of socialness. We explore how anthropomorphism and dehumanization further influence how we perceive and interact with artificial agents. Based on these findings, we argue that the cognitive reconstruction within the human observer is likely to be far more crucial in shaping our interactions with artificial agents than previously thought, while the artificial agent's visual features are possibly of lesser importance. We combine these findings to provide an integrative theoretical account based on the "like me" hypothesis, and discuss the key role played by the Theory-of-Mind network, especially the temporal parietal junction, in the shift from mechanistic to social attributions. We conclude by highlighting outstanding questions on the impact of long-term interactions with artificial agents on the behavioral and brain mechanisms of attributing socialness to these agents.
Affiliation(s)
- Ruud Hortensius
- Wales Institute for Cognitive Neuroscience, School of Psychology, Bangor University, Wales, United Kingdom
- Institute of Neuroscience and Psychology, School of Psychology, University of Glasgow, Scotland, United Kingdom
- Emily S Cross
- Wales Institute for Cognitive Neuroscience, School of Psychology, Bangor University, Wales, United Kingdom
- Institute of Neuroscience and Psychology, School of Psychology, University of Glasgow, Scotland, United Kingdom
17
Itaguchi Y, Kaneko F. Motor priming by movement observation with contralateral concurrent action execution. Hum Mov Sci 2018; 57:94-102. [DOI: 10.1016/j.humov.2017.11.007] [Citation(s) in RCA: 3] [Impact Index Per Article: 0.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/11/2017] [Revised: 11/13/2017] [Accepted: 11/14/2017] [Indexed: 10/18/2022]
18
Kupferberg A, Iacoboni M, Flanagin V, Huber M, Kasparbauer A, Baumgartner T, Hasler G, Schmidt F, Borst C, Glasauer S. Fronto-parietal coding of goal-directed actions performed by artificial agents. Hum Brain Mapp 2017; 39:1145-1162. [PMID: 29205671 DOI: 10.1002/hbm.23905] [Citation(s) in RCA: 5] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 04/07/2017] [Revised: 11/17/2017] [Accepted: 11/22/2017] [Indexed: 11/11/2022] Open
Abstract
With advances in technology, artificial agents such as humanoid robots will soon become a part of our daily lives. For safe and intuitive collaboration, it is important to understand the goals behind their motor actions. In humans, this process is mediated by changes in activity in fronto-parietal brain areas. The extent to which these areas are activated when observing artificial agents indicates the naturalness and ease of interaction. Previous studies indicated that fronto-parietal activity does not depend on whether the agent is human or artificial. However, it is unknown whether this activity is modulated by the action goal when observing grasping (self-related) and pointing (other-related) actions performed by an artificial agent. Therefore, we designed an experiment in which subjects observed human and artificial agents perform pointing and grasping actions aimed at two different object categories suggesting different goals. We found a signal increase in the bilateral inferior parietal lobule and the premotor cortex when tool versus food items were pointed to or grasped by both agents, probably reflecting the association of hand actions with the functional use of tools. Our results show that goal attribution engages the fronto-parietal network not only when observing a human but also a robotic agent, for both self-related and social actions. Debriefing after the experiment showed that actions of human-like artificial agents can be perceived as being goal-directed. Therefore, humans will be able to interact with service robots intuitively in various domains such as education, healthcare, public service, and entertainment.
Affiliation(s)
- Aleksandra Kupferberg
- Division of Molecular Psychiatry, Translational Research Center, University Hospital of Psychiatry University of Bern, Bern, Switzerland
- Marco Iacoboni
- David Geffen School of Medicine at UCLA, Ahmanson-Lovelace Brain Mapping Center, Semel Institute for Neuroscience and Human Behavior, Brain Research Institute, Los Angeles, California
- Virginia Flanagin
- German Center for Vertigo and Balance Disorders DSGZ, Ludwig-Maximilian University Munich, München, Germany
- Center for Sensorimotor Research, Department of Neurology, Ludwig-Maximilian University, München, Germany
- Markus Huber
- Center for Sensorimotor Research, Department of Neurology, Ludwig-Maximilian University, München, Germany
- Thomas Baumgartner
- Department of Social Psychology and Social Neuroscience, University of Bern, Bern, Switzerland
- Gregor Hasler
- Division of Molecular Psychiatry, Translational Research Center, University Hospital of Psychiatry University of Bern, Bern, Switzerland
- Florian Schmidt
- Department of Robotics, DLR, Oberpfaffenhofen, Bavaria, Germany
- Christoph Borst
- Department of Robotics, DLR, Oberpfaffenhofen, Bavaria, Germany
- Stefan Glasauer
- German Center for Vertigo and Balance Disorders DSGZ, Ludwig-Maximilian University Munich, München, Germany
- Center for Sensorimotor Research, Department of Neurology, Ludwig-Maximilian University, München, Germany
19
Wiese E, Metta G, Wykowska A. Robots As Intentional Agents: Using Neuroscientific Methods to Make Robots Appear More Social. Front Psychol 2017; 8:1663. [PMID: 29046651 PMCID: PMC5632653 DOI: 10.3389/fpsyg.2017.01663] [Citation(s) in RCA: 100] [Impact Index Per Article: 14.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 05/16/2017] [Accepted: 09/11/2017] [Indexed: 12/30/2022] Open
Abstract
Robots are increasingly envisaged as our future cohabitants. However, while considerable progress has been made in recent years in terms of their technological realization, the ability of robots to interact with humans in an intuitive and social way is still quite limited. An important challenge for social robotics is to determine how to design robots that can perceive the user's needs, feelings, and intentions, and adapt to users over a broad range of cognitive abilities. It is conceivable that if robots were able to adequately demonstrate these skills, humans would eventually accept them as social companions. We argue that the best way to achieve this is using a systematic experimental approach based on behavioral and physiological neuroscience methods such as motion/eye-tracking, electroencephalography, or functional near-infrared spectroscopy embedded in interactive human-robot paradigms. This approach requires understanding how humans interact with each other, how they perform tasks together and how they develop feelings of social connection over time, and using these insights to formulate design principles that make social robots attuned to the workings of the human brain. In this review, we put forward the argument that the likelihood of artificial agents being perceived as social companions can be increased by designing them in a way that they are perceived as intentional agents that activate areas in the human brain involved in social-cognitive processing. We first review literature related to social-cognitive processes and mechanisms involved in human-human interactions, and highlight the importance of perceiving others as intentional agents to activate these social brain areas. We then discuss how attribution of intentionality can positively affect human-robot interaction by (a) fostering feelings of social connection, empathy and prosociality, and by (b) enhancing performance on joint human-robot tasks. Lastly, we describe circumstances under which attribution of intentionality to robot agents might be disadvantageous, and discuss challenges associated with designing social robots that are inspired by neuroscientific principles.
Affiliation(s)
- Eva Wiese
- Department of Psychology, George Mason University, Fairfax, VA, United States
20
How can the study of action kinematics inform our understanding of human social interaction? Neuropsychologia 2017; 105:101-110. [DOI: 10.1016/j.neuropsychologia.2017.01.018] [Citation(s) in RCA: 26] [Impact Index Per Article: 3.7] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 10/18/2016] [Revised: 01/17/2017] [Accepted: 01/18/2017] [Indexed: 11/17/2022]
21
Coco MI, Badino L, Cipresso P, Chirico A, Ferrari E, Riva G, Gaggioli A, D'Ausilio A. Multilevel Behavioral Synchronization in a Joint Tower-Building Task. IEEE Trans Cogn Dev Syst 2017. [DOI: 10.1109/tcds.2016.2545739] [Citation(s) in RCA: 7] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/08/2022]
22
Brand J, Piccirelli M, Hepp-Reymond MC, Morari M, Michels L, Eng K. Virtual Hand Feedback Reduces Reaction Time in an Interactive Finger Reaching Task. PLoS One 2016; 11:e0154807. [PMID: 27144927 PMCID: PMC4856322 DOI: 10.1371/journal.pone.0154807] [Citation(s) in RCA: 8] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/03/2015] [Accepted: 04/19/2016] [Indexed: 11/25/2022] Open
Abstract
Computer interaction via visually guided hand or finger movements is a ubiquitous part of daily computer usage in work or gaming. Surprisingly, however, little is known about the performance effects of using virtual limb representations versus simpler cursors. In this study, 26 healthy right-handed adults performed cued index finger flexion-extension movements towards an on-screen target while wearing a data glove. They received each of four different types of real-time visual feedback: a simple circular cursor, a point light pattern indicating finger joint positions, a cartoon hand, and a fully shaded virtual hand. We found that participants initiated the movements faster when receiving feedback in the form of a hand than when receiving circular cursor or point light feedback. This overall difference was robust for three out of four hand versus circle pairwise comparisons. The faster movement initiation for hand feedback was accompanied by a larger movement amplitude and a larger movement error. We suggest that the observed effect may be related to priming of hand information during action perception and execution affecting motor planning and execution. The results may have applications in the use of body representations in virtual reality applications.
Affiliation(s)
- Johannes Brand
- Institute of Neuroinformatics, University of Zurich and ETH Zurich, Zurich, Switzerland
- Automatic Control Laboratory, ETH Zurich, Zurich, Switzerland
- Marco Piccirelli
- Institute of Neuroradiology, University Hospital Zurich, Zurich, Switzerland
- Manfred Morari
- Automatic Control Laboratory, ETH Zurich, Zurich, Switzerland
- Lars Michels
- Institute of Neuroradiology, University Hospital Zurich, Zurich, Switzerland
- Centre for MR-Research, University Children’s Hospital, Zurich, Switzerland
- Kynan Eng
- Institute of Neuroinformatics, University of Zurich and ETH Zurich, Zurich, Switzerland
23
Urquiza-Haas EG, Kotrschal K. The mind behind anthropomorphic thinking: attribution of mental states to other species. Anim Behav 2015. [DOI: 10.1016/j.anbehav.2015.08.011] [Citation(s) in RCA: 98] [Impact Index Per Article: 10.9] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/26/2022]
24
Hofree G, Urgen BA, Winkielman P, Saygin AP. Observation and imitation of actions performed by humans, androids, and robots: an EMG study. Front Hum Neurosci 2015; 9:364. [PMID: 26150782 PMCID: PMC4473002 DOI: 10.3389/fnhum.2015.00364] [Citation(s) in RCA: 21] [Impact Index Per Article: 2.3] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 03/13/2014] [Accepted: 06/08/2015] [Indexed: 11/20/2022] Open
Abstract
Understanding others' actions is essential for functioning in the physical and social world. In the past two decades research has shown that action perception involves the motor system, supporting theories that we understand others' behavior via embodied motor simulation. Recently, the empirical approach to action perception has been facilitated by using well-controlled artificial stimuli, such as robots. One broad question this approach can address is what aspects of similarity between the observer and the observed agent facilitate motor simulation. Since humans have evolved among other humans and animals, using artificial stimuli such as robots allows us to probe whether our social perceptual systems are specifically tuned to process other biological entities. In this study, we used humanoid robots with different degrees of human-likeness in appearance and motion along with electromyography (EMG) to measure muscle activity in participants' arms while they either observed or imitated videos of three agents producing actions with their right arm. The agents were a Human (biological appearance and motion), a Robot (mechanical appearance and motion), and an Android (biological appearance and mechanical motion). Right arm muscle activity increased when participants imitated all agents. Increased muscle activation was also found in the stationary arm both during imitation and observation. Furthermore, muscle activity was sensitive to motion dynamics: activity was significantly stronger for imitation of the human than of both mechanical agents. There was also a relationship between the dynamics of the muscle activity and the motion dynamics in the stimuli. Overall, our data indicate that motor simulation is not limited to observation and imitation of agents with a biological appearance, but is also found for robots. However, we also found sensitivity to human motion in the EMG responses. Combining data from multiple methods allows us to obtain a more complete picture of action understanding and the underlying neural computations.
Affiliation(s)
- Galit Hofree
- Department of Psychology, University of California, San Diego, San Diego, CA, USA
- Burcu A. Urgen
- Department of Cognitive Science, University of California, San Diego, San Diego, CA, USA
- Piotr Winkielman
- Department of Psychology, University of California, San Diego, San Diego, CA, USA
- Behavioural Science Group, Warwick Business School, University of Warwick, Coventry, UK
- Department of Psychology, University of Social Sciences and Humanities, Warsaw, Poland
- Ayse P. Saygin
- Department of Cognitive Science, University of California, San Diego, San Diego, CA, USA
25
Do robots have goals? How agent cues influence action understanding in non-human primates. Behav Brain Res 2013; 246:47-54. [DOI: 10.1016/j.bbr.2013.01.047] [Citation(s) in RCA: 19] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/01/2012] [Revised: 12/16/2012] [Accepted: 01/30/2013] [Indexed: 11/22/2022]
26
Huber M, Kupferberg A, Lenz C, Knoll A, Brandt T, Glasauer S. Spatiotemporal movement planning and rapid adaptation for manual interaction. PLoS One 2013; 8:e64982. [PMID: 23724112 PMCID: PMC3665711 DOI: 10.1371/journal.pone.0064982] [Citation(s) in RCA: 11] [Impact Index Per Article: 1.0] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/10/2012] [Accepted: 04/19/2013] [Indexed: 12/13/2022] Open
Abstract
Many everyday tasks require the ability of two or more individuals to coordinate their actions with others to increase efficiency. Such an increase in efficiency can often be observed even after only very few trials. Previous work suggests that such behavioral adaptation can be explained within a probabilistic framework that integrates sensory input and prior experience. Even though higher cognitive abilities such as intention recognition have been described as probabilistic estimation depending on an internal model of the other agent, it is not clear whether much simpler daily interaction is consistent with a probabilistic framework. Here, we investigate whether the mechanisms underlying efficient coordination during manual interactions can be understood as probabilistic optimization. For this purpose we studied, in several experiments, a simple manual handover task concentrating on the action of the receiver. We found that the duration until the receiver reacts to the handover decreases over trials, but strongly depends on the position of the handover. We then replaced the human deliverer with different types of robots to further investigate the influence of the delivering movement on the reaction of the receiver. Durations were found to depend on movement kinematics and the robot's joint configuration. Modeling the task was based on the assumption that the receiver's decision to act is based on the accumulated evidence for a specific handover position. The evidence for this handover position is collected from observing the hand movement of the deliverer over time and, if appropriate, by integrating this sensory likelihood with a prior expectation that is updated over trials. The close match of model simulations and experimental results shows that the efficiency of handover coordination can be explained by an adaptive probabilistic fusion of a priori expectation and online estimation.
Affiliation(s)
- Markus Huber
- Center for Sensorimotor Research, Institute for Clinical Neuroscience, Ludwig-Maximilian University Munich, Munich, Germany.
27
Ménoret M, Curie A, des Portes V, Nazir TA, Paulignan Y. Simultaneous action execution and observation optimise grasping actions. Exp Brain Res 2013; 227:407-19. [PMID: 23615976 DOI: 10.1007/s00221-013-3523-3] [Citation(s) in RCA: 10] [Impact Index Per Article: 0.9] [Reference Citation Analysis] [Abstract] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Received: 12/17/2012] [Accepted: 04/09/2013] [Indexed: 11/24/2022]
Abstract
Action observation and execution share overlapping neural resonating mechanisms. In the present study, we sought to examine the effect of the activation of this system during concurrent movement observation and execution in a prehension task, when no a priori information about the requirements of the grasping action was available. Although it is known that simultaneous activation by observation and execution influences motor performance, the importance of the delays of these two events and the specific effect of movement observation itself (and not the prediction of the to-be-observed movement) on action performance are poorly known. Fine-grained kinematic analysis of both the transport and grasp components of the movement should provide knowledge about the influence of movement observation on the precision and the performance of the executed movement. The experiment involved two real participants who were asked to grasp a different side of a single object that was composed of a large and a small part. In the first experiment, we measured how the transport component and the grasp component were affected by movement observation. We tested whether this influence was greater if the observed movement occurred just before the onset of movement (200 ms) or well before the onset of movement (1 s). In a second experiment, to reproduce the previous experiment and to verify the specificity of the grasping movements, we also included a condition consisting of pointing towards the object. Both experiments showed two main results. A general facilitation of the transport component was found when observing a simultaneous action, independent of its congruency. Moreover, a specific facilitation of the grasp component was present during the observation of a congruent action when movement execution and observation were nearly synchronised. While the general facilitation may arise from a competition between the two participants as they reached for the object, the specific facilitation of the grasp component seems to be directly related to mirror neuron system activity induced by action observation itself. Moreover, the time course of the events appears to be an essential factor for this modulation, implying the transitory activation of the mirror neuron system.
Affiliation(s)
- Mathilde Ménoret
- Laboratoire sur le Langage, le Cerveau et la Cognition UMR 5304, CNRS/University of Lyon 1, 67, Boulevard Pinel, 69675 Bron Cedex, France.