1
Wang X, Li Z, Wang S, Yang Y, Peng Y, Fu C. Enhancing emotional expression in cat-like robots: strategies for utilizing tail movements with human-like gazes. Front Robot AI 2024; 11:1399012. PMID: 39076841; PMCID: PMC11284330; DOI: 10.3389/frobt.2024.1399012.
Abstract
In recent years, research on emotion expression in human-robot interaction has grown significantly. During human-robot interaction, the quality of the robot's emotional expression determines the user's experience and acceptance. Gaze is widely accepted as an important medium for expressing emotions in human-human interaction. However, users have been found to have difficulty recognizing emotions such as happiness and anger when animaloid robots express them through eye contact alone. Moreover, in real interaction, effective nonverbal expression includes not only eye contact but also bodily expression. Current animaloid social robots nevertheless treat human-like eyes as the main pathway for emotion expression, which results in a mismatch between the robot's appearance and its behavior and degrades the quality of emotional expression. While retaining the eyes' effectiveness for emotional communication, we added a mechanical tail as a bodily channel that works in concert with the eyes to enhance the robot's emotional expression. The results show that collaboration between the mechanical tail and the bionic eyes enhances expression of all four emotions. Furthermore, we found that the mechanical tail can enhance the expression of specific emotions with different parameters. This study helps enhance robots' emotional expressiveness in human-robot interaction and improve the user's interaction experience.
Affiliation(s)
- Xinxiang Wang
- HACI, Sydney Smart Technology College, Northeastern University, Shenyang, China
- Zihan Li
- HACI, Sydney Smart Technology College, Northeastern University, Shenyang, China
- Songyang Wang
- HACI, Sydney Smart Technology College, Northeastern University, Shenyang, China
- Yiming Yang
- HACI, Sydney Smart Technology College, Northeastern University, Shenyang, China
- Yibo Peng
- HACI, Sydney Smart Technology College, Northeastern University, Shenyang, China
- Changzeng Fu
- HACI, Sydney Smart Technology College, Northeastern University, Shenyang, China
- Hebei Key Laboratory of Marine Perception Network and Data Processing, Northeastern University at Qinhuangdao, Qinhuangdao, China
- IRL, Graduate School of Engineering Science, Osaka University, Suita, Japan
2
Long-Term Exercise Assistance: Group and One-on-One Interactions between a Social Robot and Seniors. Robotics 2023. DOI: 10.3390/robotics12010009.
Abstract
For older adults, regular exercise can provide both physical and mental benefits, increase independence, and reduce the risk of diseases associated with aging. However, only a small portion of older adults exercise regularly. It is therefore important to promote exercise among older adults to help maintain overall health. In this paper, we present the first exploratory long-term human–robot interaction (HRI) study conducted at a local long-term care facility to investigate the benefits of one-on-one and group exercise interactions between an autonomous socially assistive robot and older adults. To provide targeted facilitation, our robot uses a unique emotion model that adapts its assistive behaviors to users' affect and tracks their progress towards exercise goals across repeated sessions using the Goal Attainment Scale (GAS), while also monitoring heart rate to prevent overexertion. Results show that users had positive valence and high engagement towards the robot and maintained their exercise performance throughout the study. Questionnaire results showed high robot acceptance for both types of interaction. However, users in the one-on-one sessions perceived the robot as more sociable and intelligent and had a more positive perception of its appearance and movements.
3
Qiao H, Chen J, Huang X. A Survey of Brain-Inspired Intelligent Robots: Integration of Vision, Decision, Motion Control, and Musculoskeletal Systems. IEEE Transactions on Cybernetics 2022; 52:11267-11280. PMID: 33909584; DOI: 10.1109/tcyb.2021.3071312.
Abstract
Current robotic studies focus on the performance of specific tasks. However, such task-specific skills do not generalize, and some special tasks, such as compliant and precise manipulation, fast and flexible response, and deep collaboration between humans and robots, cannot yet be realized. Brain-inspired intelligent robots imitate humans and animals, from inner mechanisms to external structures, through an integration of visual cognition, decision making, motion control, and musculoskeletal systems. Such robots are more likely to realize the functions that current robots cannot and to become companions to humans. Focusing on the development of brain-inspired intelligent robots, this article reviews cutting-edge research in brain-inspired visual cognition, decision making, musculoskeletal robots, motion control, and their integration. It aims to provide greater insight into brain-inspired intelligent robots and to attract more attention to this field from the global research community.
4
Penčić M, Čavić M, Oros D, Vrgović P, Babković K, Orošnjak M, Čavić D. Anthropomorphic Robotic Eyes: Structural Design and Non-Verbal Communication Effectiveness. Sensors (Basel) 2022; 22:3060. PMID: 35459046; PMCID: PMC9024502; DOI: 10.3390/s22083060.
Abstract
This paper presents the structure of a mechanical system with 9 DOFs for driving robot eyes, as well as the system's ability to produce facial expressions. It consists of three subsystems that move the eyeballs, eyelids, and eyebrows independently of the rest of the face. Owing to its structure, the eyeball mechanism can reproduce all of the motions human eyes are capable of, an important condition for realizing binocular function in artificial robot eyes, as well as stereovision. From a kinematic standpoint, the mechanisms of the eyeballs, eyelids, and eyebrows are highly capable of generating the movements of the human eye. A control system structure is proposed for realizing the desired motion of the mechanisms' output links. The mechanical system is also rated on how well it enables the robot to generate non-verbal emotional content, for which an experiment was conducted using the face of the human-like robot MARKO, covered with a face mask to help participants focus on the eye region. The participants evaluated the efficiency of the robot's non-verbal communication, with certain emotions achieving a high recognition rate.
Affiliation(s)
- Marko Penčić
- Faculty of Technical Sciences, University of Novi Sad, Trg Dositeja Obradovića 6, 21000 Novi Sad, Serbia
5
Saunderson S, Nejat G. Investigating Strategies for Robot Persuasion in Social Human-Robot Interaction. IEEE Transactions on Cybernetics 2022; 52:641-653. PMID: 32452790; DOI: 10.1109/tcyb.2020.2987463.
Abstract
Persuasion is a fundamental aspect of how people interact with each other. As robots become integrated into our daily lives and take on increasingly social roles, their ability to persuade will be critical to their success during human-robot interaction (HRI). In this article, we present a novel HRI study that investigates how a robot's persuasive behavior influences people's decision making. The study consisted of two small social robots trying to influence a person's answer during a jelly bean guessing game. One robot used either an emotional or a logical persuasive strategy during the game, while the other robot displayed a neutral control behavior. The results showed that the Emotion strategy had significantly higher persuasive influence than both the Logic and Control conditions. With respect to participant demographics, no significant differences in influence were observed between age or gender groups; however, significant differences emerged when considering participant occupation or field of study (FOS). Namely, participants in business, engineering, and the physical sciences were more influenced by the robots and aligned their answers more closely with the robot's suggestion than did those in the life sciences and humanities. The discussion provides insight into the potential use of robot persuasion in social HRI task scenarios, in particular the influence a robot displaying emotional behaviors has when persuading people.
6
Hong A, Lunscher N, Hu T, Tsuboi Y, Zhang X, Franco Dos Reis Alves S, Nejat G, Benhabib B. A Multimodal Emotional Human-Robot Interaction Architecture for Social Robots Engaged in Bidirectional Communication. IEEE Transactions on Cybernetics 2021; 51:5954-5968. PMID: 32149676; DOI: 10.1109/tcyb.2020.2974688.
Abstract
For social robots to effectively engage in human-robot interaction (HRI), they need to be able to interpret human affective cues and respond appropriately by displaying their own emotional behavior. In this article, we present a novel multimodal emotional HRI architecture that promotes natural, engaging bidirectional emotional communication between a social robot and a human user. User affect is detected using a unique combination of body language and vocal intonation, and multimodal classification is performed using a Bayesian network. The Emotionally Expressive Robot uses the user's affect to determine its own emotional behavior via an innovative two-layer emotional model consisting of a deliberative (hidden Markov model) layer and a reactive (rule-based) layer. The proposed architecture has been implemented on a small humanoid robot that performs diet and fitness counseling during HRI. To evaluate the Emotionally Expressive Robot's effectiveness, a Neutral Robot, which can detect user affect but lacks emotional display, was also developed. A between-subjects HRI experiment was conducted with both types of robot. Extensive results show that both robots can effectively detect user affect in real time during HRI. However, the Emotionally Expressive Robot can appropriately determine its own emotional response to the situation at hand and therefore induces more positive valence and less negative arousal in users than the Neutral Robot.
7
Chang Y, Sun L. EEG-Based Emotion Recognition for Modulating Social-Aware Robot Navigation. Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC) 2021; 2021:5709-5712. PMID: 34892417; DOI: 10.1109/embc46164.2021.9630721.
Abstract
Companion robots play an important role in accompanying humans and providing emotional support, for example by reducing social isolation and loneliness. By recognizing its human partner's mental state, a companion robot can dynamically adjust its behaviors and make human-robot interaction smoother and more natural. Human emotion has been recognized through many modalities, such as facial expression and voice. Neurophysiological signals have shown promising results in emotion recognition, since they are innate signals of the human brain that cannot be faked. In this paper, emotional state recognition using a neurophysiological method is studied to guide and modulate companion-robot navigation and enhance its social capabilities. Electroencephalography (EEG), a type of neurophysiological signal, is used to recognize the human emotional state, which is then fed into a navigation path-planning algorithm to control a companion robot's routes. Simulation results show that the mobile robot exhibits navigation behaviors modulated by dynamic human emotional states.
8
Shourmasti ES, Colomo-Palacios R, Holone H, Demi S. User Experience in Social Robots. Sensors (Basel) 2021; 21:5052. PMID: 34372289; PMCID: PMC8348916; DOI: 10.3390/s21155052.
Abstract
Social robots are increasingly penetrating our daily lives. They are used in various domains, such as healthcare, education, business, industry, and culture. However, introducing this technology into conventional environments is not trivial. For users to accept social robots, a positive user experience is vital, and it should be treated as a critical part of the robots' development process. This may ultimately lead to more extensive use of social robots and strengthen their diffusion in society. The goal of this study is to summarize the extant literature on user experience in social robots and to identify the challenges and benefits of UX evaluation in social robots. To achieve this goal, the authors carried out a systematic literature review following PRISMA guidelines. Our findings revealed that the most common methods of evaluating UX in social robots are questionnaires and interviews. UX evaluations were found to be beneficial in providing early feedback and consequently in handling errors at an early stage. However, despite the importance of UX in social robots, robot developers often neglect to set UX goals due to lack of knowledge or lack of time. This study emphasizes the need for robot developers to acquire the theoretical and practical knowledge required to perform a successful UX evaluation.
Affiliation(s)
- Ricardo Colomo-Palacios
- Department of Computer Science, Østfold University College, 1783 Halden, Norway
9
Hewitt R. Assistive Care Robots and Older Adults: Employing a Care Ethics Lens. Canadian Journal of Bioethics 2021. DOI: 10.7202/1077637ar.
Abstract
To date, ethical critiques of the use of assistive healthcare robotics have not closely examined the purported care relationship between such robots and their users. Drawing upon the work of care ethics scholars, I argue that authentic care relies upon capacities inherently reciprocal and responsive in nature, which ultimately precludes socially assistive robots from being useful caring tools.
Affiliation(s)
- Rachel Hewitt
- Division of Community Health and Humanities, Faculty of Medicine, Memorial University of Newfoundland, St. John’s, Canada
10
Abel: Integrating Humanoid Body, Emotions, and Time Perception to Investigate Social Interaction and Human Cognition. Applied Sciences (Basel) 2021. DOI: 10.3390/app11031070.
Abstract
Humanoids have been created to assist or replace humans in many applications, with encouraging results in contexts where social and emotional interaction is required, such as healthcare, education, and therapy. Bioinspiration, which has often guided the design of their bodies and minds, has also made them excellent research tools, probably the best platform by which we can model, test, and understand the human mind and behavior. Driven by the aim of creating a believable robot for interactive applications, as well as a research platform for investigating human cognition and emotion, we are constructing a new humanoid social robot: Abel. In this paper, we discuss three of the fundamental principles that motivated the design of Abel and its cognitive and emotional system: hyper-realistic humanoid aesthetics, human-inspired emotion processing, and human-like perception of time. After briefly reviewing the state of the art on these topics, we present the robot at its current stage of development, the perspectives for its application, and how it could satisfy expectations as a tool to investigate the human mind, behavior, and consciousness.
11
Alimardani M, Hiraki K. Passive Brain-Computer Interfaces for Enhanced Human-Robot Interaction. Front Robot AI 2020; 7:125. PMID: 33501291; PMCID: PMC7805996; DOI: 10.3389/frobt.2020.00125.
Abstract
Brain-computer interfaces (BCIs) have long been seen as control interfaces that translate changes in brain activity produced either by volitional modulation or in response to external stimulation. However, recent trends in BCI and neurofeedback research highlight passive monitoring of a user's brain activity to estimate cognitive load, attention level, perceived errors, and emotions. Extraction of such higher-order information from brain signals is seen as a gateway to facilitating interaction between humans and intelligent systems. In the field of robotics in particular, passive BCIs provide a promising channel for predicting a user's cognitive and affective state and thus for developing user-adaptive interaction. In this paper, we first illustrate the state of the art in passive BCI technology and then provide examples of BCI use in human-robot interaction (HRI). We finally discuss the prospects and challenges of integrating passive BCIs into socially demanding HRI settings. This work intends to inform the HRI community of the opportunities offered by passive BCI systems for enhancing human-robot interaction while recognizing potential pitfalls.
Affiliation(s)
- Maryam Alimardani
- Department of Cognitive Science and Artificial Intelligence, School of Humanities and Digital Sciences, Tilburg University, Tilburg, Netherlands
- Kazuo Hiraki
- Department of General Systems Studies, Graduate School of Arts and Sciences, The University of Tokyo, Tokyo, Japan
12
Abstract
To effectively communicate with people, social robots must be capable of detecting, interpreting, and responding to human affect during human–robot interactions (HRIs). To accurately detect user affect during HRIs, affect elicitation techniques need to be developed to create and train appropriate affect detection models. In this paper, we present such a novel affect elicitation and detection method for social robots in HRIs. Non-verbal emotional behaviors of the social robot were designed to elicit user affect, which was measured directly through electroencephalography (EEG) signals. HRI experiments with both younger and older adults were conducted to evaluate our affect elicitation technique and to compare the two types of affect detection models we developed and trained using multilayer perceptron neural networks (NNs) and support vector machines (SVMs). The results showed that, on average, the self-reported valence and arousal were consistent with the intended elicited affect. Furthermore, the EEG data obtained could be used to train affect detection models, with the NN models achieving higher classification rates.
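The detection pipeline this abstract describes (EEG-derived features in, discrete affect class out) can be sketched minimally. The code below is a hypothetical illustration only, not the paper's trained NN or SVM models: a toy nearest-centroid classifier over an assumed two-dimensional valence/arousal feature space, with made-up class centroids.

```python
# Illustrative sketch: classify a feature vector by its nearest affect-class
# centroid. Real models (MLP NNs, SVMs) would be trained on labeled EEG data.

def classify_affect(features, centroids):
    """Return the label of the centroid closest to the feature vector."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist2(features, centroids[label]))

# Hypothetical centroids in a 2-D (valence, arousal) feature space.
CENTROIDS = {
    "positive-calm":    ( 0.7, -0.5),
    "positive-excited": ( 0.7,  0.6),
    "negative-calm":    (-0.6, -0.5),
    "negative-excited": (-0.6,  0.6),
}

print(classify_affect((0.8, 0.5), CENTROIDS))  # high valence, high arousal
```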
13
Pan L, Yin Z, She S, Song A. Emotional State Recognition from Peripheral Physiological Signals Using Fused Nonlinear Features and Team-Collaboration Identification Strategy. Entropy 2020; 22:511. PMID: 33286283; PMCID: PMC7517002; DOI: 10.3390/e22050511.
Abstract
Emotion recognition, which captures a person's inner state, has important application prospects in human-computer interaction. To improve recognition accuracy, a novel method combining fused nonlinear features with a team-collaboration identification strategy is proposed for emotion recognition from physiological signals. Four nonlinear features, namely approximate entropy (ApEn), sample entropy (SaEn), fuzzy entropy (FuEn), and wavelet packet entropy (WpEn), are extracted from each type of physiological signal to capture emotional states in depth. The features of the different physiological signals are then fused to represent emotional states from multiple perspectives. Because each classifier has its own strengths and limitations, a team-collaboration model is built, with a decision-making mechanism based on the fusion of a support vector machine (SVM), a decision tree (DT), and an extreme learning machine (ELM); SVM serves as the main classifier, with DT and ELM as auxiliaries. Samples that SVM can identify confidently are classified by SVM directly, whereas the remaining samples are classified collaboratively by SVM, DT, and ELM, which exploits the strengths of each classifier and improves classification accuracy. The effectiveness and generality of the proposed method are verified on the Augsburg database and the database for emotion analysis using physiological signals (DEAP). The experimental results consistently indicate that the proposed method outperforms existing ones.
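The team-collaboration decision mechanism described above can be sketched as follows. This is a hedged illustration of the general idea, not the paper's implementation: the main classifier decides alone when its confidence clears a threshold, and otherwise the auxiliary classifiers join a majority vote. The classifiers and the confidence threshold here are hypothetical stubs standing in for trained SVM, DT, and ELM models.

```python
from collections import Counter

def team_decision(sample, main, auxiliaries, threshold=0.8):
    """Main classifier decides easy samples; hard ones go to a majority vote."""
    label, confidence = main(sample)
    if confidence >= threshold:
        return label                                 # easy: main decides alone
    votes = [label] + [aux(sample) for aux in auxiliaries]
    return Counter(votes).most_common(1)[0][0]       # hard: collaborate

# Hypothetical stub classifiers over a 1-D feature, for illustration only.
svm = lambda s: ("happy", 0.9) if s > 0.5 else ("sad", 0.6)   # (label, conf)
dt  = lambda s: "happy" if s > 0.3 else "sad"
elm = lambda s: "happy" if s > 0.4 else "sad"

print(team_decision(0.7, svm, [dt, elm]))   # confident SVM decides alone
print(team_decision(0.35, svm, [dt, elm]))  # low confidence triggers a vote
```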
Affiliation(s)
- Lizheng Pan
- School of Mechanical Engineering, Changzhou University, Changzhou 213164, China
- Remote Measurement and Control Key Lab of Jiangsu Province, School of Instrument Science and Engineering, Southeast University, Nanjing 210096, China
- Zeming Yin
- School of Mechanical Engineering, Changzhou University, Changzhou 213164, China
- Shigang She
- School of Mechanical Engineering, Changzhou University, Changzhou 213164, China
- Aiguo Song
- Remote Measurement and Control Key Lab of Jiangsu Province, School of Instrument Science and Engineering, Southeast University, Nanjing 210096, China
15
Abstract
Cooperation between humans and robots is becoming increasingly important in our society. Consequently, there is growing interest in developing models that can enhance and enrich the interaction between humans and robots. A key challenge in the Human-Robot Interaction (HRI) field is to provide robots with cognitive and affective capabilities by developing architectures that let them establish empathetic relationships with users. Over the last several years, multiple models have been proposed to address this open challenge. This work provides a survey of the most relevant of them. In detail, it offers an overview of the architectures in the literature, focusing on three specific aspects of HRI: the development of adaptive behavioral models, the design of cognitive architectures, and the ability to establish empathy with the user. The search was conducted in two databases, Scopus and Web of Science, and strict exclusion criteria were applied to screen the 4916 articles found; in the end, 56 articles were selected. Each work is evaluated, and its pros and cons are detailed by analyzing the aspects that could be improved to establish an enjoyable interaction between robots and users.
16
Lin CH, Wang SH, Lin CJ. Interval Type-2 Neural Fuzzy Controller-Based Navigation of Cooperative Load-Carrying Mobile Robots in Unknown Environments. Sensors (Basel) 2018; 18:4181. PMID: 30487466; PMCID: PMC6308668; DOI: 10.3390/s18124181.
Abstract
In this paper, a navigation method is proposed for cooperative load-carrying mobile robots. A behavior mode manager is used in the navigation control method to switch between two behavior modes, wall-following mode (WFM) and goal-oriented mode (GOM), according to environmental conditions. Additionally, an interval type-2 neural fuzzy controller based on dynamic group artificial bee colony (DGABC) optimization is proposed. Reinforcement learning was used to learn the WFM adaptively: first, a single robot is trained to learn the WFM; then, the control method is extended to cooperative load-carrying mobile robots. In WFM learning, the proposed DGABC performs better than the original artificial bee colony algorithm and other improved algorithms. Furthermore, the results of cooperative load-carrying navigation tests demonstrate that the proposed method enables the robots to carry the task item to the goal and complete the navigation mission efficiently.
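The mode-switching idea in this abstract can be sketched with a few lines of code. The sensor conditions and threshold below are assumptions made for illustration, not the paper's actual switching criteria or its neural fuzzy controller.

```python
# Illustrative sketch: a behavior mode manager that picks wall-following mode
# (WFM) or goal-oriented mode (GOM) from simple, hypothetical sensor readings.

WFM, GOM = "wall-following", "goal-oriented"

def select_mode(front_distance, goal_clear, safe_distance=0.5):
    """Follow the wall when an obstacle is within the safe distance and the
    path to the goal is blocked; otherwise head straight for the goal."""
    if front_distance < safe_distance and not goal_clear:
        return WFM
    return GOM

print(select_mode(0.3, False))  # path blocked: follow the wall
print(select_mode(2.0, True))   # path clear: head to the goal
```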
Affiliation(s)
- Chun-Hui Lin
- Department of Computer Science & Information Engineering, National Cheng Kung University, Tainan 701, Taiwan
- Shyh-Hau Wang
- Department of Computer Science & Information Engineering, National Cheng Kung University, Tainan 701, Taiwan
- Cheng-Jian Lin
- Department of Computer Science & Information Engineering, National Chin-Yi University of Technology, Taichung 411, Taiwan
17
Chen L, Wu M, Zhou M, She J, Dong F, Hirota K. Information-Driven Multirobot Behavior Adaptation to Emotional Intention in Human–Robot Interaction. IEEE Transactions on Cognitive and Developmental Systems 2018. DOI: 10.1109/tcds.2017.2728003.
18
Rossi S, Ferland F, Tapus A. User profiling and behavioral adaptation for HRI: A survey. Pattern Recognition Letters 2017. DOI: 10.1016/j.patrec.2017.06.002.
19