1. Tian B, Zheng Y, Zhuang Z, Luo H, Zhang Y, Wang D. Group Haptic Collaboration: Evaluation of Teamwork Behavior during VR Four-Person Rowing Task. IEEE Trans Haptics 2024;17:384-395. PMID: 38145541. DOI: 10.1109/toh.2023.3346683.
Abstract
The assessment of multi-person group collaboration has garnered increasing attention in recent years. However, it remains uncertain whether haptic information can be effectively utilized to measure teamwork behavior. This study seeks to evaluate teamwork competency within four-person groups and differentiate the contributions of individual members through a haptic collaborative task. To achieve this, we propose a paradigm in which four crew members collaboratively manipulate a simulated boat to row along a target curve in a shared haptic-enabled virtual environment. We define eight features related to boat trajectory and synchronization among the four crew members' paddling movements, which serve as indicators of teamwork competency. These features are then integrated into a comprehensive feature, and its correlation with self-reported teamwork competency is analyzed. The results demonstrate a strong positive correlation (r > 0.8) between the comprehensive feature and teamwork competency. Additionally, we extract two kinesthetic features that represent the paddling movement preferences of each crew member, enabling us to distinguish their contributions within the group. These two features differed significantly between the crew members with the highest and the lowest contributions in each group. This work demonstrates the feasibility of using kinesthetic features to evaluate teamwork behavior during multi-person haptic collaboration tasks.
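As a rough illustration of this analysis pipeline (the equal feature weighting and the toy data below are assumptions, not the authors' actual definitions), a comprehensive score can be built by standardizing the eight per-group features and correlating the result with self-reported teamwork competency:

```python
# Hypothetical sketch: combine per-group features into one comprehensive
# score and correlate it with self-reported teamwork competency.
import numpy as np
from scipy.stats import pearsonr

def comprehensive_feature(feature_matrix):
    """feature_matrix: shape (n_groups, 8) -- trajectory and synchronization
    features per group. Z-score each feature, then average (assumed weighting)."""
    z = (feature_matrix - feature_matrix.mean(axis=0)) / feature_matrix.std(axis=0)
    return z.mean(axis=1)

# Toy data: 12 groups, 8 features, plus self-reported teamwork scores.
rng = np.random.default_rng(0)
features = rng.normal(size=(12, 8))
teamwork_scores = features.mean(axis=1) + 0.3 * rng.normal(size=12)

score = comprehensive_feature(features)
r, p = pearsonr(score, teamwork_scores)
print(f"r = {r:.2f}, p = {p:.3f}")  # the paper reports r > 0.8 on its real data
```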
2. Short MR, Ludvig D, Kucuktabak EB, Wen Y, Vianello L, Perreault EJ, Hargrove L, Lynch K, Pons JL. Haptic Human-Human Interaction During an Ankle Tracking Task: Effects of Virtual Connection Stiffness. IEEE Trans Neural Syst Rehabil Eng 2023;31:3864-3873. PMID: 37747854. DOI: 10.1109/tnsre.2023.3319291.
Abstract
While treating sensorimotor impairments, a therapist may provide physical assistance by guiding their patient's limb to teach a desired movement. In this scenario, a key aspect is the compliance of the interaction, as the therapist can provide subtle cues or impose a movement as demonstration. One approach to studying these interactions involves haptically connecting two individuals through robotic interfaces. Upper-limb studies have shown that pairs of connected individuals estimate one another's goals during tracking tasks by exchanging haptic information, resulting in improved performance that depends on the ability of one's partner and the stiffness of the virtual connection. In this study, our goal was to investigate whether these findings generalize to the lower limb during an ankle tracking task. Pairs of healthy participants (i.e., dyads) independently tracked target trajectories with and without connections rendered between two ankle robots. We tested the effects of connection stiffness as well as visual noise to manipulate the correlation of tracking errors between partners. In our analysis, we compared changes in task performance across conditions while tracking with and without the connection. We found that tracking improvements while connected increased with connection stiffness, favoring the worse partner in the dyad during hard connections. We modeled the interaction as three springs in series, considering the stiffness of the connection and each partner's ankle, to show that improvements were likely due to a cancellation of random tracking errors between partners. These results suggest a simplified mechanism of improvement compared to what has been reported during upper-limb dyadic tracking.
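The three-springs-in-series idea can be sketched as follows; the model below (spring constants, noise levels, and the static torque balance are illustrative assumptions, not the authors' exact formulation) shows how a virtual connection of increasing stiffness partially cancels independent tracking errors between partners:

```python
# Minimal sketch: two ankles modeled as springs pulling toward each partner's
# noisy estimate of the target angle, coupled by a virtual connection spring
# of stiffness kc. Solving the static torque balance shows how independent
# errors partially cancel as kc grows.
import numpy as np

def coupled_angles(goal1, goal2, k1, k2, kc):
    """Equilibrium ankle angles for the three-springs-in-series model."""
    A = np.array([[k1 + kc, -kc],
                  [-kc, k2 + kc]])
    b = np.array([k1 * goal1, k2 * goal2])
    return np.linalg.solve(A, b)

rng = np.random.default_rng(1)
target = 0.0
k1, k2 = 20.0, 20.0                               # ankle stiffnesses (Nm/rad), assumed
errors = rng.normal(0.0, 0.05, size=(1000, 2))    # independent tracking errors (rad)

for kc in [0.0, 5.0, 50.0, 500.0]:
    theta = np.array([coupled_angles(target + e1, target + e2, k1, k2, kc)
                      for e1, e2 in errors])
    print(f"kc={kc:6.1f}  rms error={np.sqrt((theta**2).mean()):.4f}")
# RMS error falls toward 1/sqrt(2) of the uncoupled value as kc increases,
# consistent with cancellation of random errors between partners.
```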
3. Kucuktabak EB, Wen Y, Short M, Demirbas E, Lynch K, Pons J. Virtual Physical Coupling of Two Lower-Limb Exoskeletons. IEEE Int Conf Rehabil Robot 2023;2023:1-6. PMID: 37941279. DOI: 10.1109/icorr58425.2023.10304601.
Abstract
Physical interaction between individuals plays an important role in human motor learning and performance during shared tasks. Using robotic devices, researchers have studied the effects of dyadic haptic interaction, focusing mostly on the upper limb. Developing infrastructure that enables physical interaction between multiple individuals' lower limbs can extend this previous work and facilitate the investigation of new dyadic lower-limb rehabilitation schemes. We designed a system to render haptic interactions between two users while they walk in multi-joint lower-limb exoskeletons. Specifically, we developed an infrastructure in which desired interaction torques are commanded to the individual lower-limb exoskeletons based on the users' kinematics and the properties of the virtual coupling. In this pilot study, we demonstrated the capacity of the platform to render different haptic properties (e.g., soft and hard), different haptic connection types (e.g., bidirectional and unidirectional), and connections expressed in joint space and in task space. With a haptic connection, dyads generated synchronized movements, and the difference between their joint angles decreased as the virtual stiffness increased. This is the first study in which multi-joint dyadic haptic interactions are created between lower-limb exoskeletons. This platform will be used to investigate the effects of haptic interaction on motor learning and task performance during walking, a complex and meaningful task for gait rehabilitation.
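A minimal sketch of such a joint-space virtual coupling, assuming a simple spring-damper law (the paper's actual controller, gains, and safety logic are not reproduced here), could look like this:

```python
# Illustrative spring-damper coupling between two exoskeletons in joint space.
import numpy as np

def coupling_torques(q_a, dq_a, q_b, dq_b, k, b, bidirectional=True):
    """Return (tau_a, tau_b): interaction torques commanded to each
    exoskeleton's joints, based on the users' kinematics and the virtual
    spring (k) and damper (b) of the coupling."""
    tau_a = k * (q_b - q_a) + b * (dq_b - dq_a)       # pulls user A toward user B
    tau_b = -tau_a if bidirectional else np.zeros_like(tau_a)
    return tau_a, tau_b

# Example: hip and knee joint states (rad, rad/s) of two users at one time step.
q_a, dq_a = np.array([0.30, 0.60]), np.array([0.1, -0.2])
q_b, dq_b = np.array([0.35, 0.50]), np.array([0.0, -0.1])
tau_a, tau_b = coupling_torques(q_a, dq_a, q_b, dq_b, k=30.0, b=1.0)
print(tau_a, tau_b)  # a stiffer k drives joint-angle differences toward zero
```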
4. Börner H, Carboni G, Cheng X, Takagi A, Hirche S, Endo S, Burdet E. Physically interacting humans regulate muscle coactivation to improve visuo-haptic perception. J Neurophysiol 2023;129:494-499. PMID: 36651649. PMCID: PMC9942891. DOI: 10.1152/jn.00420.2022.
Abstract
When moving a piano or dancing tango with a partner, how should I control my arm muscles to sense their movements and follow or guide them smoothly? Here we observe how physically connected pairs tracking a moving target with the arm modify muscle coactivation according to their own visual acuity and their partner's performance. They coactivate muscles to stiffen the arm when the partner's performance is worse and relax when their own visual feedback is blurred. Computational modeling shows that this adaptive sensing property cannot be explained by the minimization-of-movement-error hypothesis that has previously explained adaptation in dynamic environments. Instead, individuals skillfully control the stiffness to guide the arm toward the planned motion while minimizing effort and extracting useful information from the partner's movement. The central nervous system regulates muscle activation to guide motion with accurate task information from vision and haptics while minimizing the metabolic cost. As a consequence, the partner with the most accurate target information leads the movement.
NEW & NOTEWORTHY: Our results reveal that interacting humans inconspicuously modulate muscle activation to extract accurate information about the common target while considering their own and their partner's sensorimotor noise. A novel computational model was developed to decipher the underlying mechanism: muscle coactivation is adapted to combine haptic information from the interaction with the partner and one's own visual information in a stochastically optimal manner. This improves the prediction of the target position with minimal metabolic cost in each partner, so that the partner with the most accurate visual information leads the movement.
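The stochastically optimal combination described here can be illustrated with standard inverse-variance cue weighting; the function and numbers below are hypothetical and stand in for the paper's full model:

```python
# Hedged sketch of maximum-likelihood cue combination: each partner fuses
# their own visual estimate of the target with a haptic estimate inferred
# from the interaction, weighting each cue by its reliability (1/variance).
def fuse(visual_est, visual_var, haptic_est, haptic_var):
    """Stochastically optimal combination of two noisy estimates."""
    w_v = (1.0 / visual_var) / (1.0 / visual_var + 1.0 / haptic_var)
    fused = w_v * visual_est + (1.0 - w_v) * haptic_est
    fused_var = 1.0 / (1.0 / visual_var + 1.0 / haptic_var)
    return fused, fused_var

# A partner with blurry vision (high visual variance) relies more on the
# haptic cue, so the partner with sharper vision ends up leading the movement.
print(fuse(visual_est=0.10, visual_var=0.04, haptic_est=0.02, haptic_var=0.01))
```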
Affiliation(s)
- Hendrik Börner: Electrical and Computer Engineering Department, Technical University of Munich, Munich, Germany
- Gerolamo Carboni: Department of Bioengineering, Imperial College of Science, Technology and Medicine, London, United Kingdom
- Xiaoxiao Cheng: Department of Bioengineering, Imperial College of Science, Technology and Medicine, London, United Kingdom
- Atsushi Takagi: NTT Communication Science Laboratories, Atsugi, Kanagawa, Japan
- Sandra Hirche: Electrical and Computer Engineering Department, Technical University of Munich, Munich, Germany
- Satoshi Endo: Electrical and Computer Engineering Department, Technical University of Munich, Munich, Germany
- Etienne Burdet: Department of Bioengineering, Imperial College of Science, Technology and Medicine, London, United Kingdom
5. Čamernik J, Leskovar RK, Petrič T. Leader–Follower Dynamics in Complex Obstacle Avoidance Task. Int J Soc Robot 2022. DOI: 10.1007/s12369-022-00945-3.
6. Interaction with a reactive partner improves learning in contrast to passive guidance. Sci Rep 2022;12:15821. PMID: 36138031. PMCID: PMC9499977. DOI: 10.1038/s41598-022-18617-7.
Abstract
Many tasks, such as physical rehabilitation, vehicle co-piloting or surgical training, rely on physical assistance from a partner. While this assistance may be provided by a robotic interface, how to implement the necessary haptic support to help improve performance without impeding learning is unclear. In this paper, we study the influence of haptic interaction on the performance and learning of a shared tracking task. In this task, we compare interaction with a human partner, the trajectory guidance traditionally used in training robots, and a robot partner that yields human-like interaction. While trajectory guidance resulted in the best performance during training, it dramatically reduced error variability and hindered learning. In contrast, the reactive human and robot partners did not impede adaptation and allowed subjects to learn without modifying their movement patterns. Moreover, interaction with a human partner was the only condition that demonstrated an improvement in retention and transfer learning compared to training alone. These results reveal distinctly different learning behaviour in training with a human compared to trajectory guidance, and similar learning with the robotic and human partners. Therefore, for movement assistance and learning, algorithms that react to the user's motion and change their behaviour accordingly are better suited.
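The contrast between the two kinds of assistance can be caricatured with two assumed control laws (the gains and variables below are illustrative only, not those used in the study):

```python
# Schematic contrast between trajectory guidance and a reactive partner
# rendered through a robot, for a 1-D tracking task.
def trajectory_guidance(x_user, x_ref, k_guide=500.0):
    """Stiff attraction toward the reference trajectory: errors are suppressed,
    which helps performance during training but can hinder learning."""
    return k_guide * (x_ref - x_user)

def reactive_partner(x_user, x_partner, k_couple=50.0):
    """Compliant coupling to a partner who is tracking the same target:
    the user still experiences and corrects their own errors."""
    return k_couple * (x_partner - x_user)

x_ref, x_user, x_partner = 0.00, 0.03, 0.01
print(trajectory_guidance(x_user, x_ref), reactive_partner(x_user, x_partner))
```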
7. Küçüktabak EB, Kim SJ, Wen Y, Lynch K, Pons JL. Human-machine-human interaction in motor control and rehabilitation: a review. J Neuroeng Rehabil 2021;18:183. PMID: 34961530. PMCID: PMC8714449. DOI: 10.1186/s12984-021-00974-5.
Abstract
BACKGROUND: Human-human (HH) interaction mediated by machines (e.g., robots or passive sensorized devices), which we call human-machine-human (HMH) interaction, has been studied with increasing interest in the last decade. The use of machines allows the implementation of different forms of audiovisual and/or physical interaction in dyadic tasks. HMH interaction between two partners can improve the dyad's ability to accomplish a joint motor task (task performance) beyond either partner's ability to perform the task solo. It can also be used to more efficiently train an individual to improve their solo task performance (individual motor learning). We review recent research on the impact of HMH interaction on task performance and individual motor learning in the context of motor control and rehabilitation, and we propose future research directions in this area.
METHODS: A systematic search was performed on the Scopus, IEEE Xplore, and PubMed databases. The search query was designed to find studies that involve HMH interaction in motor control and rehabilitation settings. Studies that do not investigate the effect of changing the interaction conditions were filtered out. Thirty-one studies met our inclusion criteria and were used in the qualitative synthesis.
RESULTS: Studies are analyzed based on their results related to the effects of interaction type (e.g., audiovisual communication and/or physical interaction), interaction mode (collaborative, cooperative, co-active, and competitive), and partner characteristics. Visuo-physical interaction generally results in better dyadic task performance than visual interaction alone. In cases where the physical interaction between humans is described by a spring, there are conflicting results as to the effect of the stiffness of the spring. In terms of partner characteristics, having a more skilled partner improves dyadic task performance more than having a less skilled partner. However, conflicting results were observed in terms of individual motor learning.
CONCLUSIONS: Although it is difficult to draw clear conclusions as to which interaction type, mode, or partner characteristic may lead to optimal task performance or individual motor learning, these results show the possibility for improved outcomes through HMH interaction. Future work that focuses on selecting the optimal personalized interaction conditions and exploring their impact on rehabilitation settings may facilitate the transition of HMH training protocols to clinical implementations.
Affiliation(s)
- Emek Barış Küçüktabak: Department of Mechanical Engineering, McCormick School of Engineering, Northwestern University, Evanston, IL 60208, USA; Legs + Walking Lab, Shirley Ryan AbilityLab, Chicago, IL 60611, USA
- Sangjoon J. Kim: Legs + Walking Lab, Shirley Ryan AbilityLab, Chicago, IL 60611, USA; Department of Physical Medicine and Rehabilitation, Feinberg School of Medicine, Northwestern University, Chicago, IL 60611, USA
- Yue Wen: Legs + Walking Lab, Shirley Ryan AbilityLab, Chicago, IL 60611, USA; Department of Physical Medicine and Rehabilitation, Feinberg School of Medicine, Northwestern University, Chicago, IL 60611, USA
- Kevin Lynch: Department of Mechanical Engineering, McCormick School of Engineering, Northwestern University, Evanston, IL 60208, USA
- Jose L. Pons: Department of Mechanical Engineering and Department of Biomedical Engineering, McCormick School of Engineering, Northwestern University, Evanston, IL 60208, USA; Legs + Walking Lab, Shirley Ryan AbilityLab, Chicago, IL 60611, USA; Department of Physical Medicine and Rehabilitation, Feinberg School of Medicine, Northwestern University, Chicago, IL 60611, USA
8. Kianzad S, Chen G, MacLean KE. PAL: A Framework for Physically Assisted Learning Through Design and Exploration With a Haptic Robot Buddy. Front Robot AI 2021;8:700465. PMID: 34631802. PMCID: PMC8497750. DOI: 10.3389/frobt.2021.700465.
Abstract
Robots offer an opportunity for interactive and engaging learning activities. In this paper we consider the premise that haptic force feedback delivered through a held robot can enrich learning of science-related concepts by building physical intuition as learners design experiments and physically explore them to solve problems they have posed. Further, we conjecture that combining this rich feedback with pen-and-paper interactions, e.g., sketching experiments they want to try, could lead to fluid interactions and benefit focus. However, a number of technical barriers interfere with testing this approach and making it accessible to learners and their teachers. In this paper, we propose a framework for Physically Assisted Learning (PAL) based on the stages of experiential learning, which can guide designers in developing and evaluating effective technology and which directs focus on how haptic feedback could assist with the design and explore learning stages. To this end, we demonstrate a possible technical pathway that supports the full experience of designing an experiment by drawing a physical system on paper and then interacting with it physically, after the system recognizes the sketch, interprets it as a model, and renders it haptically. Our proposed framework is rooted in theoretical needs and current advances in experiential learning, pen-and-paper interaction, and haptic technology. We further explain how to instantiate the PAL framework using available technologies and discuss a path forward to a larger vision of physically assisted learning.
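A hedged sketch of the final stage of this pathway, assuming the sketch-recognition step has already produced a simple spring model (all class names and values below are hypothetical, not part of the PAL implementation):

```python
# Once a drawn element has been recognized and interpreted as a model
# (here, a spring with an anchor point and stiffness), the held robot can
# render it as force feedback in a simple haptic loop.
import numpy as np

class SpringModel:
    """Stand-in for the model produced by the (not shown) sketch-recognition step."""
    def __init__(self, anchor, stiffness):
        self.anchor = np.asarray(anchor, dtype=float)
        self.stiffness = float(stiffness)

    def force(self, tool_pos):
        return self.stiffness * (self.anchor - np.asarray(tool_pos, dtype=float))

model = SpringModel(anchor=[0.0, 0.0], stiffness=200.0)        # N/m, assumed
for tool_pos in [[0.01, 0.0], [0.02, 0.01], [0.03, 0.02]]:      # stand-in for
    f = model.force(tool_pos)                                   # device readings
    print(f"tool at {tool_pos} -> render force {f} N")
```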
Affiliation(s)
- Soheil Kianzad: SPIN Lab, Department of Computer Science, University of British Columbia, Vancouver, BC, Canada
- Guanxiong Chen: Department of Electrical and Computer Engineering, University of British Columbia, Vancouver, BC, Canada
- Karon E. MacLean: SPIN Lab, Department of Computer Science, University of British Columbia, Vancouver, BC, Canada