1. Cui S, Wang S, Wang R, Zhang S, Zhang C. Learning-Based Slip Detection for Dexterous Manipulation Using GelStereo Sensing. IEEE Transactions on Neural Networks and Learning Systems 2024; 35:13691-13700. PMID: 37224363. DOI: 10.1109/tnnls.2023.3270579.
Abstract
Endowing robots with tactile perception can markedly improve manipulation dexterity and brings many of the benefits of human-like touch. In this study, we present a learning-based slip detection system built on GelStereo (GS) tactile sensing, which provides high-resolution contact geometry information, including the 2-D displacement field and the 3-D point cloud of the contact surface. The trained network achieves 95.79% accuracy on an unseen test dataset, surpassing current model-based and learning-based methods that use visuotactile sensing. We also propose a general slip-feedback adaptive control framework for dexterous robot manipulation tasks. Experimental results on real-world grasping and screwing manipulation tasks, deployed on various robot setups, demonstrate the effectiveness and efficiency of the proposed control framework with GS tactile feedback.
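The abstract gives no implementation details; below is a minimal, hypothetical sketch of what a learning-based slip classifier over a GelStereo-style 2-D displacement field could look like. The network architecture, the 2 x 32 x 32 marker-displacement input shape, and the binary slip/no-slip labels are illustrative assumptions, not the authors' model.

```python
# Hypothetical sketch: binary slip/no-slip classifier on a 2-channel
# (dx, dy) marker displacement field, e.g. one 2 x 32 x 32 tensor per frame.
# Architecture and shapes are illustrative assumptions, not the paper's model.
import torch
import torch.nn as nn

class SlipClassifier(nn.Module):
    def __init__(self, grid: int = 32):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(2, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * (grid // 4) ** 2, 64), nn.ReLU(),
            nn.Linear(64, 2),  # logits: [no-slip, slip]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x))

# Toy training step with random tensors standing in for labelled tactile frames.
model = SlipClassifier()
optim = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
disp_field = torch.randn(8, 2, 32, 32)   # batch of displacement fields
labels = torch.randint(0, 2, (8,))       # 0 = stable, 1 = slip
optim.zero_grad()
loss = loss_fn(model(disp_field), labels)
loss.backward()
optim.step()
```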
2. Fang Y, Zhang X, Xu W, Liu G, Zhao J. Bidirectional visual-tactile cross-modal generation using latent feature space flow model. Neural Netw 2024; 172:106088. PMID: 38159510. DOI: 10.1016/j.neunet.2023.12.042.
Abstract
Inspired by the bidirectional visual-tactile cross-modal mapping performed by the human brain, this paper introduces a novel approach to bidirectional mapping between visual and tactile data, an area that existing, predominantly unidirectional studies have not fully explored. First, we adopt separate Variational AutoEncoder (VAE) models for the visual and tactile data. We then introduce a conditional flow model built on the VAE latent feature space, enabling cross-modal bidirectional mapping between visual and tactile data with a single model. Experimental results show that our method performs well in terms of the similarity between generated and original data (Structural Similarity Index (SSIM): 0.58 for visual data, 0.80 for tactile data), classification accuracy on generated data (91.60% visual, 88.05% tactile), and zero-shot classification accuracy between generated data and language (44.49% visual, 45.03% tactile). To the best of our knowledge, this is the first method to achieve bidirectional mapping between visual and tactile data with a single model. Our model and code will be made public after acceptance of the paper.
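As a rough illustration of the latent-space flow idea, the sketch below uses a single invertible affine coupling block to map between visual and tactile VAE latent vectors: the forward pass stands in for the visual-to-tactile direction and the inverse pass for tactile-to-visual. The latent dimensionality, network widths, single-coupling design, and the omission of the paper's conditioning mechanism are all simplifying assumptions for illustration, not the authors' architecture.

```python
# Hypothetical sketch: bidirectional mapping between visual and tactile
# VAE latent vectors via one invertible affine coupling block.
# Latent size, widths, and the single-coupling design are illustrative
# assumptions, not the paper's architecture.
import torch
import torch.nn as nn

LATENT = 16  # assumed latent dimensionality shared by both modalities

class AffineCoupling(nn.Module):
    """Invertible map z_visual <-> z_tactile on a split latent vector."""
    def __init__(self, dim: int = LATENT):
        super().__init__()
        self.half = dim // 2
        self.net = nn.Sequential(
            nn.Linear(self.half, 64), nn.ReLU(),
            nn.Linear(64, 2 * (dim - self.half)),
        )

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        z1, z2 = z[:, :self.half], z[:, self.half:]
        log_s, t = self.net(z1).chunk(2, dim=1)
        return torch.cat([z1, z2 * torch.exp(log_s) + t], dim=1)

    def inverse(self, z: torch.Tensor) -> torch.Tensor:
        z1, z2 = z[:, :self.half], z[:, self.half:]
        log_s, t = self.net(z1).chunk(2, dim=1)
        return torch.cat([z1, (z2 - t) * torch.exp(-log_s)], dim=1)

flow = AffineCoupling()
z_visual = torch.randn(4, LATENT)            # latents from a visual VAE encoder
z_tactile_hat = flow(z_visual)               # visual -> tactile direction
z_visual_rec = flow.inverse(z_tactile_hat)   # tactile -> visual direction
print(torch.allclose(z_visual, z_visual_rec, atol=1e-4))  # True: exact invertibility
```

Because the coupling block is analytically invertible, one set of weights serves both generation directions, which is the property the paper exploits to replace two unidirectional mappings with a single model.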
Affiliation(s)
- Yu Fang: State Key Laboratory of Robotics and System, Harbin Institute of Technology, No. 2, Yikuang Street, Nangang District, Harbin, 150001, Heilongjiang, China
- Xuehe Zhang: State Key Laboratory of Robotics and System, Harbin Institute of Technology, No. 2, Yikuang Street, Nangang District, Harbin, 150001, Heilongjiang, China
- Wenqiang Xu: Department of Computer Science and Engineering, Shanghai Jiao Tong University, No. 800 Dongchuan Road, Minhang District, Shanghai, 200240, China
- Gangfeng Liu: State Key Laboratory of Robotics and System, Harbin Institute of Technology, No. 2, Yikuang Street, Nangang District, Harbin, 150001, Heilongjiang, China
- Jie Zhao: State Key Laboratory of Robotics and System, Harbin Institute of Technology, No. 2, Yikuang Street, Nangang District, Harbin, 150001, Heilongjiang, China
3. Navarro-Guerrero N, Toprak S, Josifovski J, Jamone L. Visuo-haptic object perception for robots: an overview. Auton Robots 2023. DOI: 10.1007/s10514-023-10091-y.
Abstract
The object perception capabilities of humans are impressive, and this becomes even more evident when trying to develop solutions with a similar proficiency in autonomous robots. While there have been notable advancements in the technologies for artificial vision and touch, the effective integration of these two sensory modalities in robotic applications still needs to be improved, and several open challenges exist. Taking inspiration from how humans combine visual and haptic perception to perceive object properties and drive the execution of manual tasks, this article summarises the current state of the art of visuo-haptic object perception in robots. Firstly, the biological basis of human multimodal object perception is outlined. Then, the latest advances in sensing technologies and data collection strategies for robots are discussed. Next, an overview of the main computational techniques is presented, highlighting the main challenges of multimodal machine learning and presenting a few representative articles in the areas of robotic object recognition, peripersonal space representation and manipulation. Finally, informed by the latest advancements and open challenges, this article outlines promising new research directions.
4. Anandan N, Arronde Pérez D, Mitterer T, Zangl H. Design and Evaluation of Capacitive Smart Transducer for a Forestry Crane Gripper. Sensors (Basel) 2023; 23:2747. PMID: 36904949. PMCID: PMC10007621. DOI: 10.3390/s23052747.
Abstract
Stable grasps are essential for robots handling objects. This is especially true for "robotized" large industrial machines, since heavy and bulky objects that are unintentionally dropped can cause substantial damage and pose a significant safety risk. Consequently, adding proximity and tactile sensing to such large industrial machinery can help mitigate this problem. In this paper, we present a sensing system for proximity/tactile sensing in the gripper claws of a forestry crane. To avoid the difficulties of installing cables (particularly when retrofitting existing machinery), the sensors are fully wireless and can be powered by energy harvesting, making them autarkic, i.e., self-contained. The sensing elements are connected to a measurement system that transmits the measurement data to the crane automation computer via Bluetooth Low Energy (BLE), compliant with the IEEE 1451.0 (TEDS) specification for easier logical system integration. We demonstrate that the sensor system can be fully integrated into the gripper and that it withstands the challenging environmental conditions. We present an experimental evaluation of detection in various grasping scenarios, such as grasping at an angle, corner grasping, improper closure of the gripper, and proper grasps of logs of three different sizes. The results indicate the ability to detect and differentiate between good and poor grasping configurations.
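As a rough, purely illustrative sketch of how per-claw contact readings from such a gripper might be used to separate good from poor grasp configurations, the snippet below applies simple coverage and symmetry thresholds to hypothetical normalised capacitive readings. The channel layout, threshold values, and decision rules are assumptions for illustration, not the detection logic evaluated in the paper.

```python
# Hypothetical sketch: flag poor grasp configurations from per-claw
# capacitive contact readings. Channel layout and thresholds are
# illustrative assumptions, not the paper's detection logic.
from typing import List

CONTACT_THRESHOLD = 0.3  # assumed normalised reading above which a pad is in contact

def grasp_quality(left_claw: List[float], right_claw: List[float]) -> str:
    """Classify a grasp as 'proper', 'partial', or 'no contact'."""
    def coverage(readings: List[float]) -> float:
        touching = sum(1 for r in readings if r > CONTACT_THRESHOLD)
        return touching / len(readings)

    left_cov, right_cov = coverage(left_claw), coverage(right_claw)
    if left_cov == 0.0 and right_cov == 0.0:
        return "no contact"      # e.g. gripper closed on nothing
    if min(left_cov, right_cov) < 0.5 or abs(left_cov - right_cov) > 0.5:
        return "partial"         # e.g. corner grasp or grasp at an angle
    return "proper"

# Toy readings for a well-centred log vs. a corner grasp.
print(grasp_quality([0.8, 0.7, 0.9, 0.8], [0.7, 0.9, 0.8, 0.6]))  # proper
print(grasp_quality([0.9, 0.1, 0.0, 0.0], [0.2, 0.0, 0.0, 0.0]))  # partial
```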
5. Kadalagere Sampath S, Wang N, Wu H, Yang C. Review on human-like robot manipulation using dexterous hands. Cognitive Computation and Systems 2023. DOI: 10.1049/ccs2.12073. Open access.
Affiliation(s)
- Ning Wang: Bristol Robotics Laboratory (BRL), University of the West of England, Bristol, UK
- Chenguang Yang: Bristol Robotics Laboratory (BRL), University of the West of England, Bristol, UK
6. Yu S, Zhai DH, Xia Y, Wu H, Liao J. SE-ResUNet: A Novel Robotic Grasp Detection Method. IEEE Robotics and Automation Letters 2022. DOI: 10.1109/lra.2022.3145064.