1
Active object perception using Bayesian classifiers and haptic exploration. Auton Robots 2022. doi:10.1007/s10514-022-10065-6
2
Zhang P, Yu G, Shan D, Chen Z, Wang X. Identifying the Strength Level of Objects' Tactile Attributes Using a Multi-Scale Convolutional Neural Network. Sensors (Basel) 2022;22:1908. doi:10.3390/s22051908. PMID: 35271055; PMCID: PMC8914820
Abstract
Most existing research focuses on the binary tactile attributes of objects and ignores the strength level of those attributes. To address this gap, this paper establishes a tactile dataset covering the strength levels of objects' elasticity and hardness attributes, and proposes a multi-scale convolutional neural network to identify the strength level of object attributes. The network recognizes different attributes, and distinguishes strength levels within the same attribute, by fusing the original features of the data with its single-channel and multi-channel features. The network was compared against multiple models on the elasticity and hardness strength-level tasks using a variety of evaluation metrics. It achieves notably higher accuracy, better precision (a higher proportion of predicted positives are true positives), better recall (a higher proportion of true positives are correctly predicted), and a higher F1-score across all classes. Over the whole sample, the multi-scale convolutional neural network attains a higher recognition rate, and its ability to recognize each strength level is more stable.
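The per-class precision, recall, and F1 figures reported in this abstract can be reproduced with one-vs-rest counting over the predicted strength levels. The sketch below is purely illustrative (the labels and data are hypothetical, not the paper's):

```python
def per_class_metrics(y_true, y_pred, classes):
    """Per-class precision, recall and F1 for a multi-class
    strength-level classifier, using one-vs-rest counting."""
    metrics = {}
    for c in classes:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
        metrics[c] = (prec, rec, f1)
    return metrics

# Hypothetical strength levels: 0 = weak, 1 = medium, 2 = strong
y_true = [0, 0, 1, 1, 2, 2, 2, 1]
y_pred = [0, 1, 1, 1, 2, 2, 1, 1]
m = per_class_metrics(y_true, y_pred, [0, 1, 2])
```

Macro-averaging the per-class values gives a single score that weights each strength level equally, which matches the abstract's concern with stability across levels.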
Affiliation(s)
- Peng Zhang
- School of Electrical Engineering and Automation, Qilu University of Technology (Shandong Academy of Sciences), Jinan 250353, China
- Guoqi Yu
- School of Mechanical Engineering, Qilu University of Technology (Shandong Academy of Sciences), Jinan 250353, China
- Dongri Shan
- School of Mechanical Engineering, Qilu University of Technology (Shandong Academy of Sciences), Jinan 250353, China
- Zhenxue Chen
- School of Control Science and Engineering, Shandong University, Jinan 250061, China
- Xiaofang Wang
- School of Electrical Engineering and Automation, Qilu University of Technology (Shandong Academy of Sciences), Jinan 250353, China
3
Liu H, Guo D, Sun F, Yang W, Furber S, Sun T. Embodied tactile perception and learning. Brain Science Advances 2020. doi:10.26599/bsa.2020.9050012
Abstract
Various living creatures exhibit embodiment intelligence, which is reflected in a collaborative interaction of the brain, body, and environment. The actual behavior of embodiment intelligence is generated by a continuous, dynamic interaction between a subject and the environment through information perception and physical manipulation. The physical interaction between a robot and the environment is the basis for realizing embodied perception and learning, and tactile information plays a critical role in this process. It can be used to ensure safety, stability, and compliance, and it provides unique information that is difficult to capture with other perception modalities. However, owing to the limitations of existing sensors and of current perception and learning methods, robotic tactile research lags significantly behind other sensing modalities such as vision and hearing, seriously restricting the development of robotic embodiment intelligence. This paper presents the current challenges in robotic tactile embodiment intelligence and reviews the theory and methods of robotic embodied tactile intelligence. Tactile perception and learning methods for embodiment intelligence can be designed based on newly developed large-scale tactile array sensing devices, with the aim of making breakthroughs in neuromorphic computing technology for tactile intelligence.
Affiliation(s)
- Huaping Liu
- Department of Computer Science and Technology, Tsinghua University, Beijing 100084, China
- Di Guo
- Department of Computer Science and Technology, Tsinghua University, Beijing 100084, China
- Fuchun Sun
- Department of Computer Science and Technology, Tsinghua University, Beijing 100084, China
- Wuqiang Yang
- Department of Electrical and Electronic Engineering, The University of Manchester, Manchester M13 9PL, UK
- Steve Furber
- Department of Computer Science, The University of Manchester, Manchester M13 9PL, UK
- Tengchen Sun
- Beijing Tashan Technology Co., Ltd., Beijing 102300, China
4
Lee JI, Lee S, Oh HM, Cho BR, Seo KH, Kim MY. 3D Contact Position Estimation of Image-Based Areal Soft Tactile Sensor with Printed Array Markers and Image Sensors. Sensors (Basel) 2020;20:3796. doi:10.3390/s20133796. PMID: 32645894; PMCID: PMC7374373
Abstract
Tactile sensors have been widely used and researched in various medical and industrial applications, and they are increasingly being adopted as input devices and contact sensors for interactive robots. If a tactile sensor is to be applied to various forms of human–machine interaction, it needs to be soft to ensure comfort and safety, and it should be easily customizable and inexpensive. The purpose of this study is to estimate the 3D contact position of a novel image-based areal soft tactile sensor (IASTS) using printed array markers and multiple cameras. First, we introduce the hardware structure of the prototype IASTS, which consists of a soft material with printed array markers and multiple cameras with LEDs. Second, an estimation algorithm for the contact position is proposed based on image processing of the array markers and their Gaussian fittings. A series of basic experiments was conducted and the results were analyzed to verify the effectiveness of the proposed IASTS hardware and its estimation software. To ensure the stability of the estimated contact positions, a Kalman filter was developed. Finally, it was shown that the contact positions on the IASTS were estimated with an error small enough for soft haptic applications.
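The stabilization step described in this abstract can be illustrated with a scalar constant-position Kalman filter applied to a stream of noisy contact-position estimates. This is a minimal sketch with made-up noise parameters and data, not the authors' implementation:

```python
def kalman_smooth(measurements, q=1e-4, r=0.05):
    """Scalar constant-position Kalman filter: smooths noisy
    contact-position estimates (process noise q, sensor noise r)."""
    x, p = measurements[0], 1.0      # initial state estimate and covariance
    smoothed = [x]
    for z in measurements[1:]:
        p += q                       # predict: position assumed constant
        k = p / (p + r)              # Kalman gain
        x += k * (z - x)             # correct with new measurement z
        p *= 1.0 - k
        smoothed.append(x)
    return smoothed

# Hypothetical jittery x-coordinate of a contact point (in mm)
raw = [10.0, 10.4, 9.7, 10.2, 9.9, 10.1]
xs = kalman_smooth(raw)
```

In a pipeline like the paper's, each coordinate produced by the Gaussian fit would be filtered this way; extending the state to include velocity is a common refinement when the contact point moves quickly.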
Affiliation(s)
- Jong-il Lee
- HRI (Human Robot Interaction) Research Center, Korea Institute of Robotics and Technology Convergence, Pohang-si, Gyeongsangbuk-do 37553, Korea
- School of Future Automotive & IT Convergence, Kyungpook National University, Daegu 41566, Korea
- Suwoong Lee
- Safety System R&D Group, Korea Institute of Industrial Technology, Daegu 42994, Korea
- Hyun-Min Oh
- School of Electronics Engineering, Kyungpook National University, Daegu 41566, Korea
- Bo Ram Cho
- Safety System R&D Group, Korea Institute of Industrial Technology, Daegu 42994, Korea
- Kap-Ho Seo
- HRI (Human Robot Interaction) Research Center, Korea Institute of Robotics and Technology Convergence, Pohang-si, Gyeongsangbuk-do 37553, Korea
- Department of Mechanical Engineering, Pohang University of Science and Technology, Pohang-si, Gyeongsangbuk-do 37673, Korea
- Min Young Kim
- School of Electronics Engineering, Kyungpook National University, Daegu 41566, Korea
- Research Center for Neurosurgical Robotic System, Kyungpook National University, Daegu 41566, Korea
- Correspondence: Tel.: +82-53-950-7233
5
Elshalakani M, Muthuramalingam M, Bruecker C. A Deep-Learning Model for Underwater Position Sensing of a Wake's Source Using Artificial Seal Whiskers. Sensors (Basel) 2020;20:3522. doi:10.3390/s20123522. PMID: 32580301; PMCID: PMC7349333
Abstract
Various marine animals can track their prey and navigate dark aquatic environments by hydrodynamically sensing the surrounding flow. In the present study, a deep-learning model is applied to a biomimetic sensor for underwater position detection of a wake-generating body. The sensor is composed of a bundle of spatially distributed optical fibers that act as artificial seal-like whiskers and interact with the body's wake through time-variant bending deflections. Supervised learning is employed to relate the vibrations of the artificial whiskers to the position of an upstream cylinder. The labeled training data are prepared by processing and reducing the recorded bending responses of the artificial whiskers while the cylinder is placed at various locations. An iterative training algorithm is applied to two neural-network models using 10-fold cross-validation. The models predict the coordinates of the cylinder in two-dimensional (2D) space with a high degree of accuracy. The current implementation of the sensor can passively sense the wake generated by the cylinder at Re ≃ 6000 and estimate its position with an average error smaller than the characteristic diameter D of the cylinder, for inter-distances (in the water tunnel) of up to 25 times D.
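The 10-fold cross-validation mentioned in this abstract partitions the recorded whisker responses into ten folds, training on nine and validating on the held-out one. A generic index-splitting sketch (not the authors' code; fold count and sample count are arbitrary):

```python
def k_fold_indices(n_samples, k=10):
    """Partition sample indices into k roughly equal folds and yield
    (train, validation) index lists, as in k-fold cross-validation."""
    idx = list(range(n_samples))
    base = n_samples // k
    sizes = [base + 1 if i < n_samples % k else base for i in range(k)]
    folds, start = [], 0
    for s in sizes:
        folds.append(idx[start:start + s])
        start += s
    for i in range(k):
        val = folds[i]
        train = [j for f in folds[:i] + folds[i + 1:] for j in f]
        yield train, val

# Hypothetical dataset of 25 recorded whisker-response samples
splits = list(k_fold_indices(25, k=10))
```

Each model is trained once per split and the validation scores are averaged, giving an accuracy estimate that is less sensitive to any one train/test partition.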
6
Using 3D Convolutional Neural Networks for Tactile Object Recognition with Robotic Palpation. Sensors (Basel) 2019;19:5356. doi:10.3390/s19245356. PMID: 31817320; PMCID: PMC6960774
Abstract
In this paper, a novel method of active tactile perception based on 3D neural networks and a high-resolution tactile sensor installed on a robot gripper is presented. A haptic exploratory procedure based on robotic palpation captures pressure images at different grasping forces, providing information not only about the external shape of the object but also about its internal features. The gripper consists of two underactuated fingers with a tactile sensor array in the thumb. A new representation of tactile information as 3D tactile tensors is described: during a squeeze-and-release process, the pressure images read from the tactile sensor are concatenated into a tensor that captures the variation of the pressure matrices with grasping force. These tensors feed a 3D Convolutional Neural Network (3D CNN), called 3D TactNet, which classifies the grasped object through active interaction. Results show that the 3D CNN performs better, providing higher recognition rates with less training data.
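The 3D tactile tensor described in this abstract is simply the stack of 2D pressure images recorded over the squeeze-and-release cycle, with the grasping-force steps forming the third axis. A minimal sketch of that assembly (the taxel resolution, step count, and pressure values are hypothetical):

```python
def build_tactile_tensor(pressure_frames):
    """Stack the 2D pressure images recorded during a squeeze-and-release
    cycle into a 3D tensor of shape (force steps, rows, cols) -- the input
    format a 3D CNN over tactile data expects."""
    rows, cols = len(pressure_frames[0]), len(pressure_frames[0][0])
    for frame in pressure_frames:
        assert len(frame) == rows and all(len(r) == cols for r in frame), \
            "every pressure image must share the sensor's taxel resolution"
    return pressure_frames, (len(pressure_frames), rows, cols)

# Hypothetical 4-step squeeze recorded on a 2x3 taxel array
frames = [[[step * 10 + r * 3 + c for c in range(3)] for r in range(2)]
          for step in range(4)]
tensor, shape = build_tactile_tensor(frames)
```

Treating force as a third axis lets the 3D convolution kernels correlate how each taxel's pressure evolves with grasp strength, rather than classifying each frame in isolation.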
7
Polic M, Krajacic I, Lepora N, Orsag M. Convolutional Autoencoder for Feature Extraction in Tactile Sensing. IEEE Robot Autom Lett 2019. doi:10.1109/lra.2019.2927950
8
Lepora NF, Church A, de Kerckhove C, Hadsell R, Lloyd J. From Pixels to Percepts: Highly Robust Edge Perception and Contour Following Using Deep Learning and an Optical Biomimetic Tactile Sensor. IEEE Robot Autom Lett 2019. doi:10.1109/lra.2019.2899192
9
Pestell N, Lloyd J, Rossiter J, Lepora NF. Dual-Modal Tactile Perception and Exploration. IEEE Robot Autom Lett 2018. doi:10.1109/lra.2018.2794609
10
Bernth JE, Ho VA, Liu H. Morphological computation in haptic sensation and interaction: from nature to robotics. Adv Robot 2018. doi:10.1080/01691864.2018.1447393
Affiliation(s)
- Van Anh Ho
- School of Materials Science, Japan Advanced Institute of Science and Technology (JAIST), Nomi, Japan
- Hongbin Liu
- Department of Informatics, King's College London, London, UK
11
Ward-Cherrier B, Rojas N, Lepora NF. Model-Free Precise in-Hand Manipulation with a 3D-Printed Tactile Gripper. IEEE Robot Autom Lett 2017. doi:10.1109/lra.2017.2719761
12
Cramphorn L, Ward-Cherrier B, Lepora NF. Addition of a Biomimetic Fingerprint on an Artificial Fingertip Enhances Tactile Spatial Acuity. IEEE Robot Autom Lett 2017. doi:10.1109/lra.2017.2665690
13
14
Lepora NF, Aquilina K, Cramphorn L. Exploratory Tactile Servoing With Active Touch. IEEE Robot Autom Lett 2017. doi:10.1109/lra.2017.2662071