1
Frediani G, Carpi F. How to Make the Skin Contact Area Controllable by Optical Calibration in Wearable Tactile Displays of Softness. Sensors (Basel) 2024; 24:6770. PMID: 39460250; PMCID: PMC11511068; DOI: 10.3390/s24206770.
Abstract
Virtual reality systems may benefit from wearable (fingertip-mounted) haptic displays capable of rendering the softness of virtual objects. According to neurophysiological evidence, the easiest reliable way to render virtual softness is to generate purely tactile (as opposed to kinaesthetic) feedback delivered via a deformable surface interfaced with the finger pulp. Moreover, it is necessary to control not only the skin indentation depth, by applying quasi-static (non-vibratory) contact pressures, but also the skin contact area. This is typically impossible with available devices, even those that can vary the contact area, because the contact area cannot be controlled due to the complexity of sensing it at high resolution. This leaves undetermined an important tactile cue for rendering softness. Here, we present a technology that allows the contact area to be controlled open-loop via personalised optical calibrations. We demonstrate the solution on a modified pneumatic wearable tactile display of softness previously described by us, consisting of a small chamber containing a transparent membrane inflated against the finger pulp. A window on the device allowed the skin contact area to be monitored with a camera from an external unit, generating a calibration curve by processing photos of the skin–membrane interface at different pressures. The solution was validated by comparison with an ink-stain-based method. Moreover, to avoid manual calibrations, a preliminary automated procedure was developed. This calibration strategy may also be applied to other kinds of displays in which finger pulps are in contact with transparent deformable structures.
Affiliation(s)
- Gabriele Frediani
- Biomedical Engineering Unit, Department of Industrial Engineering, University of Florence, 50121 Florence, Italy
- Federico Carpi
- Biomedical Engineering Unit, Department of Industrial Engineering, University of Florence, 50121 Florence, Italy
- IRCCS Fondazione don Carlo Gnocchi ONLUS, 50143 Florence, Italy
2
Zhao Z, Zheng D, Chen L. Detecting Transitions from Stability to Instability in Robotic Grasping Based on Tactile Perception. Sensors (Basel) 2024; 24:5080. PMID: 39124127; PMCID: PMC11314830; DOI: 10.3390/s24155080.
Abstract
Robots execute diverse load operations, including carrying, lifting, tilting, and moving objects, which involve load changes or transfers. This dynamic process can shift interactive operations from stability to instability. In this paper, we respond to these dynamic changes by utilizing tactile images captured from tactile sensors during interactions; we study the dynamic stability and instability of these operations and propose a real-time dynamic state sensing network that integrates convolutional neural networks (CNNs) for spatial feature extraction with long short-term memory (LSTM) networks to capture temporal information. We collect a dataset capturing the entire transition from stable to unstable states during interaction. Employing a sliding window, we sample consecutive frames from the collected dataset and feed them into the network to predict the robot's state changes. The network achieves real-time temporal sequence prediction at 31.84 ms per inference step and an average classification accuracy of 98.90%. Our experiments demonstrate the network's robustness, maintaining high accuracy even with previously unseen objects.
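The sliding-window sampling described in this abstract can be sketched as follows (the window length, stride, and image size below are illustrative assumptions, not the paper's values; the CNN + LSTM classifier itself is only described, not implemented):

```python
import numpy as np

# A sequence of tactile images is cut into overlapping windows; each
# window would then be fed to a CNN (per-frame spatial features) and an
# LSTM (temporal aggregation) to classify stable vs. unstable states.
def sliding_windows(frames, window=8, stride=1):
    """Return overlapping windows of shape (n_windows, window, H, W)."""
    n = (len(frames) - window) // stride + 1
    return np.stack([frames[i * stride : i * stride + window] for i in range(n)])

sequence = np.random.rand(40, 64, 64)   # 40 tactile frames of 64x64 pixels
windows = sliding_windows(sequence)
print(windows.shape)                    # (33, 8, 64, 64)
```

Consecutive windows share all but one frame, which is what allows the network to track the stable-to-unstable transition frame by frame at inference time.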
Affiliation(s)
- Zhou Zhao
- School of Computer Science, Central China Normal University, Wuhan 430079, China
- Hubei Engineering Research Center for Intelligent Detection and Identification of Complex Parts, Wuhan 430079, China
- Dongyuan Zheng
- School of Computer Science, Central China Normal University, Wuhan 430079, China
- Lu Chen
- Institute of Big Data Science and Industry, School of Computer and Information Technology, Shanxi University, Taiyuan 030006, China
3
Mandil W, Rajendran V, Nazari K, Ghalamzan-Esfahani A. Tactile-Sensing Technologies: Trends, Challenges and Outlook in Agri-Food Manipulation. Sensors (Basel) 2023; 23:7362. PMID: 37687818; PMCID: PMC10490130; DOI: 10.3390/s23177362.
Abstract
Tactile sensing plays a pivotal role in achieving precise physical manipulation tasks and extracting vital physical features. This comprehensive review presents an in-depth overview of the growing research on tactile-sensing technologies, encompassing state-of-the-art techniques, future prospects, and current limitations. The paper focuses on tactile hardware, algorithmic complexities, and the distinct features offered by each sensor, with special emphasis on agri-food manipulation and the tactile-sensing technologies relevant to it. It highlights key areas in agri-food manipulation, including robotic harvesting, food item manipulation, and feature evaluation, such as fruit ripeness assessment, along with the emerging field of kitchen robotics. Through this interdisciplinary exploration, we aim to inspire researchers, engineers, and practitioners to harness the power of tactile-sensing technology for transformative advancements in agri-food robotics. By providing a comprehensive understanding of the current landscape and future prospects, this review serves as a valuable resource for driving progress in the field of tactile sensing and its application in agri-food systems.
Affiliation(s)
- Willow Mandil
- School of Computer Science, University of Lincoln, Lincoln LN6 7TS, UK
- Vishnu Rajendran
- Lincoln Institute for Agri-Food Technology, University of Lincoln, Lincoln LN6 7TS, UK
- Kiyanoush Nazari
- School of Computer Science, University of Lincoln, Lincoln LN6 7TS, UK
4
Sajwani H, Ayyad A, Alkendi Y, Halwani M, Abdulrahman Y, Abusafieh A, Zweiri Y. TactiGraph: An Asynchronous Graph Neural Network for Contact Angle Prediction Using Neuromorphic Vision-Based Tactile Sensing. Sensors (Basel) 2023; 23:6451. PMID: 37514745; PMCID: PMC10383597; DOI: 10.3390/s23146451.
Abstract
Vision-based tactile sensors (VBTSs) have become the de facto method for giving robots the ability to obtain tactile feedback from their environment. Unlike other solutions to tactile sensing, VBTSs offer high-spatial-resolution feedback without compromising on instrumentation costs or incurring additional maintenance expenses. However, conventional cameras used in VBTSs have a fixed update rate and output redundant data, leading to computational overhead. In this work, we present a neuromorphic vision-based tactile sensor (N-VBTS) that employs observations from an event-based camera for contact angle prediction. In particular, we design and develop a novel graph neural network, dubbed TactiGraph, that operates asynchronously on graphs constructed from raw N-VBTS streams, exploiting their spatiotemporal correlations to perform predictions. Although conventional VBTSs use an internal illumination source, TactiGraph performs efficiently both with and without one, thus further reducing instrumentation costs. Rigorous experiments revealed that TactiGraph achieved a mean absolute error of 0.62° in predicting the contact angle and was faster and more efficient than both conventional VBTSs and other N-VBTSs, with lower instrumentation costs. Specifically, the N-VBTS requires only 5.5% of the computing time needed by the VBTS when both are tested in the same scenario.
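The graph construction this abstract describes can be illustrated with a toy sketch (the event layout, radii, and counts below are assumptions for illustration, not the TactiGraph implementation):

```python
import numpy as np

# Events from a neuromorphic camera are (x, y, t) tuples; a graph is
# formed by linking events that are close in both space and time,
# preserving the sparse asynchronous structure that a graph neural
# network can then operate on via message passing along the edges.
rng = np.random.default_rng(seed=0)
n_events = 60
events = np.column_stack([
    rng.uniform(0, 640, n_events),             # x in pixels
    rng.uniform(0, 480, n_events),             # y in pixels
    np.sort(rng.uniform(0, 0.01, n_events)),   # timestamps in seconds
])

def build_event_graph(ev, r_space=80.0, r_time=0.002):
    """Connect pairs of events within a spatiotemporal radius."""
    edges = []
    for i in range(len(ev)):
        for j in range(i + 1, len(ev)):
            close_in_space = np.hypot(*(ev[i, :2] - ev[j, :2])) <= r_space
            close_in_time = abs(ev[i, 2] - ev[j, 2]) <= r_time
            if close_in_space and close_in_time:
                edges.append((i, j))
    return edges

edges = build_event_graph(events)
# Each node carries (x, y, t) features; edges define the GNN's neighbourhoods.
print(len(events), len(edges))
```

Because the graph is built per event rather than per frame, no fixed update rate is imposed and redundant pixels never enter the computation, which is the source of the efficiency gain reported above.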
Affiliation(s)
- Hussain Sajwani
- UAE National Service & Reserve Authority, Abu Dhabi, United Arab Emirates
- Advanced Research and Innovation Center (ARIC), Khalifa University, Abu Dhabi 127788, United Arab Emirates
- Abdulla Ayyad
- Advanced Research and Innovation Center (ARIC), Khalifa University, Abu Dhabi 127788, United Arab Emirates
- Yusra Alkendi
- Department of Aerospace Engineering, Khalifa University, Abu Dhabi 127788, United Arab Emirates
- Mohamad Halwani
- Advanced Research and Innovation Center (ARIC), Khalifa University, Abu Dhabi 127788, United Arab Emirates
- Yusra Abdulrahman
- Advanced Research and Innovation Center (ARIC), Khalifa University, Abu Dhabi 127788, United Arab Emirates
- Department of Aerospace Engineering, Khalifa University, Abu Dhabi 127788, United Arab Emirates
- Abdulqader Abusafieh
- Advanced Research and Innovation Center (ARIC), Khalifa University, Abu Dhabi 127788, United Arab Emirates
- Research and Development, Strata Manufacturing PJSC, Al Ain 86519, United Arab Emirates
- Yahya Zweiri
- Advanced Research and Innovation Center (ARIC), Khalifa University, Abu Dhabi 127788, United Arab Emirates
- Department of Aerospace Engineering, Khalifa University, Abu Dhabi 127788, United Arab Emirates
5
Breuss A, Sferrazza C, Pleisch J, D'Andrea R, Riener R. Unobtrusive Sleep Position Classification Using a Novel Optical Tactile Sensor. Annu Int Conf IEEE Eng Med Biol Soc 2023; 2023:1-5. PMID: 38083698; DOI: 10.1109/embc40787.2023.10340645.
Abstract
Unobtrusive sleep position classification is essential for sleep monitoring and for closed-loop intervention systems that initiate position changes. In this paper, we present a novel unobtrusive under-mattress optical tactile sensor for sleep position classification. The sensor uses a camera to track particles embedded in a soft silicone layer, inferring the deformation of the silicone and thereby providing information about the pressure and shear distributions applied to its surface. We characterized the sensitivity of the sensor after placing it under a conventional mattress and applying different weights (258 g, 500 g, 5000 g) on top of the mattress in various predefined locations. Moreover, we collected multiple recordings from a person lying in supine, lateral left, lateral right, and prone positions. As a proof of concept, we trained a neural network based on convolutional layers and residual blocks that classified the lying positions from the images produced by the tactile sensor. We observed a high sensitivity of the optical tactile sensor: even with the sensor placed below a conventional mattress, we were able to detect our lowest test weight of 258 g. Using the neural network, we classified the four sleep positions (lateral left, lateral right, prone, and supine) with an accuracy of 91.2%. The high sensitivity of the sensor, as well as its good performance in the classification task, demonstrates the feasibility of using such a sensor in a robotic bed setup.
Clinical Relevance: Positional obstructive sleep apnea is highly prevalent across the general population. Today's gold-standard treatment, CPAP ventilation, is often not accepted, leading to unwanted treatment cessations. Alternative treatments, such as positional interventions through robotic beds, are highly promising; however, these beds require reliable detection of the lying position. In this paper, we present a novel, scalable, and completely unobtrusive sensor that is concealed under the mattress while classifying sleeping positions with high accuracy.
6
Oya R, Sawada H. An SMA Transducer for Sensing Tactile Sensation Focusing on Stroking Motion. Materials (Basel) 2023; 16:1016. PMID: 36770021; PMCID: PMC9920712; DOI: 10.3390/ma16031016.
Abstract
The authors have developed a micro-vibration actuator using filiform shape-memory alloy (SMA) wire electrically driven by a periodic electric current. While applying the SMA actuators to tactile displays, we discovered that the deformation caused by a given stress applied to an SMA wire generates a change in its electrical resistance. Owing to this characteristic, the SMA wire works as a highly sensitive micro-force sensor while generating micro-vibration. In this paper, the micro-force sensing ability of an SMA transducer is described and discussed. Experiments are conducted by sliding the SMA sensor over the surfaces of different objects at different speeds, and the sensing ability is evaluated in relation to human tactile sensation.
Affiliation(s)
- Ryusei Oya
- Graduate School of Advanced Science and Engineering, Waseda University, Tokyo 169-8555, Japan
- Hideyuki Sawada
- Faculty of Science and Engineering, Waseda University, Tokyo 169-8555, Japan
7
Zaid IM, Halwani M, Ayyad A, Imam A, Almaskari F, Hassanin H, Zweiri Y. Elastomer-Based Visuotactile Sensor for Normality of Robotic Manufacturing Systems. Polymers (Basel) 2022; 14:5097. PMID: 36501492; PMCID: PMC9735518; DOI: 10.3390/polym14235097.
Abstract
Modern aircraft require the assembly of thousands of components with high accuracy and reliability. The normality of drilled holes is a critical geometrical tolerance that must be achieved to realize an efficient assembly process. Failure to achieve the required tolerance leads to structures prone to fatigue problems and assembly errors. Elastomer-based tactile sensors have been used to support robots in acquiring useful physical interaction information from their environments. However, current tactile sensors have not yet been developed to support robotic machining in achieving the tight tolerances of aerospace structures. In this paper, a novel elastomer-based tactile sensor was developed for cobot machining. Three commercial silicone-based elastomer materials were characterised using mechanical testing in order to select the material with the best deformability. A finite element model was developed to simulate the deformation of the tactile sensor upon interacting with surfaces of different normalities. Additive manufacturing was employed to fabricate the tactile sensor mould, which was chemically etched to improve its surface quality. The tactile sensor was obtained by directly casting and curing the optimum elastomer material onto the additively manufactured mould. A machine learning approach was used to train on the simulated and experimental data obtained from the sensor. The capability of the developed visuotactile sensor was evaluated in real-world experiments with various inclination angles, achieving a mean perpendicularity tolerance of 0.34°. The developed sensor opens a new perspective on low-cost precision cobot machining.
Affiliation(s)
- Islam Mohamed Zaid
- Advanced Research and Innovation Center, Khalifa University, Abu Dhabi 127788, United Arab Emirates
- Department of Aerospace Engineering, Khalifa University of Science and Technology, Abu Dhabi 127788, United Arab Emirates
- Mohamad Halwani
- Advanced Research and Innovation Center, Khalifa University, Abu Dhabi 127788, United Arab Emirates
- Abdulla Ayyad
- Advanced Research and Innovation Center, Khalifa University, Abu Dhabi 127788, United Arab Emirates
- Adil Imam
- School of Engineering, Technology, and Design, Canterbury Christ Church University, Canterbury CT1 1QU, UK
- Fahad Almaskari
- Department of Aerospace Engineering, Khalifa University of Science and Technology, Abu Dhabi 127788, United Arab Emirates
- Hany Hassanin
- School of Engineering, Technology, and Design, Canterbury Christ Church University, Canterbury CT1 1QU, UK
- Yahya Zweiri
- Advanced Research and Innovation Center, Khalifa University, Abu Dhabi 127788, United Arab Emirates
- Department of Aerospace Engineering, Khalifa University of Science and Technology, Abu Dhabi 127788, United Arab Emirates
8
Bayer IS. MEMS-Based Tactile Sensors: Materials, Processes and Applications in Robotics. Micromachines (Basel) 2022; 13:2051. PMID: 36557349; PMCID: PMC9782357; DOI: 10.3390/mi13122051.
Abstract
Commonly encountered problems in the manipulation of objects with robotic hands are contact force control and the setting of the approaching motion. Microelectromechanical systems (MEMS) sensors on robots offer several solutions to these problems, along with new capabilities. In this review, we analyze tactile, force and/or pressure sensors produced by MEMS technologies, including off-the-shelf products such as MEMS barometric sensors. Alone or in conjunction with other sensors, MEMS platforms are considered very promising for robots to detect contact forces, slippage and the distance to objects for effective dexterous manipulation. We briefly review several sensing mechanisms and principles, such as capacitive, resistive, piezoresistive and triboelectric, combined with new flexible materials technologies, including polymer processing and MEMS-embedded textiles for flexible and snake robots. We show that, without taking up extra space and while remaining lightweight, several MEMS sensors can be integrated into robotic hands to simulate human fingers, gripping, hardness and stiffness sensations. MEMS have high potential for enabling new generations of microactuators, microsensors and miniature motion systems (e.g., microrobots) that will be indispensable for health, security, safety and environmental protection.
Affiliation(s)
- Ilker S Bayer
- Smart Materials, Istituto Italiano di Tecnologia, Via Morego 30, 16163 Genova, Italy
9
Lepora NF, Lin Y, Money-Coomes B, Lloyd J. DigiTac: A DIGIT-TacTip Hybrid Tactile Sensor for Comparing Low-Cost High-Resolution Robot Touch. IEEE Robot Autom Lett 2022. DOI: 10.1109/lra.2022.3190641.
Affiliation(s)
- Nathan F. Lepora
- Department of Engineering Mathematics and Bristol Robotics Laboratory, University of Bristol, Bristol, U.K.
- Yijiong Lin
- Department of Engineering Mathematics and Bristol Robotics Laboratory, University of Bristol, Bristol, U.K.
- Ben Money-Coomes
- Department of Engineering Mathematics and Bristol Robotics Laboratory, University of Bristol, Bristol, U.K.
- John Lloyd
- Department of Engineering Mathematics and Bristol Robotics Laboratory, University of Bristol, Bristol, U.K.
10
Zhang G, Du Y, Yu H, Wang MY. DelTact: A Vision-Based Tactile Sensor Using a Dense Color Pattern. IEEE Robot Autom Lett 2022. DOI: 10.1109/lra.2022.3196141.
Affiliation(s)
- Guanlan Zhang
- Individualized Interdisciplinary Program (ROAS), Hong Kong University of Science and Technology, Hong Kong, China
- Yipai Du
- Department of Electronic and Computer Engineering, Hong Kong University of Science and Technology, Hong Kong, China
- Hongyu Yu
- Department of Mechanical and Aerospace Engineering, Hong Kong University of Science and Technology, Hong Kong, China
- Michael Yu Wang
- Department of Mechanical and Aerospace Engineering, Hong Kong University of Science and Technology, Hong Kong, China
11
Fastier-Wooller JW, Vu TH, Nguyen H, Nguyen HQ, Rybachuk M, Zhu Y, Dao DV, Dau VT. Multimodal Fibrous Static and Dynamic Tactile Sensor. ACS Appl Mater Interfaces 2022; 14:27317-27327. PMID: 35656814; DOI: 10.1021/acsami.2c08195.
Abstract
A highly versatile, low-cost, and robust tactile sensor capable of acquiring load measurements under static and dynamic modes, employing a poly(vinylidene fluoride-co-trifluoroethylene) [P(VDF-TrFE)] micro/nanofiber element, is presented. The sensor comprises three essential layers, a fibrous core P(VDF-TrFE) layer and two Ni/Cu conductive fabric electrode layers, with a total thickness of less than 300 μm. Using an in situ electrospinning process, the core fibers are deposited directly onto a soft poly(dimethylsiloxane) (PDMS) fingertip. The core layer conforms to the surface and requires no additional processing, demonstrating the capability of the in situ electrospinning fabrication method to alleviate poor surface contacts and resolve issues associated with adhesion. The fabricated tactile sensor displayed reliable and consistent measurement of static and instantaneous dynamic loads over a total of 30,000 test cycles. The capabilities and implications of the presented tactile sensor design for multimodal sensing in robot tactile sensing applications are further discussed and elucidated.
Affiliation(s)
- Jarred W Fastier-Wooller
- School of Engineering and Built Environment, Griffith University, Engineering Drive, Southport 4222, Australia
- Trung-Hieu Vu
- School of Engineering and Built Environment, Griffith University, Engineering Drive, Southport 4222, Australia
- Hang Nguyen
- University of Engineering and Technology, Vietnam National University, 144 Xuan Thuy, Cau Giay, Hanoi 100000, Vietnam
- Hong-Quan Nguyen
- School of Engineering and Built Environment, Griffith University, Engineering Drive, Southport 4222, Australia
- Maksym Rybachuk
- School of Engineering and Built Environment, Griffith University, 170 Kessels Road, Nathan 4111, Australia
- Centre for Quantum Dynamics and Australian Attosecond Science Facility, Griffith University, Science Road, Nathan 4111, Australia
- Yong Zhu
- School of Engineering and Built Environment, Griffith University, Engineering Drive, Southport 4222, Australia
- Queensland Micro- and Nanotechnology Centre, Griffith University, West Creek Road, Nathan 4111, Australia
- Dzung Viet Dao
- School of Engineering and Built Environment, Griffith University, Engineering Drive, Southport 4222, Australia
- Queensland Micro- and Nanotechnology Centre, Griffith University, West Creek Road, Nathan 4111, Australia
- Van Thanh Dau
- School of Engineering and Built Environment, Griffith University, Engineering Drive, Southport 4222, Australia
- Centre of Catalysis and Clean Energy, Griffith University, 1 Parklands Drive, Southport 4222, Australia
12
Lin JC, Liatsis P, Alexandridis P. Flexible and Stretchable Electrically Conductive Polymer Materials for Physical Sensing Applications. Polym Rev 2022. DOI: 10.1080/15583724.2022.2059673.
Affiliation(s)
- Jui-Chi Lin
- Department of Biomedical Engineering, University at Buffalo, The State University of New York (SUNY), Buffalo, NY, USA
- Panos Liatsis
- Department of Electrical Engineering and Computer Science, Khalifa University, Abu Dhabi, UAE
- Paschalis Alexandridis
- Department of Biomedical Engineering, University at Buffalo, The State University of New York (SUNY), Buffalo, NY, USA
- Department of Chemical and Biological Engineering, University at Buffalo, The State University of New York (SUNY), Buffalo, NY, USA
13
Sun H, Martius G. Guiding the design of superresolution tactile skins with taxel value isolines theory. Sci Robot 2022; 7:eabm0608. PMID: 35196071; DOI: 10.1126/scirobotics.abm0608.
Abstract
Tactile feedback is essential to make robots more agile and effective in unstructured environments. However, high-resolution tactile skins are not widely available; this is due to the large size of robust sensing units and because many units typically lead to fragility in wiring and to high costs. One route toward high-resolution and robust tactile skins involves the embedding of a few sensor units (taxels) into a flexible surface material and the use of signal processing to achieve sensing with superresolution accuracy. Here, we propose a theory for geometric superresolution to guide the development of tactile sensors of this kind and link it to machine learning techniques for signal processing. This theory is based on sensor isolines and allows us to compute the possible force sensitivity and accuracy in contact position and force magnitude as a spatial quantity before building a sensor. We evaluate the influence of different factors, such as elastic properties of the material, structure design, and transduction methods, using finite element simulations and by implementing real sensors. We empirically determine sensor isolines and validate the theory in two custom-built sensors with 1D and 2D measurement surfaces that use barometric units. Using machine learning methods to infer contact information, our sensors obtain an average superresolution factor of over 100 and 1200, respectively. Our theory can guide future tactile sensor designs and inform various design choices.
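The core idea, recovering contact position far more finely than the taxel spacing by exploiting overlapping sensitivity profiles, can be illustrated with a toy model (the Gaussian profiles, 10 mm spacing, and grid search below are illustrative assumptions, not the paper's isoline theory or its learned models):

```python
import numpy as np

# Three taxels with broad, overlapping Gaussian sensitivity profiles are
# spaced 10 mm apart, yet a contact between them can be localized far
# more finely by matching the joint pattern of taxel readings.
taxel_pos = np.array([0.0, 10.0, 20.0])          # taxel centres in mm

def taxel_response(contact_x, sigma=6.0):
    """Simulated readings of all taxels for a contact at contact_x."""
    return np.exp(-(taxel_pos - contact_x) ** 2 / (2 * sigma ** 2))

true_x = 7.3                                     # contact between taxels
reading = taxel_response(true_x)

# Infer the position by grid search over candidates at 0.01 mm steps;
# the paper instead uses learned regressors, but the principle is the same.
candidates = np.arange(0.0, 20.0, 0.01)
errors = [np.sum((taxel_response(c) - reading) ** 2) for c in candidates]
est_x = candidates[int(np.argmin(errors))]

print(abs(est_x - true_x))   # localization error far below the 10 mm spacing
```

The achievable accuracy depends on how strongly the response profiles overlap and how they vary across the surface, which is exactly what the isoline theory quantifies before a sensor is built.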
Affiliation(s)
- Huanbo Sun
- Autonomous Learning Group, Max Planck Institute for Intelligent Systems, Tübingen, Germany
- Georg Martius
- Autonomous Learning Group, Max Planck Institute for Intelligent Systems, Tübingen, Germany
14
Sun H, Kuchenbecker KJ, Martius G. A soft thumb-sized vision-based sensor with accurate all-round force perception. Nat Mach Intell 2022. DOI: 10.1038/s42256-021-00439-3.
Abstract
Vision-based haptic sensors have emerged as a promising approach to robotic touch due to affordable high-resolution cameras and successful computer vision techniques; however, their physical design and the information they provide do not yet meet the requirements of real applications. We present a robust, soft, low-cost, vision-based, thumb-sized three-dimensional haptic sensor named Insight, which continually provides a directional force-distribution map over its entire conical sensing surface. Constructed around an internal monocular camera, the sensor has only a single layer of elastomer over-moulded on a stiff frame to guarantee sensitivity, robustness and soft contact. Furthermore, Insight uniquely combines photometric stereo and structured light using a collimator to detect the three-dimensional deformation of its easily replaceable flexible outer shell. The force information is inferred by a deep neural network that maps images to the spatial distribution of three-dimensional contact force (normal and shear). Insight has an overall spatial resolution of 0.4 mm, a force magnitude accuracy of around 0.03 N and a force direction accuracy of around five degrees over a range of 0.03–2 N for numerous distinct contacts with varying contact area. The presented hardware and software design concepts can be transferred to a wide variety of robot parts.
15
Othman W, Lai ZHA, Abril C, Barajas-Gamboa JS, Corcelles R, Kroh M, Qasaimeh MA. Tactile Sensing for Minimally Invasive Surgery: Conventional Methods and Potential Emerging Tactile Technologies. Front Robot AI 2022; 8:705662. PMID: 35071332; PMCID: PMC8777132; DOI: 10.3389/frobt.2021.705662.
Abstract
As opposed to open surgery procedures, minimally invasive surgery (MIS) utilizes small skin incisions to insert a camera and surgical instruments. MIS has numerous advantages, such as reduced postoperative pain, shorter hospital stays, faster recovery times, and a reduced learning curve for surgical trainees. MIS comprises surgical approaches including laparoscopic surgery, endoscopic surgery, and robotic-assisted surgery. Despite the advantages that MIS provides to patients and surgeons, it remains limited by the lost sense of touch due to the indirect contact with tissues under operation, especially in robotic-assisted surgery. Without haptic feedback, surgeons could unintentionally apply excessive forces that may cause tissue damage. Therefore, incorporating tactile sensation into MIS tools has become an interesting research topic. The design, fabrication, and integration of force sensors at different locations on surgical tools are currently under development by several companies and research groups. In this context, the electrical force sensing modality, including piezoelectric, resistive, and capacitive sensors, is the most conventionally considered approach to measure the grasping force, manipulation force, torque, and tissue compliance. For instance, piezoelectric sensors exhibit high sensitivity and accuracy, but the drawbacks of thermal sensitivity and the inability to detect static loads constrain their adoption in MIS tools. Optical-based tactile sensing is another conventional approach that facilitates electrically passive force sensing compatible with magnetic resonance imaging. Estimations of applied loads are calculated from the induced changes in the intensity, wavelength, or phase of light transmitted through optical fibers. Nonetheless, emerging technologies also show high potential to contribute to the field of smart surgical tools.
The recent development of flexible, highly sensitive tactile microfluidic-based sensors has become an emerging field in tactile sensing, which contributed to wearable electronics and smart-skin applications. Another emerging technology is imaging-based tactile sensing that achieved superior multi-axial force measurements by implementing image sensors with high pixel densities and frame rates to track visual changes on a sensing surface. This article aims to review the literature on MIS tactile sensing technologies in terms of working principles, design requirements, and specifications. Moreover, this work highlights and discusses the promising potential of a few emerging technologies towards establishing low-cost, high-performance MIS force sensing.
Affiliation(s)
- Wael Othman
- Engineering Division, New York University Abu Dhabi, Abu Dhabi, United Arab Emirates
- Mechanical and Aerospace Engineering, New York University, New York, NY, United States
- Zhi-Han A. Lai
- Engineering Division, New York University Abu Dhabi, Abu Dhabi, United Arab Emirates
- Carlos Abril
- Digestive Disease Institute, Cleveland Clinic Abu Dhabi, Abu Dhabi, United Arab Emirates
- Juan S. Barajas-Gamboa
- Digestive Disease Institute, Cleveland Clinic Abu Dhabi, Abu Dhabi, United Arab Emirates
- Ricard Corcelles
- Digestive Disease and Surgery Institute, Cleveland Clinic Main Campus, Cleveland, OH, United States
- Cleveland Clinic Lerner College of Medicine, Cleveland, OH, United States
- Matthew Kroh
- Digestive Disease Institute, Cleveland Clinic Abu Dhabi, Abu Dhabi, United Arab Emirates
- Mohammad A. Qasaimeh
- Engineering Division, New York University Abu Dhabi, Abu Dhabi, United Arab Emirates
- Mechanical and Aerospace Engineering, New York University, New York, NY, United States

16
Shimonomura K, Chang T, Murata T. Detection of Foreign Bodies in Soft Foods Employing Tactile Image Sensor. Front Robot AI 2021; 8:774080. [PMID: 34926592 PMCID: PMC8678492 DOI: 10.3389/frobt.2021.774080] [Citation(s) in RCA: 0] [Impact Index Per Article: 0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 09/10/2021] [Accepted: 11/16/2021] [Indexed: 11/13/2022] Open
Abstract
In the inspection work involving foodstuffs in food factories, there are cases where people not only visually inspect foodstuffs, but must also physically touch foodstuffs with their hands to find foreign or undesirable objects mixed in the product. To contribute to the automation of the inspection process, this paper proposes a method for detecting foreign objects in food based on differences in hardness using a camera-based tactile image sensor. Because the foreign objects to be detected are often small, the tactile sensor requires a high spatial resolution. In addition, inspection work in food factories requires a sufficient inspection speed. The proposed cylindrical tactile image sensor meets these requirements because it can efficiently acquire high-resolution tactile images with a camera mounted inside while rolling the cylindrical sensor surface over the target object. By analyzing the images obtained from the tactile image sensor, we detected the presence of foreign objects and their locations. By using a reflective membrane-type sensor surface with high sensitivity, small and hard foreign bodies of sub-millimeter size mixed in with soft food were successfully detected. The effectiveness of the proposed method was confirmed through experiments to detect shell fragments left on the surface of raw shrimp and bones left in fish fillets.
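The detection step described above can be reduced to a very small sketch: a hard inclusion presses the reflective membrane more sharply than the surrounding soft food, so it shows up as a localized bright anomaly against a no-contact reference frame. The images and threshold below are hypothetical illustrations, not the authors' pipeline:

```python
import numpy as np

def detect_hard_spot(tactile_img, background, thresh=50.0):
    """Return the (x, y) centroid of the brightest contact anomaly, or None.

    A hard sub-millimeter inclusion indents the reflective membrane more
    sharply than the surrounding soft food, so it appears as a small,
    unusually bright region in the difference image.
    """
    diff = tactile_img.astype(float) - background.astype(float)
    mask = diff > thresh            # pixels standing out from the no-contact frame
    if not mask.any():
        return None                 # no foreign-body candidate
    ys, xs = np.nonzero(mask)
    return float(xs.mean()), float(ys.mean())
```

A real inspection pipeline would roll the cylindrical sensor over the sample, run this per frame, and stitch detections together using the known roll speed.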
17
Sferrazza C, D'Andrea R. Sim-to-Real for High-Resolution Optical Tactile Sensing: From Images to Three-Dimensional Contact Force Distributions. Soft Robot 2021; 9:926-937. [PMID: 34842455 PMCID: PMC9595648 DOI: 10.1089/soro.2020.0213] [Citation(s) in RCA: 5] [Impact Index Per Article: 1.7] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/15/2022] Open
Abstract
The images captured by vision-based tactile sensors carry information about high-resolution tactile fields, such as the distribution of the contact forces applied to their soft sensing surface. However, extracting the information encoded in the images is challenging and often addressed with learning-based approaches, which generally require a large amount of training data. This article proposes a strategy to generate tactile images in simulation for a vision-based tactile sensor based on an internal camera that tracks the motion of spherical particles within a soft material. The deformation of the material is simulated in a finite element environment under a diverse set of contact conditions, and the spherical particles are projected to a simulated image. Features extracted from the images are mapped to the three-dimensional contact force distribution by an artificial neural network, with the ground truth also obtained from finite-element simulations; the network is therefore trained entirely on synthetic data, avoiding the need for real-world data collection. The resulting model exhibits high accuracy when evaluated on real-world tactile images, is transferable across multiple tactile sensors without further training, and is suitable for efficient real-time inference.
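The synthetic-data training pipeline can be caricatured in a few lines: generate simulated feature/force pairs, fit a regressor purely on them, then evaluate on unseen inputs. A ridge-regularized linear map stands in for the paper's neural network, and all data below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic training set: each row of X_sim is a feature vector
# extracted from a simulated tactile image; each row of F_sim is the
# corresponding (discretized) 3-D contact force ground truth from the
# finite-element simulation.
true_map = rng.normal(size=(8, 3))                  # unknown feature-to-force map
X_sim = rng.normal(size=(200, 8))
F_sim = X_sim @ true_map + 0.01 * rng.normal(size=(200, 3))

# Fit entirely on synthetic data; ridge-regularized least squares stands in
# for the paper's neural network.
lam = 1e-3
W = np.linalg.solve(X_sim.T @ X_sim + lam * np.eye(8), X_sim.T @ F_sim)

# "Real-world" evaluation: features the model has never seen.
X_real = rng.normal(size=(50, 8))
err = float(np.abs(X_real @ W - X_real @ true_map).mean())
```

The point of the exercise, as in the paper, is that nothing measured on real hardware enters the fitting step.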
Affiliation(s)
- Carmelo Sferrazza
- Institute for Dynamic Systems and Control, Department of Mechanical and Process Engineering, ETH Zurich, Zurich, Switzerland
- Raffaello D'Andrea
- Institute for Dynamic Systems and Control, Department of Mechanical and Process Engineering, ETH Zurich, Zurich, Switzerland

18
Shah UH, Muthusamy R, Gan D, Zweiri Y, Seneviratne L. On the Design and Development of Vision-based Tactile Sensors. J INTELL ROBOT SYST 2021. [DOI: 10.1007/s10846-021-01431-0] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 10/20/2022]
19
Bi T, Sferrazza C, D'Andrea R. Zero-Shot Sim-to-Real Transfer of Tactile Control Policies for Aggressive Swing-Up Manipulation. IEEE Robot Autom Lett 2021. [DOI: 10.1109/lra.2021.3084880] [Citation(s) in RCA: 4] [Impact Index Per Article: 1.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/09/2022]
20
Gomes DF, Paoletti P, Luo S. Generation of GelSight Tactile Images for Sim2Real Learning. IEEE Robot Autom Lett 2021. [DOI: 10.1109/lra.2021.3063925] [Citation(s) in RCA: 15] [Impact Index Per Article: 5.0] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/09/2022]
21
Du Y, Zhang G, Zhang Y, Wang MY. High-Resolution 3-Dimensional Contact Deformation Tracking for FingerVision Sensor With Dense Random Color Pattern. IEEE Robot Autom Lett 2021. [DOI: 10.1109/lra.2021.3061306] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.7] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/07/2022]
22
Vision-Based Tactile Sensor Mechanism for the Estimation of Contact Position and Force Distribution Using Deep Learning. SENSORS 2021; 21:s21051920. [PMID: 33803481 PMCID: PMC7967204 DOI: 10.3390/s21051920] [Citation(s) in RCA: 9] [Impact Index Per Article: 3.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 01/29/2021] [Revised: 03/04/2021] [Accepted: 03/05/2021] [Indexed: 11/16/2022]
Abstract
This work describes the development of a vision-based tactile sensor system that uses image-based information from the tactile sensor, together with input loads at various motions, to train a neural network to estimate tactile contact position, area, and force distribution. The study also addresses pragmatic aspects, such as the choice of thickness and materials for the tactile fingertips and their surface properties. The overall vision-based tactile sensor equipment interacts with an actuating motion controller, a force gauge, and a control PC running LabVIEW software. Image acquisition was carried out using a compact stereo camera setup mounted inside the elastic body to observe and measure the deformation caused by the motion and input load. The vision-based tactile sensor test bench was employed to collect the output contact position, angle, and force distribution caused by various randomly chosen input loads for motion in the X, Y, and Z directions and Rx/Ry rotational motion. The retrieved image information, contact position, area, and force distribution from different input loads with specified 3D position and angle are used for deep learning. A VGG-16 convolutional classification model was modified into a regression network, and transfer learning was applied to suit the regression task of estimating contact position and force distribution. Several experiments were carried out using thick and thin tactile sensors with various shapes, such as circles, squares, and hexagons, to validate the predicted contact position, contact area, and force distribution.
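The classification-to-regression conversion described above boils down to discarding the softmax output layer and fitting a new linear head on frozen backbone features. A minimal numpy sketch of that final step, with random vectors standing in for VGG-16 activations (none of this reproduces the paper's actual model):

```python
import numpy as np

rng = np.random.default_rng(1)

# Random vectors standing in for the frozen VGG-16 penultimate activations
# of 300 tactile images, paired with their measured (x, y, force) targets.
feats = rng.normal(size=(300, 16))
targets = feats @ rng.normal(size=(16, 3))   # hypothetical ground truth

# Transfer learning reduced to its final step: the convolutional backbone is
# frozen, and only the new linear regression head is fit on its features.
head, *_ = np.linalg.lstsq(feats, targets, rcond=None)

mse = float(((feats @ head - targets) ** 2).mean())
```

In a full framework one would instead replace the network's last fully connected layer with a three-output linear layer and fine-tune; the least-squares fit above is the closed-form analogue of training only that layer.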
23
Abstract
SUMMARY: Various vision-based tactile sensors have been developed for robotic perception in recent years. In this paper, a novel soft robotic finger embedded with a visual sensor is proposed for perception. It consists of a colored soft inner chamber, an outer structure, and an endoscope camera. A bending perception algorithm based on image preprocessing and deep learning is proposed. The boundaries of the color regions and the positions of the marker dots are extracted from the inner-chamber image and the label image, respectively. A convolutional neural network with multi-task learning is then trained to obtain the bending states of the finger. Finally, experiments are conducted to verify the effectiveness of the proposed method.
24
Hofer M, Sferrazza C, D'Andrea R. A Vision-Based Sensing Approach for a Spherical Soft Robotic Arm. Front Robot AI 2021; 8:630935. [PMID: 33718442 PMCID: PMC7953419 DOI: 10.3389/frobt.2021.630935] [Citation(s) in RCA: 6] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Journal Information] [Subscribe] [Scholar Register] [Received: 11/18/2020] [Accepted: 01/11/2021] [Indexed: 11/23/2022] Open
Abstract
Sensory feedback is essential for the control of soft robotic systems and to enable deployment in a variety of different tasks. Proprioception refers to sensing the robot’s own state and is of crucial importance in order to deploy soft robotic systems outside of laboratory environments, i.e. where no external sensing, such as motion capture systems, is available. A vision-based sensing approach for a soft robotic arm made from fabric is presented, leveraging the high-resolution sensory feedback provided by cameras. No mechanical interaction between the sensor and the soft structure is required and consequently the compliance of the soft system is preserved. The integration of a camera into an inflatable, fabric-based bellow actuator is discussed. Three actuators, each featuring an integrated camera, are used to control the spherical robotic arm and simultaneously provide sensory feedback of the two rotational degrees of freedom. A convolutional neural network architecture predicts the two angles describing the robot’s orientation from the camera images. Ground truth data is provided by a motion capture system during the training phase of the supervised learning approach and its evaluation thereafter. The camera-based sensing approach is able to provide estimates of the orientation in real-time with an accuracy of about one degree. The reliability of the sensing approach is demonstrated by using the sensory feedback to control the orientation of the robotic arm in closed-loop.
Affiliation(s)
- Matthias Hofer
- Institute for Dynamic Systems and Control, ETH Zurich, Zurich, Switzerland
- Carmelo Sferrazza
- Institute for Dynamic Systems and Control, ETH Zurich, Zurich, Switzerland
- Raffaello D'Andrea
- Institute for Dynamic Systems and Control, ETH Zurich, Zurich, Switzerland

25
Hsia TH, Okamoto S, Akiyama Y, Yamada Y. HumTouch: Localization of Touch on Semi-Conductive Surfaces by Sensing Human Body Antenna Signal. SENSORS 2021; 21:s21030859. [PMID: 33525367 PMCID: PMC7866186 DOI: 10.3390/s21030859] [Citation(s) in RCA: 2] [Impact Index Per Article: 0.7] [Reference Citation Analysis] [Abstract] [Key Words] [MESH Headings] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 12/23/2020] [Revised: 01/23/2021] [Accepted: 01/26/2021] [Indexed: 11/30/2022]
Abstract
HumTouch is a touch sensing technology that utilizes environmental electromagnetic waves. The method can be applied to conductive and semi-conductive materials by simply attaching electrodes to the object's surface. In this study, we compared three methods for localizing a touch on 20 × 16 cm² and 40 × 36 cm² papers, on which four or eight electrodes were attached to record the voltages leaked from the human fingertip. The number and positions of the electrodes and the processing of the recorded voltages differed between the localization methods. The touched locations were estimated by constructing a kernel regression model between the electrode outputs and the actual physical locations. Each of the three methods was tested via leave-one-out cross-validation. Two of the three methods exhibited superior performance in terms of estimation error. Of these two, one simply uses the voltages recorded by the four electrodes attached at the middle of the paper edges as inputs to the regression system; the other uses differential outputs of electrode pairs as inputs. The smallest mean location errors were 0.31 cm on the 20 × 16 cm² paper and 0.27 cm on the 40 × 36 cm² paper, both smaller than the size of a fingertip.
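The regression step described here, mapping a handful of electrode voltages to a touch position and scoring it with leave-one-out cross-validation, can be sketched with Nadaraya-Watson kernel regression. The voltage model, bandwidth, and electrode layout below are assumptions made purely for this sketch:

```python
import numpy as np

rng = np.random.default_rng(2)

# Invented calibration data: 60 touches on a 20 x 16 cm sheet, with one
# electrode at the middle of each edge. The voltage model 1 / (1 + distance)
# is a made-up stand-in for the real leakage signal.
pos = np.column_stack([rng.uniform(0, 20, 60), rng.uniform(0, 16, 60)])
electrodes = np.array([[10.0, 0.0], [10.0, 16.0], [0.0, 8.0], [20.0, 8.0]])
volts = 1.0 / (1.0 + np.linalg.norm(pos[:, None, :] - electrodes[None, :, :], axis=2))

def nw_predict(v_query, V, P, h=0.02):
    """Nadaraya-Watson kernel regression: each calibration touch is weighted
    by a Gaussian kernel on its voltage-space distance to the query."""
    w = np.exp(-np.sum((V - v_query) ** 2, axis=1) / (2 * h ** 2))
    w = w / (w.sum() + 1e-300)      # guard against underflow if all points are far
    return w @ P

# Leave-one-out cross-validation, as in the paper's evaluation.
errs = [np.linalg.norm(nw_predict(volts[i], np.delete(volts, i, 0),
                                  np.delete(pos, i, 0)) - pos[i])
        for i in range(len(pos))]
mean_err = float(np.mean(errs))
```

The bandwidth `h` plays the role the paper's regression hyperparameters do: too large and every prediction collapses toward the sheet's centre, too small and far-from-calibration touches get unstable weights.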
26
Baghaei Naeini F, Makris D, Gan D, Zweiri Y. Dynamic-Vision-Based Force Measurements Using Convolutional Recurrent Neural Networks. SENSORS (BASEL, SWITZERLAND) 2020; 20:E4469. [PMID: 32785095 PMCID: PMC7472272 DOI: 10.3390/s20164469] [Citation(s) in RCA: 8] [Impact Index Per Article: 2.0] [Reference Citation Analysis] [Abstract] [Key Words] [Grants] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 07/03/2020] [Revised: 07/31/2020] [Accepted: 08/07/2020] [Indexed: 11/18/2022]
Abstract
In this paper, a novel dynamic vision-based measurement method is proposed to measure contact force independently of object size. A neuromorphic camera (Dynamic Vision Sensor) is utilized to observe intensity changes within the silicone membrane where the object is in contact. Three deep Long Short-Term Memory neural networks combined with convolutional layers are developed and implemented to estimate the contact force from intensity changes over time. Thirty-five experiments are conducted using three objects of different sizes to validate the proposed approach. We demonstrate that the networks with memory gates are robust against variable contact sizes, as the networks learn object sizes in the early stage of a grasp. Moreover, spatial and temporal features enable the sensor to estimate the contact force accurately every 10 ms. The results are promising, with a mean squared error of less than 0.1 N for grasping and holding contact forces using the leave-one-out cross-validation method.
Affiliation(s)
- Dimitrios Makris
- Faculty of Science, Engineering and Computing, London SW15 3DW, UK
- Dongming Gan
- School of Engineering Technology, Purdue University, West Lafayette, IN 47907, USA
- Yahya Zweiri
- Faculty of Science, Engineering and Computing, London SW15 3DW, UK
- Khalifa University Center for Autonomous Robotic Systems (KUCARS), Khalifa University, Abu Dhabi P.O. Box 127788, UAE

27
Lambeta M, Chou PW, Tian S, Yang B, Maloon B, Most VR, Stroud D, Santos R, Byagowi A, Kammerer G, Jayaraman D, Calandra R. DIGIT: A Novel Design for a Low-Cost Compact High-Resolution Tactile Sensor With Application to In-Hand Manipulation. IEEE Robot Autom Lett 2020. [DOI: 10.1109/lra.2020.2977257] [Citation(s) in RCA: 62] [Impact Index Per Article: 15.5] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/09/2022]
28
Yu C, Lindenroth L, Hu J, Back J, Abrahams G, Liu H. A Vision-Based Soft Somatosensory System for Distributed Pressure and Temperature Sensing. IEEE Robot Autom Lett 2020. [DOI: 10.1109/lra.2020.2974649] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.3] [Reference Citation Analysis] [Track Full Text] [Journal Information] [Subscribe] [Scholar Register] [Indexed: 11/10/2022]
29
Tactile Image Sensors Employing Camera: A Review. SENSORS 2019; 19:s19183933. [PMID: 31547285 PMCID: PMC6767299 DOI: 10.3390/s19183933] [Citation(s) in RCA: 48] [Impact Index Per Article: 9.6] [Reference Citation Analysis] [Abstract] [Key Words] [Track Full Text] [Download PDF] [Figures] [Subscribe] [Scholar Register] [Received: 08/12/2019] [Revised: 09/04/2019] [Accepted: 09/10/2019] [Indexed: 11/30/2022]
Abstract
A tactile image sensor employing a camera is capable of obtaining rich tactile information through image sequences with high spatial resolution. Tactile image sensors have been studied for more than 30 years and have recently been applied in the field of robotics. They can be classified into three typical categories according to the method of conversion from physical contact to light signals: light-conductive-plate-based, marker-displacement-based, and reflective-membrane-based sensors. Other important elements of the sensor, such as the optical system, image sensor, and post-image analysis algorithm, have also been developed. In this work, the literature is surveyed, and an overview of tactile image sensors employing a camera is provided with a focus on the sensing principle, typical designs, and variations in sensor configuration.
30
Zhu Y, Hao J, Li W, Yang J, Dong E. A new robotic tactile sensor with bio-mimetic structural colour inspired by Morpho butterflies. BIOINSPIRATION & BIOMIMETICS 2019; 14:056010. [PMID: 31284276 DOI: 10.1088/1748-3190/ab3014] [Citation(s) in RCA: 1] [Impact Index Per Article: 0.2] [Reference Citation Analysis] [Abstract] [MESH Headings] [Track Full Text] [Subscribe] [Scholar Register] [Indexed: 06/09/2023]
Abstract
Since tactile perception and robotic manipulation play important roles in human survival, we propose a new method for developing robotic tactile sensors based on the structural colours of Morpho menelaus (a kind of Morpho butterfly). The first task is to fabricate a flexible bioinspired grating, with a microstructure similar to that of the wings of Morpho menelaus, onto the surfaces of polydimethylsiloxane (PDMS) films using a transfer technique. The second task, which depends on the angle of the diffracted light, is to integrate the flexible diffraction grating with a polychromatic light source and a CCD camera, and then predict the position and magnitude of the contact force from changes in the diffraction pattern. The final task is to set up an experimental calibration platform and a marker-point array with an interval of 1 mm, and to use an image processing algorithm and a deep learning method to establish the relationship between the contact point position, the magnitude of the force, and the diffraction pattern. The results showed that this tactile sensor has high sensitivity and resolution, localizing the contact force to within 1 mm. A practical application on a UR-5 manipulator verifies the feasibility of the prototype as a tactile sensor. This tactile sensing method may find wide use in robotics once the design is miniaturized.
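The structural-colour readout rests on the standard diffraction grating relation, not quoted in the abstract but included here as background:

```latex
% Diffraction grating at normal incidence: groove spacing d sends
% wavelength \lambda into diffraction order m at angle \theta_m.
d \sin\theta_m = m \lambda
```

A contact force strains the PDMS film, locally changing the effective groove spacing d and hence the angle (and colour) of the diffracted light captured by the CCD; decoding that change is what the image processing and deep learning stages are calibrated to do.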
Affiliation(s)
- Yin Zhu
- CAS Key Laboratory of Mechanical Behavior and Design of Materials, Department of Precision Machinery and Precision Instrumentation, University of Science and Technology of China, Hefei, Anhui, China