1
Gao F, Ge X, Li J, Fan Y, Li Y, Zhao R. Intelligent Cockpits for Connected Vehicles: Taxonomy, Architecture, Interaction Technologies, and Future Directions. Sensors (Basel). 2024;24:5172. PMID: 39204869; PMCID: PMC11358958; DOI: 10.3390/s24165172
Abstract
Highly integrated information sharing among people, vehicles, roads, and cloud systems, along with the rapid development of autonomous driving technologies, has spurred the evolution of automobiles from simple "transportation tools" into interconnected "intelligent systems". The intelligent cockpit is a comprehensive application space for various new technologies in intelligent vehicles, encompassing the domains of driving control, riding comfort, and infotainment. It provides drivers and passengers with safe, comfortable, and pleasant driving experiences, serving as the gateway for traditional automobile manufacturing to upgrade toward an intelligent automotive industry ecosystem, and it is the optimal convergence point for the intelligence, connectivity, electrification, and sharing of automobiles. Currently, the form, functions, and interaction methods of the intelligent cockpit are gradually changing, transitioning from the traditional "human adapts to the vehicle" viewpoint to "the vehicle adapts to the human", and evolving toward a future of natural interactive services in which "humans and vehicles mutually adapt". This article reviews the definitions, intelligence levels, functional domains, and technical frameworks of intelligent automotive cockpits. Combining the core mechanisms of human-machine interaction in intelligent cockpits, it then proposes an intelligent-cockpit human-machine interaction process and summarizes the current state of key interaction technologies. Lastly, it analyzes the challenges currently faced in the field and forecasts future trends in intelligent-cockpit technologies.
Affiliation(s)
- Fei Gao: College of Automotive Engineering, Jilin University, Changchun 130025, China; National Key Laboratory of Automotive Chassis Integration and Bionics, Jilin University, Changchun 130025, China
- Xiaojun Ge: College of Automotive Engineering, Jilin University, Changchun 130025, China
- Jinyu Li: College of Automotive Engineering, Jilin University, Changchun 130025, China
- Yuze Fan: College of Automotive Engineering, Jilin University, Changchun 130025, China
- Yun Li: Graduate School of Information Science and Technology, The University of Tokyo, Tokyo 113-8654, Japan
- Rui Zhao: College of Automotive Engineering, Jilin University, Changchun 130025, China
2
Murali PK, Wang C, Lee D, Dahiya R, Kaboli M. Deep Active Cross-Modal Visuo-Tactile Transfer Learning for Robotic Object Recognition. IEEE Robot Autom Lett. 2022. DOI: 10.1109/lra.2022.3191408
Affiliation(s)
- Cong Wang: RoboTac Lab, BMW Group, München, Germany
3
Murali PK, Dutta A, Gentner M, Burdet E, Dahiya R, Kaboli M. Active Visuo-Tactile Interactive Robotic Perception for Accurate Object Pose Estimation in Dense Clutter. IEEE Robot Autom Lett. 2022. DOI: 10.1109/lra.2022.3150045
4
Abah C, Orekhov AL, Johnston GLH, Simaan N. A Multi-Modal Sensor Array for Human-Robot Interaction and Confined Spaces Exploration Using Continuum Robots. IEEE Sens J. 2022;22:3585-3594. PMID: 36034075; PMCID: PMC9417101; DOI: 10.1109/jsen.2021.3140002
Abstract
Safe human-robot interaction requires robots endowed with perception. This paper presents the design of a multi-modal sensory array for continuum robots, targeting operation in semi-structured confined spaces with human users. Active safety measures are enabled via sensory arrays capable of simultaneously sensing proximity, contact, and force. Proximity sensing is achieved using time-of-flight sensors, while contact force is sensed using Hall effect sensors and embedded magnets. The paper presents the design and fabrication of these sensors; the communication protocol and multiplexing scheme used to achieve an interactive rate of communication with a high-level controller; and an evaluation of these sensors for actively mapping the shape of the environment and for compliance control using gestures and contact with the robot. A characterization of the proximity sensors is presented, with consideration of sensitivity to lighting, color, and texture conditions, along with a characterization of the force sensing. The results show that the multi-modal sensory array can enable pre- and post-collision active safety measures, and can also enable user interaction with the robot. We believe this new technology allows for increased safety in human-robot interaction in confined and semi-structured spaces, given its demonstrated capabilities of detecting impending collisions and mapping the environment along the length of the robot. Future miniaturization of the electronics will also allow possible integration into smaller continuum and soft robots.
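The safety behavior this abstract describes — slowing or stopping on proximity, then switching to compliance on contact — can be sketched as a simple per-module policy. The thresholds, field names, and `safety_action` helper below are illustrative assumptions, not the paper's actual controller:

```python
from dataclasses import dataclass

@dataclass
class ModuleReading:
    proximity_mm: float  # time-of-flight distance to the nearest obstacle
    force_n: float       # contact force from the Hall-effect sensing

def safety_action(reading: ModuleReading,
                  slow_dist_mm: float = 200.0,
                  stop_dist_mm: float = 50.0,
                  contact_force_n: float = 0.5) -> str:
    """Map one sensing module's readings to a pre-/post-collision action."""
    if reading.force_n >= contact_force_n:
        return "comply"   # post-collision: switch to compliance control
    if reading.proximity_mm <= stop_dist_mm:
        return "stop"     # impending collision: halt motion
    if reading.proximity_mm <= slow_dist_mm:
        return "slow"     # pre-collision: reduce speed near the obstacle
    return "proceed"      # free space: continue the planned motion
```

A real controller would aggregate many such modules along the continuum robot's length and arbitrate toward the most conservative action.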
5
Xu J, Song S, Ciocarlie M. TANDEM: Learning Joint Exploration and Decision Making with Tactile Sensors. IEEE Robot Autom Lett. 2022. DOI: 10.1109/lra.2022.3193466
Affiliation(s)
- Jingxi Xu: Department of Computer Science, Columbia University, New York, NY, USA
- Shuran Song: Department of Computer Science, Columbia University, New York, NY, USA
- Matei Ciocarlie: Department of Mechanical Engineering, Columbia University, New York, NY, USA
6
Escaida Navarro S, Mühlbacher-Karrer S, Alagi H, Zangl H, Koyama K, Hein B, Duriez C, Smith JR. Proximity Perception in Human-Centered Robotics: A Survey on Sensing Systems and Applications. IEEE Trans Robot. 2022. DOI: 10.1109/tro.2021.3111786
7
Pastor F, Garcia-Gonzalez J, Gandarias JM, Medina D, Closas P, Garcia-Cerezo AJ, Gomez-de-Gabriel JM. Bayesian and Neural Inference on LSTM-Based Object Recognition From Tactile and Kinesthetic Information. IEEE Robot Autom Lett. 2021. DOI: 10.1109/lra.2020.3038377
8
Li Q, Kroemer O, Su Z, Veiga FF, Kaboli M, Ritter HJ. A Review of Tactile Information: Perception and Action Through Touch. IEEE Trans Robot. 2020. DOI: 10.1109/tro.2020.3003230
9
Miyamoto T, Sasaki H, Matsubara T. Exploiting Visual-Outer Shape for Tactile-Inner Shape Estimation of Objects Covered with Soft Materials. IEEE Robot Autom Lett. 2020. DOI: 10.1109/lra.2020.3013915
10
Jang J, Jun YS, Seo H, Kim M, Park JU. Motion Detection Using Tactile Sensors Based on Pressure-Sensitive Transistor Arrays. Sensors (Basel). 2020;20:3624. PMID: 32605148; PMCID: PMC7374490; DOI: 10.3390/s20133624
Abstract
In recent years, to develop more spontaneous and instant interfaces between systems and users, technology has evolved toward efficient and simple gesture recognition (GR) techniques. As a tool for acquiring human motion, tactile sensor systems have been developed that convert human touch into data and execute commands by translating those data into a text language or by triggering a preset sequence as a haptic motion. Tactile sensors aim to collect comprehensive data on various motions, from the touch of a fingertip to large body movements. Sensor devices have different characteristics that matter for different target applications, and they can be fabricated using various transduction principles, including piezoelectric, capacitive, piezoresistive, and field-effect transistor types, depending on the parameters to be achieved. Here, we introduce tactile sensors based on field-effect transistors (FETs). GR requires acquiring a large amount of data from an array rather than from a single sensor, which makes fabricating the tactile sensor as an array important. In this case, an FET-type pressure sensor can exploit the advantages of active-matrix sensor arrays, which allow high array uniformity, high spatial contrast, and facile integration with electrical circuitry. We envision that tactile sensors based on FETs will be beneficial for GR as well as for future applications, providing substantial opportunities for next-generation motion-sensing systems.
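The active-matrix advantage mentioned above — one addressing transistor per taxel, so rows can be enabled one at a time with little crosstalk — amounts to a row-scan readout. A toy sketch, where `read_cell` stands in for a hypothetical driver call to the array electronics:

```python
import numpy as np

def scan_active_matrix(read_cell, rows: int, cols: int) -> np.ndarray:
    """Read one pressure frame from an active-matrix array: enable each
    row's gate line in turn, then sample every column on that row."""
    frame = np.zeros((rows, cols))
    for r in range(rows):          # assert gate line r (row select)
        for c in range(cols):      # sample source line c
            frame[r, c] = read_cell(r, c)
    return frame
```

Gesture recognition would then operate on the resulting sequence of frames rather than on any single taxel.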
Affiliation(s)
- Jiuk Jang, Yoon Sun Jun, Hunkyu Seo, Moohyun Kim, Jang-Ung Park: Nano Science Technology Institute, Department of Materials Science and Engineering, Yonsei University, Seoul 03722, Korea; Center for Nanomedicine, Institute for Basic Science (IBS), Seoul 03722, Korea; Graduate Program of Nano Biomedical Engineering (NanoBME), Advanced Science Institute, Yonsei University, Seoul 03722, Korea
11
Abstract
Purpose of Review
The review presents an overview of research approaches on human-robot interfaces in industrial and service robotics.
Recent Findings
Research focuses especially on speech and gesture recognition in both fields, though these approaches are more thoroughly explored in service robotics. The importance of interfaces for industrial robots, however, is increasing.
Summary
The development of human-robot interfaces is leading toward intuitive interfaces, especially those using speech and gesture recognition, and toward combining them into multimodal interfaces.
12
13
14
Matsuno T, Wang Z, Althoefer K, Hirai S. Adaptive Update of Reference Capacitances in Conductive Fabric Based Robotic Skin. IEEE Robot Autom Lett. 2019. DOI: 10.1109/lra.2019.2901991
15
Jiang H, Yan Y, Zhu X, Zhang C. A 3-D Surface Reconstruction with Shadow Processing for Optical Tactile Sensors. Sensors (Basel). 2018;18:2785. PMID: 30149551; PMCID: PMC6163909; DOI: 10.3390/s18092785
Abstract
An optical tactile sensing technique with three-dimensional (3-D) surface reconstruction is proposed for robotic fingers. The hardware of the tactile sensor consists of a surface-deformation sensing layer, an image sensor, and four individually controlled flashing light-emitting diodes (LEDs). The image sensor records deformation images when the robotic finger touches an object. For each object, four deformation images are taken, with the LEDs providing different illumination directions. Before the 3-D reconstruction, look-up tables are built to map the intensity distribution to the image gradient data, and possible image shadows are detected and amended. The 3-D depth distribution of the object surface is then reconstructed from the 2-D gradients obtained using the look-up tables. The architecture of the tactile sensor and the proposed signal-processing flow are presented in detail, and a prototype tactile sensor has been built. Both simulation and experimental results validate the effectiveness of the proposed 3-D surface reconstruction method for optical tactile sensors. The proposed method has the unique feature of image shadow detection and compensation, which differentiates it from those in the literature.
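The reconstruction flow in this abstract — look-up tables mapping intensity to gradients, shadow detection and amendment, then integration of the 2-D gradient field into depth — can be sketched roughly as below. The `lut` callable, the shadow-fill rule, and the naive cumulative-sum integration are placeholder assumptions rather than the paper's exact method:

```python
import numpy as np

def reconstruct_depth(images, lut, shadow_thresh=0.05):
    """Sketch of the pipeline: intensities under each LED are mapped to
    surface gradients via a look-up table, shadowed pixels are amended,
    and depth is recovered by integrating the gradient field.
    `lut` is a hypothetical callable: intensity stack -> (gx, gy)."""
    stack = np.stack(images, axis=0)       # (4, H, W), one image per LED
    shadow = stack < shadow_thresh         # pixels likely in shadow
    # Amend shadowed readings with the mean of the unshadowed LED images.
    valid = np.where(shadow, np.nan, stack)
    fill = np.nanmean(valid, axis=0)
    amended = np.where(shadow, fill, stack)
    gx, gy = lut(amended)                  # 2-D gradients via look-up table
    # Naive integration: cumulative sums along each axis, averaged.
    depth = 0.5 * (np.cumsum(gx, axis=1) + np.cumsum(gy, axis=0))
    return depth - depth.min()
```

A production implementation would replace the cumulative sums with a least-squares (Poisson) integration, which is far less sensitive to gradient noise.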
Affiliation(s)
- Hanjun Jiang: Institute of Microelectronics, Tsinghua University, Beijing 100084, China
- Yan Yan: Institute of Microelectronics, Tsinghua University, Beijing 100084, China
- Xiyang Zhu: Institute of Microelectronics, Tsinghua University, Beijing 100084, China
- Chun Zhang: Institute of Microelectronics, Tsinghua University, Beijing 100084, China
16
Robust Tactile Descriptors for Discriminating Objects From Textural Properties via Artificial Robotic Skin. IEEE Trans Robot. 2018. DOI: 10.1109/tro.2018.2830364
17
Hughes D, Lammie J, Correll N. A Robotic Skin for Collision Avoidance and Affective Touch Recognition. IEEE Robot Autom Lett. 2018. DOI: 10.1109/lra.2018.2799743
18
Recent Progress in Technologies for Tactile Sensors. Sensors (Basel). 2018;18:948. PMID: 29565835; PMCID: PMC5948515; DOI: 10.3390/s18040948
Abstract
Over the last two decades, considerable scientific and technological effort has been devoted to developing tactile sensing based on a variety of transducing mechanisms, with prospective applications in many fields such as human–machine interaction, intelligent robot tactile control and feedback, and tactile-sensorized minimally invasive surgery. This paper starts with an introduction to human tactile systems, followed by a presentation of the basic demands on tactile sensors. State-of-the-art tactile sensors are reviewed in terms of their diverse sensing mechanisms, design considerations, and material selection. Subsequently, typical performances of the sensors, along with their advantages and disadvantages, are compared and analyzed. Two major potential applications of tactile sensing systems are discussed in detail. Lastly, we propose prospective research directions and market trends for tactile sensing systems.
19
Kaboli M, Feng D, Cheng G. Active Tactile Transfer Learning for Object Discrimination in an Unstructured Environment Using Multimodal Robotic Skin. Int J Hum Robot. 2018. DOI: 10.1142/s0219843618500019
Abstract
In this paper, we propose a probabilistic active tactile transfer learning (ATTL) method that enables robotic systems to exploit their prior tactile knowledge while discriminating among objects via their physical properties (surface texture, stiffness, and thermal conductivity). Using the proposed method, the robot autonomously selects and exploits its most relevant prior tactile knowledge to efficiently learn about new, unknown objects from a few training samples, or even one. The experimental results show that, using our proposed method, the robot successfully discriminated among new objects with [Formula: see text] discrimination accuracy using only one training sample (one-shot tactile learning). Furthermore, the results demonstrate that our method is robust against the transfer of irrelevant prior tactile knowledge (negative tactile knowledge transfer).
Affiliation(s)
- Mohsen Kaboli: Institute for Cognitive Systems, Technical University of Munich, Arcisstrasse 21, 80333 Munich, Germany
- Di Feng: Institute for Cognitive Systems, Technical University of Munich, Arcisstrasse 21, 80333 Munich, Germany
- Gordon Cheng: Institute for Cognitive Systems, Technical University of Munich, Arcisstrasse 21, 80333 Munich, Germany
20
Active Prior Tactile Knowledge Transfer for Learning Tactual Properties of New Objects. Sensors (Basel). 2018;18:634. PMID: 29466300; PMCID: PMC5855872; DOI: 10.3390/s18020634
Abstract
Reusing the tactile knowledge of previously explored objects (prior objects) helps us to easily recognize the tactual properties of new objects. In this paper, we enable a robotic arm equipped with multi-modal artificial skin to actively transfer, as humans do, its prior tactile exploratory action experiences when it learns the detailed physical properties of new objects. These experiences, or prior tactile knowledge, are built from the feature observations that the robot perceives from multiple sensory modalities when it applies pressing, sliding, and static-contact movements to objects with different action parameters. We call our method Active Prior Tactile Knowledge Transfer (APTKT) and systematically evaluate its performance in several experiments. Results show that the robot improved its discrimination accuracy by around 10% when it used only one training sample together with the feature observations of prior objects. By further incorporating the predictions from the observation models of prior objects as auxiliary features, our method improved the discrimination accuracy by over 20%. The results also show that the proposed method is robust against the transfer of irrelevant prior tactile knowledge (negative knowledge transfer).
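The auxiliary-feature step described above — appending the predictions of prior objects' observation models to the raw tactile features — reduces, in sketch form, to a feature-matrix augmentation. Here `prior_models` are hypothetical callables standing in for the paper's fitted observation models:

```python
import numpy as np

def augment_with_prior(features, prior_models):
    """Append each prior model's predictions to the raw tactile features,
    so a classifier trained on new objects can reuse prior knowledge.
    `features` is an (N, D) matrix; each model maps it to an (N, K)
    array of class probabilities."""
    aux = [np.asarray(m(features)) for m in prior_models]
    return np.hstack([features] + aux)
```

A downstream classifier then trains on the augmented matrix; irrelevant prior models contribute uninformative columns, which is one way the method can stay robust to negative transfer.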
21
Kaboli M, Yao K, Feng D, Cheng G. Tactile-based active object discrimination and target object search in an unknown workspace. Auton Robots. 2018. DOI: 10.1007/s10514-018-9707-8