1
Cui S, Liang J. Three-dimensional mapping of pipelines using laser ranging and a gyroscope. Sci Rep 2023;13:20330. PMID: 37989766; PMCID: PMC10663559; DOI: 10.1038/s41598-023-47856-5.
Abstract
Conventional underground gas pipeline mapping equipment can only map straight pipelines over short distances. This study therefore proposes a dual-robot mapping technique, based on laser ranging and a gyroscope, for straight, bent, or sloped underground pipelines. Two robots (a detecting vehicle and a mapping vehicle) alternately move from the entrance to the exit of a pipe system, acquiring data with individual distance sensors and a gyroscope to produce 3D maps of each section. Mapping results for the entire pipeline are then produced by splicing the images from multiple sections. Validation experiments were conducted in simulation using horizontal straight sections, horizontal bends with a fixed turning radius, uphill and downhill slopes, and composite sections. The results show that the approach can effectively map different types of pipelines with inner diameters of 300-500 mm. Distance errors over 2000 m of travel were within 1 m and angular errors within 1.5°, demonstrating high accuracy for mapping complex pipeline systems.
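To illustrate the dead-reckoning principle behind this kind of laser-ranging/gyroscope mapping, a minimal Python sketch follows. It assumes the gyroscope supplies each section's yaw and pitch and the rangefinder supplies the distance travelled along it; the interface is hypothetical and this is not the authors' implementation.

```python
import numpy as np

def map_pipeline(segments):
    """Dead-reckon a 3D pipeline centreline from (distance, yaw, pitch) samples.

    `segments` is an iterable of tuples (d, yaw, pitch): d is the distance
    travelled along the current section (from the laser rangefinder) and
    yaw/pitch are the section's orientation in radians (from the gyroscope).
    Hypothetical interface, for illustration only.
    """
    points = [np.zeros(3)]
    for d, yaw, pitch in segments:
        # Unit direction vector of the section in the world frame.
        direction = np.array([
            np.cos(pitch) * np.cos(yaw),
            np.cos(pitch) * np.sin(yaw),
            np.sin(pitch),
        ])
        points.append(points[-1] + d * direction)
    return np.vstack(points)

# Example: 10 m straight, a 45-degree horizontal bend, then a 5% upward slope.
centreline = map_pipeline([
    (10.0, 0.0, 0.0),
    (5.0, np.pi / 4, 0.0),
    (20.0, np.pi / 4, np.arctan(0.05)),
])
print(centreline)
```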
Affiliation(s)
- Songtao Cui: School of Mechanical and Power Engineering, Zhengzhou University, Zhengzhou 450001, China
- Jie Liang: School of Mechanical and Power Engineering, Zhengzhou University, Zhengzhou 450001, China
2
Keizer R, Dubay R, Waugh L, Bradley C. Architecture for a Mobile Robotic Camera Positioning System for Photogrammetric Data Acquisition in Hydroelectric Tunnels. Sensors (Basel) 2023;23:7079. PMID: 37631619; PMCID: PMC10459461; DOI: 10.3390/s23167079.
Abstract
The structural condition of hydroelectric tunnels is important to the overall performance, safety, and longevity of generating stations. Significant effort is required to inspect, monitor, and maintain these tunnels. Photogrammetry is an effective method of collecting highly accurate visual and spatial data. However, it also presents the complex challenge of positioning a camera at thousands of difficult-to-reach locations throughout the large and varying-diameter tunnels. A semi-automated robotic camera positioning system was developed to enhance the collection of images within hydroelectric tunnels for photogrammetric inspections. A continuous spiral image network was developed to optimize the collection speed within the bounds of photography and capture-in-motion constraints. The positioning system and image network optimization reduce the time and effort required while providing the ability to adapt to different and varying tunnel diameters. To demonstrate, over 28,000 images were captured at a ground sampling distance of 0.4 mm in the 822 m long concrete-lined section of the Grand Falls Generating Station intake tunnel.
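As a rough illustration of a continuous spiral image network, the sketch below generates camera stations on a helix inside a cylindrical tunnel. The radius, station count, and pitch are assumed parameters; the actual planner additionally handles varying diameters and capture-in-motion constraints.

```python
import numpy as np

def spiral_camera_poses(tunnel_radius, tunnel_length, images_per_turn, pitch):
    """Camera positions on a helix hugging a cylindrical tunnel wall.

    tunnel_radius: radial offset of the camera from the tunnel axis [m]
    tunnel_length: length of tunnel to cover [m]
    images_per_turn: number of capture stations per full revolution
    pitch: axial advance per revolution [m], chosen from overlap requirements
    """
    n_turns = tunnel_length / pitch
    n_images = int(np.ceil(n_turns * images_per_turn))
    theta = np.linspace(0.0, 2.0 * np.pi * n_turns, n_images)
    x = theta / (2.0 * np.pi) * pitch          # position along the tunnel axis
    y = tunnel_radius * np.cos(theta)          # radial position
    z = tunnel_radius * np.sin(theta)
    return np.column_stack([x, y, z])

# Example: 2.5 m camera offset, 50 m of tunnel, 24 stations per turn, 0.5 m pitch.
poses = spiral_camera_poses(2.5, 50.0, 24, 0.5)
print(poses.shape)
```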
Affiliation(s)
- Ryan Keizer: Department of Mechanical Engineering, University of New Brunswick, Fredericton, NB E3B 5A3, Canada
- Rickey Dubay: Department of Mechanical Engineering, University of New Brunswick, Fredericton, NB E3B 5A3, Canada
- Lloyd Waugh: Department of Civil Engineering, University of New Brunswick, Fredericton, NB E3B 5A3, Canada
- Cody Bradley: Bradley Engineering Ltd., Estey’s Bridge, NB E3G 6M7, Canada
3
Cui Y, Liu S, Li H, Gu C, Jiang H, Meng D. Accurate integrated position measurement system for mobile applications in GPS-denied coal mine. ISA Trans 2023;139:621-634. PMID: 37142491; DOI: 10.1016/j.isatra.2023.04.014.
Abstract
The automatic positioning of underground mobile equipment plays a crucial role in intelligent coal mining. However, because such equipment varies widely in kinematics and dynamics, various positioning methods have been proposed for different targets, and their accuracy and applicability still fall short of field requirements. Based on the vibration characteristics of underground mobile devices, a multi-sensor fusion positioning system is developed to improve positioning accuracy in long, narrow, global positioning system (GPS)-denied underground coal mine roadways. The system combines an inertial navigation system (INS), an odometer, and ultra-wideband (UWB) ranging through an extended Kalman filter (EKF) and an unscented Kalman filter (UKF). It achieves accurate positioning by recognizing target carrier vibrations and switching quickly between multi-sensor fusion modes. The proposed system is tested on both a small unmanned mine vehicle (UMV) and a large roadheader, showing that the UKF improves stability for roadheaders with strongly nonlinear vibrations, while the EKF is better suited to flexible UMVs. Detailed results confirm that the system achieves an accuracy of 0.15 m, meeting the requirements of most coal mine applications.
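As a generic illustration of the EKF stage of such a fusion scheme (not the filter design of the cited paper), the sketch below fuses odometry increments with UWB range measurements to fixed anchors; the state, anchor layout, and noise values are assumptions.

```python
import numpy as np

class RangeEKF:
    """Toy planar EKF: odometry prediction + UWB range updates to fixed anchors.

    A generic illustration of the EKF stage of an INS/odometer/UWB fusion
    scheme; anchor layout and noise values are assumed, not taken from the
    cited paper.
    """

    def __init__(self, anchors, q=0.01, r=0.05):
        self.x = np.zeros(2)              # planar position estimate
        self.P = np.eye(2)                # estimate covariance
        self.Q = q * np.eye(2)            # process noise (odometry drift)
        self.R = r                        # UWB range measurement variance
        self.anchors = np.asarray(anchors)

    def predict(self, odom_delta):
        self.x = self.x + odom_delta      # F = I for a position-only state
        self.P = self.P + self.Q

    def update(self, anchor_idx, measured_range):
        a = self.anchors[anchor_idx]
        diff = self.x - a
        predicted_range = np.linalg.norm(diff)
        H = (diff / predicted_range).reshape(1, 2)   # Jacobian of the range
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T / S
        self.x = self.x + (K * (measured_range - predicted_range)).ravel()
        self.P = (np.eye(2) - K @ H) @ self.P

ekf = RangeEKF(anchors=[[0.0, 0.0], [30.0, 0.0]])
ekf.predict(np.array([0.5, 0.0]))
ekf.update(0, 0.52)
print(ekf.x)
```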
Affiliation(s)
- Yuming Cui: School of Mechatronic Engineering, Jiangsu Normal University, Xuzhou 221116, China
- Songyong Liu: School of Mechatronic Engineering, China University of Mining and Technology, Xuzhou 221116, China
- Hongsheng Li: School of Mechatronic Engineering, China University of Mining and Technology, Xuzhou 221116, China
- Congcong Gu: School of Mechatronic Engineering, China University of Mining and Technology, Xuzhou 221116, China
- Hongxiang Jiang: School of Mechatronic Engineering, China University of Mining and Technology, Xuzhou 221116, China; Jiangsu Collaborative Innovation Center of Intelligent Mining Equipment, China University of Mining and Technology, Xuzhou 221008, China
- Deyuan Meng: School of Mechatronic Engineering, China University of Mining and Technology, Xuzhou 221116, China
4
Zhao M, Okada K, Inaba M. Versatile articulated aerial robot DRAGON: Aerial manipulation and grasping by vectorable thrust control. Int J Rob Res 2022. DOI: 10.1177/02783649221112446.
Abstract
Various state-of-the-art works have achieved aerial manipulation and grasping by attaching an additional manipulator to an aerial robot. However, such coupled platforms are limited in interaction force and mobility. In this paper, we present the successful implementation of aerial manipulation and grasping by a novel articulated aerial robot called DRAGON, in which a vectorable rotor unit is embedded in each link. The key to stable manipulation and grasping in the air is a rotor-vectoring apparatus with two degrees of freedom. First, a comprehensive flight control methodology for aerial transformation using the vectorable thrust force is developed that takes the dynamics of the vectoring actuators into account. The proposed control method suppresses the oscillation caused by the vectoring-actuator dynamics and also allows integration with external and internal wrenches for object manipulation and grasping. Second, an online thrust-level planning method for bimanual object grasping using the two ends of the articulated body is presented. The proposed grasping style is unique in that the vectorable thrust force, rather than joint torque, provides the internal wrench. Finally, we present experimental evaluations of the proposed control and planning methods for object manipulation and grasping.
Affiliation(s)
- Moju Zhao: Department of Mechanical Engineering, Graduate School of Engineering, The University of Tokyo, Tokyo, Japan
- Kei Okada: Department of Mechano-Informatics, Graduate School of Information Science and Technology, The University of Tokyo, Tokyo, Japan
- Masayuki Inaba: Department of Mechano-Informatics, Graduate School of Information Science and Technology, The University of Tokyo, Tokyo, Japan
5
Lindqvist B, Kanellakis C, Mansouri SS, Agha-mohammadi AA, Nikolakopoulos G. COMPRA: A COMPact Reactive Autonomy Framework for Subterranean MAV Based Search-And-Rescue Operations. J Intell Robot Syst 2022. DOI: 10.1007/s10846-022-01665-6.
Abstract
This work establishes COMPRA, a compact and reactive autonomy framework for fast deployment of Micro Aerial Vehicles (MAVs) in subterranean Search-and-Rescue (SAR) missions. A COMPRA-enabled MAV is able to autonomously explore previously unknown areas while accounting for mission-specific criteria, e.g., identifying and localizing an object of interest, the remaining useful battery life, and the overall desired exploration mission duration. The proposed architecture follows a low-complexity algorithmic design to facilitate fully on-board computation, including nonlinear control, state estimation, navigation, exploration behavior, and object localization. The framework is structured mainly around a reactive local avoidance planner, based on enhanced potential field concepts and instantaneous 3D point clouds, together with a computationally efficient heading regulation technique based on depth images from an instantaneous camera stream. These techniques decouple collision-free path generation from the dependency on a global map and can handle periods of imprecise localization. Field experiments verify the overall architecture in relevant unknown Global Positioning System (GPS)-denied environments.
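A minimal sketch of the kind of potential-field reactive avoidance described above, assuming a velocity-controlled MAV, an instantaneous point cloud, and illustrative gains; it is not the COMPRA planner itself.

```python
import numpy as np

def apf_velocity_command(position, goal, cloud, k_att=1.0, k_rep=0.5,
                         influence=2.0, v_max=1.0):
    """Reactive velocity command from an artificial potential field.

    position, goal: 3D points; cloud: Nx3 instantaneous point cloud in the
    same frame. Gains and the influence radius are illustrative assumptions,
    not values from the cited framework.
    """
    # Attractive term pulls the vehicle toward the goal.
    attraction = k_att * (goal - position)

    # Repulsive term pushes away from every point closer than `influence`.
    repulsion = np.zeros(3)
    for p in np.asarray(cloud):
        offset = position - p
        d = np.linalg.norm(offset)
        if 1e-6 < d < influence:
            repulsion += k_rep * (1.0 / d - 1.0 / influence) * offset / d**3

    command = attraction + repulsion
    speed = np.linalg.norm(command)
    if speed > v_max:                      # saturate to the platform limit
        command *= v_max / speed
    return command

cmd = apf_velocity_command(
    position=np.array([0.0, 0.0, 1.0]),
    goal=np.array([5.0, 0.0, 1.0]),
    cloud=np.array([[1.5, 0.2, 1.0], [2.0, -0.3, 1.0]]),
)
print(cmd)
```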
6
Wang L, Xu H, Zhang Y, Shen S. Neither Fast nor Slow: How to Fly Through Narrow Tunnels. IEEE Robot Autom Lett 2022. DOI: 10.1109/lra.2022.3154024.
7
An Effective Algorithm for Finding Shortest Paths in Tubular Spaces. Algorithms 2022. DOI: 10.3390/a15030079.
Abstract
We propose a novel algorithm to determine the Euclidean shortest path (ESP) from a given point (source) to another point (destination) inside a tubular space. The method is based on the observation data of a virtual particle (VP) assumed to move along this path. First, the geometric properties of the shortest path inside the considered space are presented and proven. Using these properties, the desired ESP can be segmented into three partitions depending on the visibility of the VP. The algorithm checks which partition the VP belongs to and computes the correct direction of its movement, thereby tracing the shortest path. The proposed method is then compared to Dijkstra’s algorithm for different types of tubular spaces. In all cases, the solution provided by the proposed algorithm is smoother, shorter, and more accurate, with a faster calculation speed, than that obtained by Dijkstra’s method.
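For reference, the Dijkstra baseline used in the comparison can be reproduced on a discretized tubular space with a standard grid implementation such as the sketch below; this is the generic baseline, not the proposed virtual-particle algorithm.

```python
import heapq

def dijkstra_grid(free, start, goal):
    """Shortest path on a 2D occupancy grid with 4-connectivity.

    `free` is a set of (row, col) cells that lie inside the tubular space.
    This is the standard Dijkstra baseline, not the virtual-particle method
    proposed in the cited paper.
    """
    dist = {start: 0.0}
    prev = {}
    queue = [(0.0, start)]
    while queue:
        d, cell = heapq.heappop(queue)
        if cell == goal:
            break
        if d > dist.get(cell, float("inf")):
            continue                      # stale queue entry
        r, c = cell
        for nb in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if nb in free and d + 1.0 < dist.get(nb, float("inf")):
                dist[nb] = d + 1.0
                prev[nb] = cell
                heapq.heappush(queue, (d + 1.0, nb))
    # Reconstruct the path by walking predecessors back from the goal.
    path, cell = [goal], goal
    while cell != start:
        cell = prev[cell]
        path.append(cell)
    return path[::-1]

# An L-shaped "tube" of free cells.
tube = {(0, c) for c in range(5)} | {(r, 4) for r in range(4)}
print(dijkstra_grid(tube, (0, 0), (3, 4)))
```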
8
Drone Control in AR: An Intuitive System for Single-Handed Gesture Control, Drone Tracking, and Contextualized Camera Feed Visualization in Augmented Reality. Drones 2022. DOI: 10.3390/drones6020043.
Abstract
Traditional handheld drone remote controllers, although well-established and widely used, are not a particularly intuitive control method. At the same time, drone pilots normally watch the drone video feed on a smartphone or another small screen attached to the remote, forcing them to constantly shift their visual focus between the drone and the screen. This can be a tiring and stressful experience, as the eyes repeatedly change focus and the mind struggles to merge two different points of view. This paper presents a solution based on Microsoft’s HoloLens 2 headset that leverages augmented reality and gesture recognition to make drone piloting easier, more comfortable, and more intuitive. It describes a system for single-handed gesture control that can achieve all maneuvers possible with a traditional remote, including complex motions; a method for tracking a real drone in AR to improve flying beyond line of sight or at distances where the physical drone is hard to see; and the option to display the drone’s live video feed in AR, either in first-person-view mode or in context with the environment.
9
Design and Experimental Evaluation of an Aerial Solution for Visual Inspection of Tunnel-like Infrastructures. Remote Sensing 2022. DOI: 10.3390/rs14010195.
Abstract
Current railway tunnel inspections rely on expert operators performing a visual examination of the entire infrastructure and manually annotating encountered defects. Automating the inspection and maintenance of such critical and aging infrastructure has the potential to decrease the associated costs and risks. Contributing to this aim, the present work describes an aerial robotic solution designed to perform autonomous inspections of tunnel-like infrastructures. The proposed robotic system is equipped with visual and thermal sensors and uses an inspection-driven path planning algorithm to generate a path that maximizes the quality of the gathered data in terms of photogrammetry goals while optimizing surface coverage and total trajectory length. The performance of the planning algorithm is demonstrated in simulation against state-of-the-art methods and a wall-following inspection trajectory. Results of a real inspection test conducted in a railway tunnel are also presented, validating the operation of the whole system.
10
Loquercio A, Kaufmann E, Ranftl R, Müller M, Koltun V, Scaramuzza D. Learning high-speed flight in the wild. Sci Robot 2021;6:eabg5810. PMID: 34613820; DOI: 10.1126/scirobotics.abg5810.
Abstract
Quadrotors are agile. Unlike most other machines, they can traverse extremely complex environments at high speeds. To date, only expert human pilots have been able to fully exploit their capabilities. Autonomous operation with onboard sensing and computation has been limited to low speeds. State-of-the-art methods generally separate the navigation problem into subtasks: sensing, mapping, and planning. Although this approach has proven successful at low speeds, the separation it builds upon can be problematic for high-speed navigation in cluttered environments. The subtasks are executed sequentially, leading to increased processing latency and a compounding of errors through the pipeline. Here, we propose an end-to-end approach that can autonomously fly quadrotors through complex natural and human-made environments at high speeds with purely onboard sensing and computation. The key principle is to directly map noisy sensory observations to collision-free trajectories in a receding-horizon fashion. This direct mapping drastically reduces processing latency and increases robustness to noisy and incomplete perception. The sensorimotor mapping is performed by a convolutional network that is trained exclusively in simulation via privileged learning: imitating an expert with access to privileged information. By simulating realistic sensor noise, our approach achieves zero-shot transfer from simulation to challenging real-world environments that were never experienced during training: dense forests, snow-covered terrain, derailed trains, and collapsed buildings. Our work demonstrates that end-to-end policies trained in simulation enable high-speed autonomous flight through challenging environments, outperforming traditional obstacle avoidance pipelines.
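A schematic sketch of such a sensorimotor mapping is given below: a small convolutional network mapping a depth image and a state vector to candidate waypoints. Layer sizes, input shapes, and the three-waypoint output are illustrative assumptions, not the published architecture.

```python
import torch
import torch.nn as nn

class TrajectoryNet(nn.Module):
    """Minimal sensorimotor mapping: depth image + state -> candidate waypoints.

    A schematic stand-in for the kind of convolutional policy described in the
    abstract; all architectural details here are illustrative assumptions.
    """

    def __init__(self, n_waypoints=3):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Fuse image features with a 9-D state (velocity, attitude, goal dir.).
        self.head = nn.Sequential(
            nn.Linear(32 + 9, 64), nn.ReLU(),
            nn.Linear(64, n_waypoints * 3),   # (x, y, z) per waypoint
        )

    def forward(self, depth, state):
        features = self.encoder(depth)
        return self.head(torch.cat([features, state], dim=1)).view(-1, 3)

net = TrajectoryNet()
depth = torch.rand(1, 1, 96, 160)   # one (noisy) depth frame
state = torch.rand(1, 9)
print(net(depth, state))            # three body-frame waypoints
```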
11
Kim J, Lesak MC, Taylor D, Gonzalez DJ, Korpela CM. Autonomous Quadrotor Landing on Inclined Surfaces Using Perception-Guided Active Asymmetric Skids. IEEE Robot Autom Lett 2021. DOI: 10.1109/lra.2021.3101869.
12
Kong D, Zhang Y, Dai W. Direct Near-Infrared-Depth Visual SLAM With Active Lighting. IEEE Robot Autom Lett 2021. DOI: 10.1109/lra.2021.3096741.
13
Abstract
Because of their high maneuverability and fast deployment times, aerial robots have recently gained popularity for automating inspection tasks. In this paper, we address the visual inspection of vessel cargo holds, aiming at safer, more cost-efficient, and more intensive visual inspections of ships by means of a multirotor-type platform. To this end, the vehicle is equipped with a sensor suite able to supply the surveyor with imagery from relevant areas, while the control software supports the operator during flight with enhanced functionality and reliable autonomy. All of this has been accomplished within the supervised autonomy (SA) paradigm, making extensive use of behaviour-based high-level control (including obstacle detection and collision prevention) specifically devised for visual inspection. The full system has been evaluated both in the laboratory and in real environments, on board two different vessels. Results show the vehicle to be effective for this application, in particular thanks to the inspection-oriented capabilities it has been fitted with.
14
15
Li G, Ge R, Loianno G. Cooperative Transportation of Cable Suspended Payloads With MAVs Using Monocular Vision and Inertial Sensing. IEEE Robot Autom Lett 2021. DOI: 10.1109/lra.2021.3065286.
16
A method for autonomous collision-free navigation of a quadrotor UAV in unknown tunnel-like environments. Robotica 2021. DOI: 10.1017/s0263574721000849.
Abstract
Unmanned aerial vehicles (UAVs) have become essential tools for exploring, mapping, and inspecting unknown three-dimensional (3D) tunnel-like environments, which is a very challenging problem. This paper develops a computationally light navigation algorithm for quadrotor UAVs that autonomously guides the vehicle through such environments. It uses sensor observations to safely guide the UAV along the tunnel axis while avoiding collisions with its walls. The approach is evaluated in several computer simulations with realistic sensing models and in a practical implementation on a quadrotor UAV. The proposed method is also applicable to other UAV types and to autonomous underwater vehicles.
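A minimal sketch of the underlying idea of flying along the tunnel axis while avoiding the walls, assuming four range measurements around the vehicle and an illustrative gain; this is not the cited algorithm.

```python
import numpy as np

def tunnel_centering_command(ranges, forward_speed=0.5, k=0.8):
    """Velocity command that keeps a UAV near the tunnel axis.

    `ranges` is a dict with distances to the walls measured left, right, up,
    and down (e.g. from four range sensors or a sliced lidar scan). The sensor
    layout and gain are illustrative assumptions, not the cited algorithm.
    """
    # Positive lateral error means the right wall is farther -> move right.
    lateral_error = 0.5 * (ranges["right"] - ranges["left"])
    vertical_error = 0.5 * (ranges["up"] - ranges["down"])
    return np.array([forward_speed, k * lateral_error, k * vertical_error])

print(tunnel_centering_command({"left": 1.2, "right": 0.8, "up": 1.0, "down": 1.0}))
```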
17
Kim J, Jeon MH, Cho Y, Kim A. Dark Synthetic Vision: Lightweight Active Vision to Navigate in the Dark. IEEE Robot Autom Lett 2021. DOI: 10.1109/lra.2020.3035137.
18
Rubio-Sierra C, Domínguez D, Gonzalo J, Escapa A. Path Planner for Autonomous Exploration of Underground Mines by Aerial Vehicles. Sensors (Basel) 2020;20:s20154259. PMID: 32751686; PMCID: PMC7435854; DOI: 10.3390/s20154259.
Abstract
This paper presents a path planner that makes it possible to autonomously explore underground mines with aerial robots (typically multicopters). In these environments, operations may be limited by many factors, such as the lack of external navigation signals, narrow passages, and the absence of radio communications. The designed path planner is a simple and highly computationally efficient algorithm that, relying only on a laser imaging detection and ranging (LIDAR) sensor with simultaneous localization and mapping (SLAM) capability, permits the exploration of a set of single-level mining tunnels. It performs dynamic planning based on exploration vectors, a novel variant of the open sector method with reinforced filtering. The algorithm incorporates global awareness and obstacle avoidance modules: the first prevents the vehicle from getting trapped in a loop, whereas the second facilitates navigation along narrow tunnels. The performance of the proposed solution has been tested in different study cases with a hardware-in-the-loop (HIL) simulator developed for this purpose. In all situations the path planner behaved as expected and the chosen routes were optimal. Furthermore, path efficiency, measured in terms of traveled distance and elapsed time, was high compared with an ideal reference case. The result is a very fast, real-time algorithm with static memory allocation that, implemented on the proposed architecture, provides a feasible solution for the autonomous exploration of underground mines.
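As a simplified illustration of choosing an exploration direction from a planar LIDAR scan, the sketch below selects the most open angular sector; the exploration-vector bookkeeping and reinforced filtering of the cited planner are not modeled, and the sector count and clearance threshold are assumptions.

```python
import numpy as np

def most_open_sector(scan_angles, scan_ranges, n_sectors=16, min_clearance=1.0):
    """Pick a heading from a planar lidar scan by choosing the most open sector.

    A generic open-sector selection for illustration only. Returns the central
    angle (rad) of the sector with the largest worst-case clearance, or None
    if every sector is closer than `min_clearance`.
    """
    edges = np.linspace(-np.pi, np.pi, n_sectors + 1)
    best_angle, best_clearance = None, min_clearance
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (scan_angles >= lo) & (scan_angles < hi)
        if not np.any(mask):
            continue
        clearance = np.min(scan_ranges[mask])   # worst-case range in the sector
        if clearance > best_clearance:
            best_clearance = clearance
            best_angle = 0.5 * (lo + hi)
    return best_angle

angles = np.linspace(-np.pi, np.pi, 360, endpoint=False)
ranges = np.full(360, 2.0)
ranges[160:205] = 12.0                 # a wide opening roughly straight ahead
print(most_open_sector(angles, ranges))
```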
19
Zhao M, Shi F, Anzai T, Okada K, Inaba M. Online Motion Planning for Deforming Maneuvering and Manipulation by Multilinked Aerial Robot Based on Differential Kinematics. IEEE Robot Autom Lett 2020. DOI: 10.1109/lra.2020.2967285.
20
Petracek P, Kratky V, Saska M. Dronument: System for Reliable Deployment of Micro Aerial Vehicles in Dark Areas of Large Historical Monuments. IEEE Robot Autom Lett 2020. DOI: 10.1109/lra.2020.2969935.
21
Petrlik M, Baca T, Hert D, Vrba M, Krajnik T, Saska M. A Robust UAV System for Operations in a Constrained Environment. IEEE Robot Autom Lett 2020. DOI: 10.1109/lra.2020.2970980.
22
Lee ES, Loianno G, Thakur D, Kumar V. Experimental Evaluation and Characterization of Radioactive Source Effects on Robot Visual Localization and Mapping. IEEE Robot Autom Lett 2020. DOI: 10.1109/lra.2020.2975723.
23
Affiliation(s)
- Shehryar Khattak: Department of Computer Science and Engineering, University of Nevada, Reno, Nevada
- Kostas Alexis: Department of Computer Science and Engineering, University of Nevada, Reno, Nevada
24
A Drag Model-LIDAR-IMU Fault-Tolerance Fusion Method for Quadrotors. Sensors (Basel) 2019;19:s19194337. PMID: 31597280; PMCID: PMC6806240; DOI: 10.3390/s19194337.
Abstract
In this paper, a drag-model-aided fault-tolerant state estimation method is presented for quadrotors. First, the accuracy of the drag model is improved by modeling an angular-rate-related term and an angular-acceleration-related term, both associated with flight maneuvers. The drag model, light detection and ranging (LIDAR), and inertial measurement unit (IMU) are then fused within a federated Kalman filter framework. In the filter, LIDAR estimation faults are detected and isolated, and the disturbance to the drag model is estimated and compensated for. Experiments show that velocity and position estimation are improved compared with the traditional LIDAR/IMU fusion scheme.
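A minimal sketch of the fault-tolerant federated fusion idea: two local estimates (e.g. drag-model/IMU and LIDAR/IMU) are combined by information weighting, with a simple consistency gate that isolates the LIDAR estimate when it disagrees with the reference. The structure and numbers are illustrative assumptions, not the cited filter.

```python
import numpy as np

def federated_fuse(x_ref, P_ref, x_lidar, P_lidar, gate=9.0):
    """Two-filter federated fusion with a simple LIDAR fault gate.

    x_ref/P_ref come from a reference local filter (e.g. drag-model/IMU) and
    x_lidar/P_lidar from a LIDAR/IMU local filter. If the LIDAR estimate is
    inconsistent with the reference (chi-square-style gate on the difference),
    it is isolated and the reference estimate is returned unchanged. Generic
    illustration only, not the filter structure of the cited paper.
    """
    x_ref, x_lidar = np.asarray(x_ref, float), np.asarray(x_lidar, float)
    r = x_lidar - x_ref
    S = P_ref + P_lidar                       # covariance of the difference
    if r @ np.linalg.solve(S, r) > gate:
        return x_ref, P_ref                   # LIDAR fault detected: isolate it
    # Information-weighted (federated) combination of the two local estimates.
    I_ref, I_lidar = np.linalg.inv(P_ref), np.linalg.inv(P_lidar)
    P = np.linalg.inv(I_ref + I_lidar)
    x = P @ (I_ref @ x_ref + I_lidar @ x_lidar)
    return x, P

# Healthy case: the two estimates agree, so the fused result lies between them.
x, P = federated_fuse([1.00, 0.50], 0.09 * np.eye(2),
                      [1.10, 0.45], 0.04 * np.eye(2))
print(x)
```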
25
Nguyen T, Wozencraft J, Taylor CJ, Kumar V, Shivakumar SS, Miller ID, Keller J, Lee ES, Zhou A, Ozaslan T, Loianno G, Harwood JH. MAVNet: An Effective Semantic Segmentation Micro-Network for MAV-Based Tasks. IEEE Robot Autom Lett 2019. DOI: 10.1109/lra.2019.2928734.
26
Kar AK, Dhar NK, Mishra PK, Verma NK. Relative Vehicle Displacement Approach for Path Tracking Adaptive Controller With Multisampling Data Transmission. IEEE Transactions on Emerging Topics in Computational Intelligence 2019. DOI: 10.1109/tetci.2018.2865205.
27
Tardioli D, Riazuelo L, Sicignano D, Rizzo C, Lera F, Villarroel JL, Montano L. Ground robotics in tunnels: Keys and lessons learned after 10 years of research and experiments. J Field Robot 2019. DOI: 10.1002/rob.21871.
Affiliation(s)
- Danilo Tardioli: Centro Universitario de la Defensa, Zaragoza, Spain; Instituto de Investigación en Ingeniería de Aragón, University of Zaragoza, Zaragoza, Spain
- Luis Riazuelo: Instituto de Investigación en Ingeniería de Aragón, University of Zaragoza, Zaragoza, Spain
- Domenico Sicignano: Instituto de Investigación en Ingeniería de Aragón, University of Zaragoza, Zaragoza, Spain
- Carlos Rizzo: Eurecat, Centre Tecnològic de Catalunya, Robotics and Automation Unit, Barcelona, Spain
- Francisco Lera: Instituto de Investigación en Ingeniería de Aragón, University of Zaragoza, Zaragoza, Spain
- José L. Villarroel: Instituto de Investigación en Ingeniería de Aragón, University of Zaragoza, Zaragoza, Spain
- Luis Montano: Instituto de Investigación en Ingeniería de Aragón, University of Zaragoza, Zaragoza, Spain
28
Thomason J, Ratsamee P, Orlosky J, Kiyokawa K, Mashita T, Uranishi Y, Takemura H. A Comparison of Adaptive View Techniques for Exploratory 3D Drone Teleoperation. ACM Trans Interact Intell Syst 2019. DOI: 10.1145/3232232.
Abstract
Drone navigation in complex environments poses many problems to teleoperators. Especially in three-dimensional (3D) structures such as buildings or tunnels, viewpoints are often limited to the drone’s current camera view, nearby objects can be collision hazards, and frequent occlusion can hinder accurate manipulation.
To address these issues, we have developed a novel interface for teleoperation that provides a user with environment-adaptive viewpoints that are automatically configured to improve safety and provide smooth operation. This real-time adaptive viewpoint system takes robot position, orientation, and 3D point-cloud information into account to modify the user’s viewpoint to maximize visibility. Our prototype uses simultaneous localization and mapping (SLAM) based reconstruction with an omnidirectional camera, and we use the resulting models as well as simulations in a series of preliminary experiments testing navigation of various structures. Results suggest that automatic viewpoint generation can outperform first- and third-person view interfaces for virtual teleoperators in terms of ease of control and accuracy of robot operation.
29
Svacha J, Loianno G, Kumar V. Inertial Yaw-Independent Velocity and Attitude Estimation for High-Speed Quadrotor Flight. IEEE Robot Autom Lett 2019. DOI: 10.1109/lra.2019.2894220.
30
Yuan L, Reardon C, Warnell G, Loianno G. Human Gaze-Driven Spatial Tasking of an Autonomous MAV. IEEE Robot Autom Lett 2019. DOI: 10.1109/lra.2019.2895419.
31
Design Optimization of Sparse Sensing Array for Extended Aerial Robot Navigation in Deep Hazardous Tunnels. IEEE Robot Autom Lett 2019. DOI: 10.1109/lra.2019.2892796.
32
Pandya H, Gaud A, Kumar G, Krishna KM. Instance invariant visual servoing framework for part-aware autonomous vehicle inspection using MAVs. J Field Robot 2019. DOI: 10.1002/rob.21859.
Affiliation(s)
- Ayush Gaud: IIIT-Hyderabad, Hyderabad, Telangana, India
33
Ozaslan T, Loianno G, Keller J, Taylor CJ, Kumar V. Spatio-Temporally Smooth Local Mapping and State Estimation Inside Generalized Cylinders With Micro Aerial Vehicles. IEEE Robot Autom Lett 2018. DOI: 10.1109/lra.2018.2861888.
34
Loianno G, Spurny V, Thomas J, Baca T, Thakur D, Hert D, Penicka R, Krajnik T, Zhou A, Cho A, Saska M, Kumar V. Localization, Grasping, and Transportation of Magnetic Objects by a Team of MAVs in Challenging Desert-Like Environments. IEEE Robot Autom Lett 2018. DOI: 10.1109/lra.2018.2800121.
35
Quenzel J, Nieuwenhuisen M, Droeschel D, Beul M, Houben S, Behnke S. Autonomous MAV-based Indoor Chimney Inspection with 3D Laser Localization and Textured Surface Reconstruction. J Intell Robot Syst 2018. DOI: 10.1007/s10846-018-0791-y.
36
Loianno G, Kumar V. Cooperative Transportation Using Small Quadrotors Using Monocular Vision and Inertial Sensing. IEEE Robot Autom Lett 2018. DOI: 10.1109/lra.2017.2778018.