1. Elamin A, Abdelaziz N, El-Rabbany A. A GNSS/INS/LiDAR Integration Scheme for UAV-Based Navigation in GNSS-Challenging Environments. Sensors 2022; 22:9908. [PMID: 36560277 PMCID: PMC9786841 DOI: 10.3390/s22249908]
Abstract
Unmanned aerial vehicle (UAV) navigation has recently been the focus of many studies. The most challenging aspect of UAV navigation is maintaining accurate and reliable pose estimation. In outdoor environments, global navigation satellite systems (GNSS) are typically used for UAV localization. However, relying solely on GNSS can pose safety risks in the event of receiver malfunction or antenna installation error. In this research, an unmanned aerial system (UAS) employing the Applanix APX15 GNSS/IMU board, a Velodyne Puck LiDAR sensor, and a Sony a7R II high-resolution camera was used to collect data for the purpose of developing a multi-sensor integration system. Due to a malfunctioning GNSS antenna, there were numerous prolonged GNSS signal outages, and the GNSS/INS processing failed with a position error exceeding 25 km. To resolve this issue and recover the precise trajectory of the UAV, a GNSS/INS/LiDAR integrated navigation system was developed. The LiDAR data were first processed using the optimized LOAM SLAM algorithm, which yielded position and orientation estimates. Pix4D Mapper software was then used to process the camera images with ground control points (GCPs), producing precise camera positions and orientations that served as ground truth. All sensor data were GPS-timestamped, and all datasets were sampled at 10 Hz to match the LiDAR scan rate. Two case studies were considered: a complete GNSS outage and aiding from a GNSS precise point positioning (PPP) solution. Compared with the complete GNSS outage, the results for the second case study improved significantly, with RMSE reductions of approximately 51% and 78% in the horizontal and vertical directions, respectively. The RMSE of the roll and yaw angles was also reduced, by 13% and 30%, respectively, while the RMSE of the pitch angle increased by about 13%.
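For reference, the horizontal and vertical RMSE figures quoted above follow the standard root-mean-square definition over a time-aligned trajectory. A minimal sketch in Python (the array layout, the east/north/up frame, and the function name are our illustrative assumptions, not the paper's code):

```python
import numpy as np

def trajectory_rmse(est, ref):
    """RMSE between estimated and reference positions.

    est, ref: (N, 3) arrays of east/north/up positions in metres,
    already time-aligned (e.g. both resampled to the 10 Hz LiDAR rate).
    Returns (horizontal_rmse, vertical_rmse).
    """
    err = est - ref
    # Horizontal: RMS of the 2D (east, north) error magnitude.
    horizontal = np.sqrt(np.mean(np.sum(err[:, :2] ** 2, axis=1)))
    # Vertical: RMS of the up-component error.
    vertical = np.sqrt(np.mean(err[:, 2] ** 2))
    return horizontal, vertical
```

A 51% horizontal improvement then simply means that the PPP-aided solution's horizontal RMSE is about 0.49 times that of the outage-only solution.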
Affiliation(s)
- Ahmed Elamin
- Department of Civil Engineering, Toronto Metropolitan University, Toronto, ON M5B 2K3, Canada
- Department of Civil Engineering, Faculty of Engineering, Zagazig University, Zagazig 44519, Egypt
- Nader Abdelaziz
- Department of Civil Engineering, Toronto Metropolitan University, Toronto, ON M5B 2K3, Canada
- Department of Civil Engineering, Tanta University, Tanta 31527, Egypt
- Ahmed El-Rabbany
- Department of Civil Engineering, Toronto Metropolitan University, Toronto, ON M5B 2K3, Canada
2. Yin H, Chen C, Hao C, Huang B. A Vision-based inventory method for stacked goods in stereoscopic warehouse. Neural Comput Appl 2022. [DOI: 10.1007/s00521-022-07551-4]
3. Elmokadem T, Savkin AV. Towards Fully Autonomous UAVs: A Survey. Sensors 2021; 21:6223. [PMID: 34577430 PMCID: PMC8473245 DOI: 10.3390/s21186223]
Abstract
Unmanned aerial vehicles have undergone rapid development in recent decades. This has made them very popular for various military and civilian applications, allowing us to reach places that were previously hard to access, in addition to saving time and lives. A highly desirable direction in unmanned aerial vehicle development is achieving fully autonomous missions, performing dedicated tasks with minimal human interaction. This paper therefore surveys recent developments in the field related to safe autonomous navigation, a critical component of the whole system. A great part of the paper focuses on advanced methods capable of producing three-dimensional avoidance maneuvers and safe trajectories. Research challenges related to unmanned aerial vehicle development are also highlighted.
4. Krátký V, Petráček P, Báča T, Saska M. An autonomous unmanned aerial vehicle system for fast exploration of large complex indoor environments. J Field Robot 2021. [DOI: 10.1002/rob.22021]
Affiliation(s)
- Vít Krátký
- Department of Cybernetics, Faculty of Electrical Engineering, Czech Technical University in Prague, Praha, Czech Republic
- Pavel Petráček
- Department of Cybernetics, Faculty of Electrical Engineering, Czech Technical University in Prague, Praha, Czech Republic
- Tomáš Báča
- Department of Cybernetics, Faculty of Electrical Engineering, Czech Technical University in Prague, Praha, Czech Republic
- Martin Saska
- Department of Cybernetics, Faculty of Electrical Engineering, Czech Technical University in Prague, Praha, Czech Republic
5. Lluvia I, Lazkano E, Ansuategi A. Active Mapping and Robot Exploration: A Survey. Sensors 2021; 21:2445. [PMID: 33918107 PMCID: PMC8037480 DOI: 10.3390/s21072445]
Abstract
Simultaneous localization and mapping addresses the problem of building a map of the environment without any prior information, based on the data obtained from one or more sensors. In most situations the robot is driven by a human operator, but some systems are capable of navigating autonomously while mapping, which is called active simultaneous localization and mapping. This strategy focuses on actively computing trajectories to explore the environment while building a map with minimum error. In this paper, a comprehensive review of the research developed in this field is provided, targeting the most relevant contributions in indoor mobile robotics.
Affiliation(s)
- Iker Lluvia
- Autonomous and Intelligent Systems Unit, Fundación Tekniker, 20600 Eibar, Gipuzkoa, Spain
- Elena Lazkano
- Robotics and Autonomous Systems Group (RSAIT), Computer Science and Artificial Intelligence Department, Faculty of Informatics, University of the Basque Country (UPV/EHU), 20018 Donostia, Gipuzkoa, Spain
- Ander Ansuategi
- Autonomous and Intelligent Systems Unit, Fundación Tekniker, 20600 Eibar, Gipuzkoa, Spain
6. Chen SW, Nardari GV, Lee ES, Qu C, Liu X, Romero RAF, Kumar V. SLOAM: Semantic Lidar Odometry and Mapping for Forest Inventory. IEEE Robot Autom Lett 2020. [DOI: 10.1109/lra.2019.2963823]
7. Petracek P, Kratky V, Saska M. Dronument: System for Reliable Deployment of Micro Aerial Vehicles in Dark Areas of Large Historical Monuments. IEEE Robot Autom Lett 2020. [DOI: 10.1109/lra.2020.2969935]
8. Hinas A, Ragel R, Roberts J, Gonzalez F. A Framework for Multiple Ground Target Finding and Inspection Using a Multirotor UAS. Sensors 2020; 20:272. [PMID: 31947777 PMCID: PMC6982733 DOI: 10.3390/s20010272]
Abstract
Small unmanned aerial systems (UASs) now have advanced waypoint-based navigation capabilities, which enable them to collect surveillance, wildlife ecology, and air quality data in new ways. The ability to remotely sense and find a set of targets, then descend and hover close to each target for an action, is desirable in many applications, including inspection, search and rescue, and spot spraying in agriculture. This paper proposes a robust framework for vision-based ground target finding and action using the high-level decision-making approach of Observe, Orient, Decide, and Act (OODA). The proposed framework was implemented as a modular software system using the Robot Operating System (ROS). The framework can be effectively deployed in different applications where single or multiple target detection and action is needed. The accuracy and precision of camera-based target position estimation from a low-cost UAS are not adequate for the task, due to errors and uncertainties in low-cost sensors, sensor drift, and target detection errors. External disturbances such as wind pose further challenges. The implemented framework was tested using two different test cases. Overall, the results show that the proposed framework is robust to localization and target detection errors and able to perform the task.
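The OODA decision loop named above can be outlined as a simple cyclic state machine. This is a generic sketch of the pattern, not the authors' ROS implementation; the phase names follow OODA, but the transition function and its arguments are invented for illustration:

```python
from enum import Enum, auto

class Phase(Enum):
    OBSERVE = auto()  # acquire a camera frame, detect candidate targets
    ORIENT = auto()   # fuse detections with the vehicle pose, rank targets
    DECIDE = auto()   # commit to a target, or abort on low confidence
    ACT = auto()      # descend, hover, perform the action, climb back

def ooda_step(phase, target_confirmed):
    """Advance one phase of the loop; fall back to OBSERVE after acting,
    or immediately when no target is confirmed at the DECIDE stage."""
    if phase is Phase.OBSERVE:
        return Phase.ORIENT
    if phase is Phase.ORIENT:
        return Phase.DECIDE
    if phase is Phase.DECIDE:
        return Phase.ACT if target_confirmed else Phase.OBSERVE
    return Phase.OBSERVE  # ACT completed: resume observing
```

In a modular system each phase would map to its own node (perception, estimation, planning, control), with the loop above acting as the high-level supervisor.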
Affiliation(s)
- Ajmal Hinas
- Robotics and Autonomous Systems, Queensland University of Technology (QUT), Brisbane City QLD 4000, Australia
- Correspondence: Tel.: +61-0470534175
- Roshan Ragel
- Department of Computer Engineering, University of Peradeniya (UOP), Peradeniya 20400, Sri Lanka
- Jonathan Roberts
- Robotics and Autonomous Systems, Queensland University of Technology (QUT), Brisbane City QLD 4000, Australia
- Felipe Gonzalez
- Robotics and Autonomous Systems, Queensland University of Technology (QUT), Brisbane City QLD 4000, Australia
9. Maciel-Pearson BG, Akcay S, Atapour-Abarghouei A, Holder C, Breckon TP. Multi-Task Regression-Based Learning for Autonomous Unmanned Aerial Vehicle Flight Control Within Unstructured Outdoor Environments. IEEE Robot Autom Lett 2019. [DOI: 10.1109/lra.2019.2930496]
10. Forward and Backward Visual Fusion Approach to Motion Estimation with High Robustness and Low Cost. Remote Sensing 2019. [DOI: 10.3390/rs11182139]
Abstract
We present a novel low-cost visual odometry method for estimating the ego-motion (self-motion) of ground vehicles by detecting the changes that motion induces on the images. Unlike traditional localization methods that use a differential global positioning system (GPS), a precise inertial measurement unit (IMU), or 3D LiDAR, the proposed method leverages only data from inexpensive visual sensors: forward and backward onboard cameras. Starting with spatial-temporal synchronization, the scale factor of the backward monocular visual odometry is estimated based on MSE optimization over a sliding window. Then, for trajectory estimation, an improved two-layer Kalman filter is proposed, comprising orientation fusion and position fusion. In the orientation fusion step, we use the trajectory error space, represented by unit quaternions, as the state of the filter. The resulting system enables high-accuracy, low-cost ego-pose estimation and is robust to camera module degradation, automatically reducing the confidence of a failed sensor in the fusion pipeline. It can therefore operate in the presence of complex and highly dynamic motion, such as entering and exiting tunnels, texture-less scenes, illumination changes, bumpy roads, and even failure of one of the cameras. The experiments carried out in this paper show that our algorithm achieves the best performance on the evaluation indexes of average error in distance (AED), average error in the X direction (AEX), average error in the Y direction (AEY), and root mean square error (RMSE), compared to other state-of-the-art algorithms.
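The sliding-window MSE scale step described above has a closed-form least-squares solution: the scale s minimizing the sum of ||s·d_mono − d_ref||² over the window is the ratio of inner products ⟨d_mono, d_ref⟩ / ⟨d_mono, d_mono⟩. A sketch under the assumption that the two displacement streams are already time-synchronized (the class and variable names are ours, not the paper's):

```python
import numpy as np
from collections import deque

class ScaleEstimator:
    """Estimate the monocular scale s minimizing
    sum ||s * d_mono - d_ref||^2 over a sliding window of
    per-frame displacement vectors."""

    def __init__(self, window=50):
        self.mono = deque(maxlen=window)  # unscaled monocular displacements
        self.ref = deque(maxlen=window)   # metrically scaled references

    def update(self, d_mono, d_ref):
        self.mono.append(np.asarray(d_mono, dtype=float))
        self.ref.append(np.asarray(d_ref, dtype=float))
        m = np.concatenate(self.mono)
        r = np.concatenate(self.ref)
        # d/ds sum (s*m - r)^2 = 0  =>  s = <m, r> / <m, m>
        return float(m @ r / (m @ m))
```

The bounded deque gives the sliding-window behavior for free: old displacement pairs drop out as new ones arrive, so the scale estimate tracks slow drift.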
11. Efficient Lazy Theta* Path Planning over a Sparse Grid to Explore Large 3D Volumes with a Multirotor UAV. Sensors 2019; 19:174. [PMID: 30621305 PMCID: PMC6339096 DOI: 10.3390/s19010174]
Abstract
Exploring large, unknown, and unstructured environments is challenging for unmanned aerial vehicles (UAVs), but they are valuable tools for inspecting large structures safely and efficiently. The Lazy Theta* path-planning algorithm is revisited and adapted to generate paths fast enough to be used in real time, outdoors, in large 3D scenarios. In real unknown scenarios, a given minimum safety distance to the nearest obstacle or unknown space must be observed, which increases the number of obstacle-detection queries and creates a bottleneck in the path-planning algorithm. We reduce the dimension of the problem by exploiting geometrical properties to speed up these computations. We also apply a non-regular grid representation of the world to increase the performance of the path-planning algorithm. In particular, a sparse-resolution grid in the form of an octree is used, organizing the measurements spatially and merging voxels of the same state. Additionally, the number of neighbors is trimmed to match the sparse tree, reducing the number of obstacle-detection queries. The development methodology adopted was test-driven development (TDD), and the outcome was evaluated in real outdoor flights with a multirotor UAV. The results show an over 90 percent decrease in overall path-generation computation time. Furthermore, our approach scales well as the safety distance increases.
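The "lazy" part of Lazy Theta* is the deferral of line-of-sight checks: each expanded neighbor optimistically inherits its predecessor's parent, and visibility is verified only when a node is popped. A compact 2D sketch of that idea, not the paper's sparse-octree 3D implementation; the grid representation, the Bresenham visibility test, and all names are our illustrative choices (Lazy Theta* trades strict optimality for far fewer visibility queries):

```python
import heapq
import math

def lazy_theta_star(grid, start, goal):
    """Any-angle path on a small 2D occupancy grid (truthy cell = blocked).

    Lazy variant: neighbors optimistically inherit the current node's
    parent with no line-of-sight check; the check is deferred to pop
    time, and the parent is repaired there if visibility fails."""
    rows, cols = len(grid), len(grid[0])

    def neighbors(c):
        x, y = c
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                nx, ny = x + dx, y + dy
                if (dx or dy) and 0 <= nx < rows and 0 <= ny < cols \
                        and not grid[nx][ny]:
                    yield (nx, ny)

    def line_of_sight(a, b):
        # Bresenham traversal; fails if any visited cell is blocked.
        (x0, y0), (x1, y1) = a, b
        dx, dy = abs(x1 - x0), abs(y1 - y0)
        sx, sy = (1 if x1 >= x0 else -1), (1 if y1 >= y0 else -1)
        err = dx - dy
        while True:
            if grid[x0][y0]:
                return False
            if (x0, y0) == (x1, y1):
                return True
            e2 = 2 * err
            if e2 > -dy:
                err -= dy
                x0 += sx
            if e2 < dx:
                err += dx
                y0 += sy

    h = lambda n: math.dist(n, goal)
    g = {start: 0.0}
    parent = {start: start}
    heap = [(h(start), start)]
    closed = set()
    while heap:
        _, s = heapq.heappop(heap)
        if s in closed:
            continue
        if not line_of_sight(parent[s], s):      # deferred (lazy) check
            done = [n for n in neighbors(s) if n in closed]
            parent[s] = min(done, key=lambda n: g[n] + math.dist(n, s))
            g[s] = g[parent[s]] + math.dist(parent[s], s)
        closed.add(s)
        if s == goal:
            path = [s]
            while path[-1] != start:
                path.append(parent[path[-1]])
            return path[::-1]
        for n in neighbors(s):
            if n in closed:
                continue
            ng = g[parent[s]] + math.dist(parent[s], n)  # assume LOS holds
            if ng < g.get(n, math.inf):
                g[n], parent[n] = ng, parent[s]
                heapq.heappush(heap, (ng + h(n), n))
    return None
```

On an empty grid the returned path collapses to the straight segment from start to goal, which is exactly the any-angle behavior that distinguishes Theta* variants from A*.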
12. Visual-Based SLAM Configurations for Cooperative Multi-UAV Systems with a Lead Agent: An Observability-Based Approach. Sensors 2018; 18:4243. [PMID: 30513949 PMCID: PMC6308766 DOI: 10.3390/s18124243]
Abstract
In this work, the problem of cooperative visual-based SLAM for the class of multi-UAV systems that integrates a lead agent is addressed. In these systems, a team of aerial robots flying in formation must follow a dynamic lead agent, which can be another aerial robot, a vehicle, or even a human. A fundamental problem for such systems is the estimation of the states of the aerial robots as well as the state of the lead agent. In this work, a cooperative visual-based SLAM approach is studied in order to solve this problem. Three different system configurations are proposed and investigated by means of an intensive nonlinear observability analysis. In addition, a high-level control scheme is proposed that allows the formation of the UAVs to be controlled with respect to the lead agent. Several theoretical results are obtained, together with an extensive set of computer simulations, which are presented in order to numerically validate the proposal and to show that it performs well under different circumstances (e.g., GPS-challenging environments). That is, the proposed method operates robustly under many conditions, providing a good position estimate of the aerial vehicles and the lead agent.
13. Trujillo JC, Munguia R, Guerra E, Grau A. Cooperative Monocular-Based SLAM for Multi-UAV Systems in GPS-Denied Environments. Sensors 2018; 18:1351. [PMID: 29701722 PMCID: PMC5981868 DOI: 10.3390/s18051351]
Abstract
This work presents a cooperative monocular-based SLAM approach for multi-UAV systems that can operate in GPS-denied environments. The main contribution is to show that, using visual information obtained from monocular cameras mounted onboard aerial vehicles flying in formation, the observability properties of the whole system are improved. This is especially notable when compared with other related visual SLAM configurations. In order to improve the observability properties, measurements of the relative distances between the UAVs are included in the system; these relative distances are also obtained from visual information. The proposed approach is theoretically validated by means of a nonlinear observability analysis, and an extensive set of computer simulations is presented in order to validate it further. The numerical simulation results show that the proposed system provides good position and orientation estimation of the aerial vehicles flying in formation.
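Nonlinear observability analyses such as the one referred to above are typically rank tests on a matrix of Lie derivatives of the measurement model; for a linear (or linearized) system the same idea reduces to the classical Kalman observability rank condition, which is easy to check numerically. A generic sketch of that linear rank test (the toy matrices below are ours, not this paper's multi-UAV system model):

```python
import numpy as np

def observability_rank(A, C):
    """Rank of the observability matrix O = [C; CA; ...; CA^(n-1)].
    Full rank (== n) means the state is observable from the outputs."""
    n = A.shape[0]
    blocks, CAk = [], C.copy()
    for _ in range(n):
        blocks.append(CAk)
        CAk = CAk @ A          # next block row: C A^k
    return np.linalg.matrix_rank(np.vstack(blocks))

# Toy discrete-time example: constant-velocity state [position, velocity].
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
C_pos = np.array([[1.0, 0.0]])  # measuring position: fully observable
C_vel = np.array([[0.0, 1.0]])  # measuring only velocity: position is not
```

The velocity-only case is rank-deficient (rank 1 of 2), which mirrors the paper's argument: adding inter-vehicle relative-distance measurements raises the rank of the cooperative system and thereby improves observability.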
Affiliation(s)
- Juan-Carlos Trujillo
- Department of Computer Science, CUCEI, University of Guadalajara, Guadalajara 44430, Mexico.
- Rodrigo Munguia
- Department of Computer Science, CUCEI, University of Guadalajara, Guadalajara 44430, Mexico.
- Edmundo Guerra
- Department of Automatic Control, Technical University of Catalonia UPC, 08034 Barcelona, Spain.
- Antoni Grau
- Department of Automatic Control, Technical University of Catalonia UPC, 08034 Barcelona, Spain.
14. Faria M, Maza I, Viguria A. Applying Frontier Cells Based Exploration and Lazy Theta* Path Planning over Single Grid-Based World Representation for Autonomous Inspection of Large 3D Structures with an UAS. J Intell Robot Syst 2018. [DOI: 10.1007/s10846-018-0798-4]