An Autonomous Visual-Inertial-Based Navigation System for Quadrotor
Abstract. In this paper, we present a practical visual-inertial autonomous navigation system for a quadrotor. Motivated by the practical engineering requirement of improving the applicability of advanced visual-inertial fusion and 3D mapping algorithms, we implement online 3D trajectory planning and tracking control with full consideration of the UAV dynamics, and complete a quadrotor autonomous navigation system consisting of the UAV, an upper computer, and other software and hardware components. Its feasibility is verified in actual flight experiments. The results show that the system achieves high-precision positioning, online 3D reconstruction, and dynamic autonomous navigation in complex unknown environments without GPS. The system offers good real-time accuracy and robustness, which provides strong technical support for subsequent extensions of the platform's functionality.

Keywords: Quadrotor · Visual-inertial · TSDF · Trajectory planning · Autonomous navigation

1 Introduction
With the increasingly intelligent development and application of UAVs, autonomous navigation has attracted more and more attention as a key step toward fully intelligent UAVs. Accurate position and attitude estimation is the premise of UAV autonomous navigation. Common positioning methods include the Global Positioning System (GPS), motion capture systems, lidar, and vision. Compared with the other methods, visual positioning has the advantages of small size, low power consumption, low price, and rich scene information, and has gradually become the mainstream positioning approach for mobile robots. However, the stability and robustness of visual-only Simultaneous Localization And Mapping (SLAM) [14] systems still need improvement under illumination variations and motion blur. To compensate for the shortcomings of visual-only SLAM, and exploiting the complementarity of the Inertial Measurement Unit (IMU) and the visual sensor, a Visual-Inertial Odometry (VIO) with better stability, robustness, and accuracy is obtained by fusing the two.
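To make the complementarity concrete, the following is a minimal, illustrative sketch (not the algorithm used in this paper): a one-dimensional Kalman filter that propagates the state with high-rate IMU accelerations and corrects it with lower-rate visual position fixes. The class name, noise parameters, and sensor rates are assumptions chosen only for illustration; a real VIO estimates the full 6-DoF pose together with IMU biases, but the underlying idea is the same: the IMU bridges the gap between camera frames, while vision bounds the drift of IMU integration.

```python
# Illustrative sketch only: 1-D loosely coupled IMU/vision fusion with a Kalman filter.
import numpy as np

class SimpleVioFusion1D:
    def __init__(self, accel_noise=0.5, vision_noise=0.05):
        self.x = np.zeros(2)          # state: [position, velocity]
        self.P = np.eye(2)            # state covariance
        self.q = accel_noise ** 2     # IMU acceleration noise variance (assumed)
        self.r = vision_noise ** 2    # visual position noise variance (assumed)

    def predict(self, accel, dt):
        """Propagate the state with one IMU acceleration sample."""
        F = np.array([[1.0, dt], [0.0, 1.0]])
        B = np.array([0.5 * dt ** 2, dt])
        self.x = F @ self.x + B * accel
        self.P = F @ self.P @ F.T + np.outer(B, B) * self.q

    def update(self, visual_pos):
        """Correct the state with a (slower) visual position measurement."""
        H = np.array([[1.0, 0.0]])
        S = H @ self.P @ H.T + self.r            # innovation covariance
        K = self.P @ H.T / S                     # Kalman gain
        self.x = self.x + (K * (visual_pos - H @ self.x)).ravel()
        self.P = (np.eye(2) - K @ H) @ self.P

# Usage sketch: IMU at 200 Hz, camera at 20 Hz (placeholder measurements).
fusion = SimpleVioFusion1D()
for step in range(200):
    fusion.predict(accel=0.1, dt=0.005)
    if step % 10 == 0:
        fusion.update(visual_pos=fusion.x[0])    # placeholder visual fix
print(fusion.x)
```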
Motion planning is also an essential part of the UAV integrated navigation system. With the diversification of UAV application scenarios, path planning schemes designed for common ground robots cannot meet the requirements of fast navigation in unknown 3D environments. Motion planning must account for the complex motion and dynamics of the flight system, online real-time planning ability, safety and feasibility, and the completeness of the planning results. Scholarly work on UAV navigation is extensive. As early as 2011, Kumar et al. demonstrated autonomous micro quadrotor navigation integrating multiple sensors such as a monocular camera, an IMU, and lidar [1].