Analysis on Observability and Performance of INS-Range Integrated Navigation System Under Urban Flight Environment



ORIGINAL ARTICLE

Byungjin Lee 1 · Dong‑gyun Kim 2 · Juhwan Lee 2 · Sangkyung Sung 3

Received: 17 February 2020 / Revised: 10 July 2020 / Accepted: 5 August 2020 / Published online: 11 August 2020
© The Korean Institute of Electrical Engineers 2020

Abstract This paper investigates the observability and navigation performance of an integrated navigation system for onboard operation in urban building forests. An INS mechanization augmented with rangefinders is employed to achieve navigation performance with an urban geographical map. First, a model of the range-inertial integrated navigation system is presented for the purpose of performing observability analysis in a complex urban environment. Next, an analysis formula for the filter observability is derived from the proposed system and measurement models. To examine the validity of the model, the relationship between observability and estimation performance is analyzed through simulation and flight experiments. In addition, a comparative study using ICP matching based integration is presented. The simulation study employs a simple map environment for an intuitive analysis, in which the error characteristics are related to the observability rank. Finally, a practical test result using a multi-copter system is presented to address the correlation between observability and navigation performance through a real flight adjacent to tall buildings.

Keywords Observability · Navigation performance · Integrated navigation · Range · Inertial · Urban navigation
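The abstract relates estimation error characteristics to the rank of an observability matrix, but the paper's system and measurement matrices are not given in this excerpt. As a hedged illustration of the general idea only, the sketch below computes the rank of the linear observability matrix O = [H; HF; …; HF^(n-1)] for a toy constant-velocity model; the matrices here are hypothetical placeholders, not the paper's INS-range model:

```python
import numpy as np

def observability_rank(F, H):
    """Rank of the observability matrix O = [H; HF; ...; HF^(n-1)]
    for a discrete linear system x_{k+1} = F x_k, z_k = H x_k."""
    n = F.shape[0]
    blocks = [H]
    for _ in range(n - 1):
        blocks.append(blocks[-1] @ F)
    O = np.vstack(blocks)
    return np.linalg.matrix_rank(O)

# Toy example: state [position, velocity], constant-velocity dynamics.
dt = 0.1
F = np.array([[1.0, dt],
              [0.0, 1.0]])

# Measuring position makes both states observable (full rank 2) ...
H_pos = np.array([[1.0, 0.0]])
# ... while measuring velocity alone leaves position unobservable (rank 1).
H_vel = np.array([[0.0, 1.0]])

print(observability_rank(F, H_pos))  # 2
print(observability_rank(F, H_vel))  # 1
```

A rank deficiency, as in the velocity-only case, indicates state directions the filter cannot estimate; the paper's analysis connects such deficiencies to the surrounding urban geometry seen by the rangefinders.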

* Corresponding author: Sangkyung Sung, [email protected]
Byungjin Lee, [email protected] · Dong‑gyun Kim, [email protected] · Juhwan Lee, [email protected]

1 Defense Agency for Technology and Quality, Jinju, South Korea
2 Department of Aerospace Information System Engineering, Konkuk University, Seoul, South Korea
3 Department of Mechanical and Aerospace Engineering, Konkuk University, Seoul, South Korea

1 Introduction

In recent years, laser range sensors have been widely used in many applications, including autonomous vehicles, ground robots, and drones. Notably, 2D and 3D LiDAR, advanced forms of the laser range sensor, provide point clouds, which can be used as sampled images similar to those from vision cameras. Since the laser sensor output is derived from the time of flight (ToF) of light, its range information is more reliable than the depth image from a camera. Strong demand for automated driving has driven the recent evolution of laser sensors, and many studies now employ them. For navigation purposes, fixed geometry must be located around the sensors, so many studies focus on indoor or near-wall environments [1–3]. Traditionally, SLAM (simultaneous localization and mapping) has been one of the leading research streams in indoor or near-wall environments, and the related studies using laser ranging sensors