Feature-based visual navigation integrity monitoring for urban autonomous platforms
ORIGINAL PAPER
Shizhuang Wang1 · Xingqun Zhan1 · Yuanwen Fu1 · Yawei Zhai1

Received: 14 May 2020 / Revised: 13 July 2020 / Accepted: 29 July 2020 / Published online: 13 August 2020
© Shanghai Jiao Tong University 2020
Abstract

Visual navigation systems have increasingly been adopted in many urban safety-critical applications, such as urban air mobility and highly automated vehicles, for which they must continuously provide accurate and safety-assured pose estimates. Extensive studies have focused on improving visual navigation accuracy and robustness in complex environments, while insufficient attention has been paid to ensuring navigation safety in the presence of outliers. From a safety perspective, integrity is the most important navigation performance criterion because it measures the trust that can be placed in the correctness of the navigation output. By leveraging the concept of integrity, this paper develops an integrity monitoring framework to protect the visual navigation system against misleading measurements and to quantify the reliability of the navigation output. We first present the iterative least squares (LS)-based pose estimation algorithm and derive the associated covariance estimation methodology. Then we develop a two-layer fault detection scheme by combining random sample consensus (RANSAC) with multiple hypothesis solution separation (MHSS) to achieve high efficiency and high reliability. Finally, the framework determines a probabilistic error bound on the navigation output that rigorously captures the undetected faults and the measurement uncertainty. The proposed algorithms are validated in various simulations, and the results suggest promising performance.

Keywords Visual navigation · Safety · Integrity · Fault detection · Autonomous systems
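The estimation step summarized above (iterative LS pose estimation with an associated covariance) can be illustrated with a minimal linearized sketch. This is not the paper's implementation; the function and variable names below are illustrative only, and the toy example uses a generic linear measurement model in place of the actual camera-feature Jacobian.

```python
import numpy as np

def ls_update(H, z, R):
    """One weighted least-squares update, as in a single Gauss-Newton iteration.

    H : (n, m) measurement Jacobian (hypothetical stand-in for the feature Jacobian)
    z : (n,)   measurement residual vector
    R : (n, n) measurement noise covariance

    Returns the state correction dx and its covariance P = (H^T R^-1 H)^-1,
    i.e., the kind of covariance estimate the abstract refers to.
    """
    W = np.linalg.inv(R)              # weight matrix from the measurement noise model
    P = np.linalg.inv(H.T @ W @ H)    # covariance of the LS estimate
    dx = P @ (H.T @ W @ z)            # weighted least-squares solution
    return dx, P

# Toy linear example: three scalar measurements of a 2-element state.
H = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
x_true = np.array([1.0, 2.0])
z = H @ x_true                        # noise-free residuals, for illustration only
dx, P = ls_update(H, z, np.eye(3))
```

In the full pipeline this update would be applied repeatedly (relinearizing H at each iterate), and P would feed both the solution-separation test statistics and the final error bound.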
1 Introduction

Vision-based or vision-aided navigation systems have attracted wide interest because of their outstanding performance in urban environments, where the Global Navigation Satellite System (GNSS) becomes seriously vulnerable [12]. Accordingly, visual navigation is an attractive option for localization in urban safety-critical applications, such as urban air mobility (UAM) and highly automated vehicles (HAVs), which are expected to bring great benefits to society [1]. For these applications, ensuring the safety of visual localization is the top priority when designing the navigation algorithms. This is because, on the one hand, visual navigation is highly sensitive to operating and environmental conditions, such as texture, image blur and illumination changes; the navigation system may therefore perform well under some conditions but become unreliable in other environments. On the other hand, failure to correctly perform the localization task might lead to catastrophic damages to the

* Xingqun Zhan [email protected]

1 School of Aeronautics and Astronautics, Shanghai Jiao Tong University, Shanghai 200240, China