Relative Localization for Aerial Manipulation with PL-SLAM

Abstract This chapter presents PL-SLAM, a precise SLAM technique that simultaneously processes points and lines and tackles situations where point-only methods are prone to fail, such as poorly textured scenes or motion-blurred images in which feature points vanish. The method is remarkably robust to image noise and outperforms state-of-the-art point-based methods. It runs in real time on low-cost hardware.

1 Introduction The precise localization of an aerial robot is crucial for manipulation. In this section, we tackle the task of precise localization relative to a close-up workspace for robot inspection and manipulation. The method must be robust to poorly textured surfaces and, when the tracker is lost, must relocalize the robot as it passes over an already seen area. SLAM methods have proven effective at accurately estimating trajectories while keeping a record of previously seen areas.

A. Pumarola (B) · A. Agudo · F. Moreno-Noguer · A. Sanfeliu
CSIC-UPC, Institut de Robòtica i Informàtica Industrial, Llorens i Artigas 4-6, 08028 Barcelona, Spain
e-mail: [email protected]
A. Agudo e-mail: [email protected]
F. Moreno-Noguer e-mail: [email protected]
A. Sanfeliu e-mail: [email protected]
A. Vakhitov
Skolkovo Institute of Science and Technology, Ulitsa Nobelya, 3, 121205 Moskva, Moscow Oblast, Russia
e-mail: [email protected]
© Springer Nature Switzerland AG 2019
A. Ollero and B. Siciliano (eds.), Aerial Robotic Manipulation, Springer Tracts in Advanced Robotics 129, https://doi.org/10.1007/978-3-030-12945-3_17


Since the groundbreaking Parallel Tracking and Mapping (PTAM) [1] algorithm was introduced by Klein and Murray in 2007, many other real-time visual SLAM approaches have been proposed, including the feature point-based ORB-SLAM [2] and the direct methods LSD-SLAM [3] and RGBD-SLAM [4], which optimize directly over image pixels. Among them, ORB-SLAM appears to be the current state of the art, yielding better accuracy than its direct counterparts. However, it is prone to fail on poorly textured frames or when feature points temporarily vanish due to, e.g., motion blur. Such situations are often encountered in man-made workspaces. Despite the lack of reliable feature points, however, these environments typically contain a number of lines that can be used in a similar way. Building upon the ORB-SLAM framework, we present PL-SLAM (Point and Line SLAM) [5], a solution that can simultaneously leverage point and line information. Lines are parameterized by their endpoints, whose exact locations in the image plane are estimated following a two-step optimization process. This representation is robust to occlusions and mis-detections and enables integrating lines within the SLAM machinery. The resulting approach is very accurate in poorly textured environments and also improves on the performance of ORB-SLAM in highly textured sequences.
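To make the endpoint parameterization concrete, the following is a minimal NumPy sketch of one common way to score a line landmark in the image: project the 3D endpoints with the current camera pose and measure their signed distances to the infinite line through the detected 2D segment. The function names, the pinhole intrinsics `K`, and the 4×4 world-to-camera pose `T` are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def project(K, T, X):
    """Project a 3D point X (world frame) through pose T (4x4, world-to-camera)
    and pinhole intrinsics K, returning pixel coordinates."""
    Xc = T[:3, :3] @ X + T[:3, 3]
    uvw = K @ Xc
    return uvw[:2] / uvw[2]

def line_reprojection_error(K, T, P, Q, p_det, q_det):
    """Signed distances of the projected 3D endpoints P, Q to the infinite
    line through the detected 2D segment endpoints p_det, q_det."""
    # Homogeneous line through the detected endpoints, normalized so that
    # l @ x gives a point-to-line distance in pixels (a^2 + b^2 = 1).
    p_h = np.append(p_det, 1.0)
    q_h = np.append(q_det, 1.0)
    l = np.cross(p_h, q_h)
    l /= np.linalg.norm(l[:2])
    # Distances of both projected endpoints to the detected line; both are
    # zero when the projected segment lies exactly on the detection.
    e_p = l @ np.append(project(K, T, P), 1.0)
    e_q = l @ np.append(project(K, T, Q), 1.0)
    return np.array([e_p, e_q])
```

Measuring endpoint-to-line distance, rather than endpoint-to-endpoint distance, is what makes the representation tolerant of partial occlusion: a segment detected shorter than the map line still yields zero error as long as it lies on the same infinite line.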

2 PL-SLAM