Tracking and grasping of moving target based on accelerated geometric particle filter on colored image
GONG ZeYu, QIU ChunRong, TAO Bo, BAI HaiSheng, YIN ZhouPing & DING Han
State Key Laboratory of Digital Manufacturing Equipment and Technology, Huazhong University of Science and Technology, Wuhan 430074, China
Received April 21, 2020; accepted July 9, 2020; published online September 15, 2020
Visual tracking and grasping of moving objects is a challenging task in the field of robotic manipulation, with great potential in applications such as human-robot collaboration. Based on the particle filtering framework and position-based visual servoing, this paper proposes a new method for visual tracking and grasping of randomly moving objects. A geometric particle filter tracker is established for visual tracking. To address the efficiency issue of particle filtering, edge detection and morphological dilation are employed to reduce the computational burden of geometric particle filtering. Meanwhile, the HSV image feature is used instead of the grayscale feature to improve the tracking algorithm's robustness to illumination changes. A grasping strategy combining tracking and interception is adopted along with the position-based visual servoing (PBVS) method to achieve stable grasping of the target. Comprehensive comparisons on an open-source dataset and extensive experiments on a real robot system demonstrate that the proposed method has competitive performance in tracking and grasping randomly moving objects.

Keywords: visual tracking, robotic grasping, geometric particle filtering, visual servoing

Citation:
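The acceleration idea in the abstract, restricting particle evaluation to a dilated edge region, can be sketched as follows. This is a minimal NumPy-only illustration, not the paper's implementation: the gradient-threshold edge detector, the square structuring element, and the function names (`edge_mask`, `dilate`, `prune_particles`) are stand-ins chosen for self-containment.

```python
import numpy as np

def edge_mask(gray, thresh=0.2):
    """Binary edge map from gradient magnitude (a stand-in for a Sobel/Canny detector)."""
    gy, gx = np.gradient(gray.astype(float))
    return np.hypot(gx, gy) > thresh

def dilate(mask, r=2):
    """Morphological dilation with a (2r+1)x(2r+1) square structuring element.
    Implemented by OR-ing shifted copies; np.roll wraps at borders, which is
    acceptable for this sketch."""
    out = np.zeros_like(mask)
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            out |= np.roll(np.roll(mask, dy, axis=0), dx, axis=1)
    return out

def prune_particles(particles, mask):
    """Keep only particles whose (x, y) centers land inside the dilated edge
    region, so the expensive likelihood is evaluated for far fewer particles."""
    xs = np.clip(particles[:, 0].astype(int), 0, mask.shape[1] - 1)
    ys = np.clip(particles[:, 1].astype(int), 0, mask.shape[0] - 1)
    return particles[mask[ys, xs]]

# Toy image: a vertical intensity step produces edges near column 5.
gray = np.zeros((10, 10))
gray[:, 5:] = 1.0
region = dilate(edge_mask(gray), r=1)
particles = np.array([[4.0, 2.0], [0.0, 0.0]])  # (x, y) hypotheses
survivors = prune_particles(particles, region)  # only the near-edge particle remains
```

The particles that survive pruning would then be weighted by the HSV-feature likelihood, rather than evaluating every particle over the whole image.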
Gong Z Y, Qiu C R, Tao B, et al. Tracking and grasping of moving target based on accelerated geometric particle filter on colored image. Sci China Tech Sci, 2020, 63, https://doi.org/10.1007/s11431-020-1688-2
1 Introduction
Visually guided grasping is by far the most mature solution in the field of robotic grasping, and has been extensively used in numerous industrial scenarios [1]. In recent decades, visually guided grasping of stationary objects has been thoroughly studied, yielding many remarkable results [2–4]. However, comparatively less work has been done on the grasping of moving objects [5], which is recognized as a more challenging topic with equally important applications, such as human-robot collaboration [5], dynamic assembly [6] and space robot manipulation [7]. The challenge of visual tracking and grasping in this type of application mainly comes from the uncertainty of the robot's own or the target's random movement, which imposes higher requirements on the real-time performance and robustness of the tracking algorithm. In recent years, researchers have also achieved many impressive results in the field of moving object tracking and grasping. Besides the most common industrial scenario of picking and placing goods on one-dimensional moving conveyor belts [8–10], one of the most representative applications is the catching of free-flying objects. Billard