Object Detection and Probabilistic Object Representation for Grasping with Two Arms
Abstract This chapter explains a technique for the detection and grasping of objects using the two arms of an aerial robot. The method robustly obtains the 6D pose of the object to be grasped, regardless of the environment. It is based on modeling the objects' surfaces under the probabilistic framework of Gaussian Processes. This probabilistic framework is proposed to tackle the problem of shape uncertainty when the robot has only partial information about the object to be manipulated. The uncertainty is modeled using Gaussian Process Implicit Surfaces (GPIS) and evaluated using the probability of force closure as quality metric.
1 Introduction

Object detection and surface modeling are crucial for any manipulation task. In this chapter, a method for modeling objects' surfaces for grasping using a probabilistic representation is presented. Nowadays, 3D sensors are increasingly used in robotics applications, as they provide rich information about the environment in the form of 3D point clouds. These data can be combined with RGB information from monocular cameras to detect objects in the scene. However, the measurements of these sensors may be distorted when working outdoors. For this reason, it is essential to smooth and model the object's surface in a form that can be used for grasping. First, an algorithm is introduced that uses 3D information from depth sensors to create a probabilistic model of the object's surface using Gaussian Process Implicit Surfaces (GPIS). The probabilistic information is later used for generating bi-manual grasps.
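As an illustration of this modeling step, the Python sketch below regresses an implicit surface from a noisy point cloud with a plain Gaussian Process. The squared-exponential kernel, its hyperparameters, the interior-point labelling scheme, and the toy sphere data are assumptions made for the example, not the chapter's actual implementation.

```python
# Minimal GPIS regression sketch (illustrative assumptions: zero mean function,
# squared-exponential kernel, toy hyperparameters; not the chapter's implementation).
import numpy as np

def rbf_kernel(A, B, length_scale=0.4, sigma_f=1.0):
    """Squared-exponential covariance between two point sets A (N x 3) and B (M x 3)."""
    d2 = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T
    return sigma_f**2 * np.exp(-0.5 * d2 / length_scale**2)

def gpis_regression(X_obs, y_obs, X_query, noise=1e-4):
    """Posterior mean and variance of the implicit surface function at X_query.

    Surface points are labelled 0 and an interior point -1 (exterior points +1),
    so the zero level set of the posterior mean approximates the object surface
    and the posterior variance expresses shape uncertainty.
    """
    K = rbf_kernel(X_obs, X_obs) + noise * np.eye(len(X_obs))  # covariance of observations
    K_s = rbf_kernel(X_obs, X_query)                           # observations vs. query points
    K_ss = rbf_kernel(X_query, X_query)                        # covariance of query points
    alpha = np.linalg.solve(K, y_obs)
    mean = K_s.T @ alpha                                       # regressed implicit values
    cov = K_ss - K_s.T @ np.linalg.solve(K, K_s)               # regressed covariance
    return mean, np.diag(cov)

# Toy usage: noisy depth measurements of a unit sphere plus one interior point.
rng = np.random.default_rng(0)
pts = rng.normal(size=(200, 3))
pts /= np.linalg.norm(pts, axis=1, keepdims=True)              # true surface points, f = 0
X_obs = np.vstack([pts + 0.01 * rng.normal(size=pts.shape),    # simulated sensor noise
                   np.zeros((1, 3))])                          # object centre, f = -1
y_obs = np.concatenate([np.zeros(len(pts)), [-1.0]])
mean, var = gpis_regression(X_obs, y_obs, rng.normal(size=(50, 3)))
```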
Table 1 Acronyms and symbols

  Definition                                                   Symbol
  Mean function of GP                                          m(·)
  Kernel function of GP                                        K(·,·)
  Regressed values of a GP given observations                  f∗
  Regressed covariance of a GP given observations              Σ∗
  Mean values of observations of a GP                          μ∗f
  Mean values of GP on set points                              μf
  Values of observations used for regression                   Y
  Covariance of observations of a GP                           K∗∗
  Partial covariance of observations and regression points     K∗
  Covariance of regression points                              K
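For reference, a minimal sketch of the Gaussian Process posterior that these symbols describe, assuming the usual regression form and writing K for the covariance of the observations, K∗ for the cross-covariance between observations and evaluation points, and K∗∗ for the covariance of the evaluation points (the exact assignments in the chapter's notation may differ):

```latex
\mu_f^{*} = m(X_{*}) + K_{*}^{\top} K^{-1}\bigl(Y - \mu_f\bigr), \qquad
\Sigma_{*} = K_{**} - K_{*}^{\top} K^{-1} K_{*}, \qquad
f_{*} \sim \mathcal{N}\!\left(\mu_f^{*}, \Sigma_{*}\right)
```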
Manipulation with robots has been studied extensively for a long time. Mathematical and mechanical models of robot hands and their interaction with the object are a fundamental aspect in the analysis of robotic manipulation. The vast number of combinations of objects and hand configurations makes this area of research challenging. Miller et al. [1] propose the use of primitive shapes that approximate the object, facilitating the generation of grasps. More recent research uses machine learning methods to generate better grasps, where the system is trained using synthetic datasets [2, 3] and reinforcement learning [4]. In addition to grasp generation, it is necessary to quantify the quality of the resulting grasps.
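To make the quality evaluation concrete, the following Python sketch estimates the probability of force closure by Monte Carlo sampling. It assumes a two-contact (bi-manual) grasp tested with the classic antipodal friction-cone condition for two frictional point contacts, a hypothetical friction coefficient, and a toy sampling routine standing in for draws from the GPIS posterior; it is not the chapter's exact formulation.

```python
# Hedged sketch: Monte Carlo estimate of the probability of force closure for a
# two-contact (bi-manual) grasp under shape uncertainty. Friction coefficient,
# force-closure test and sampling model are illustrative assumptions.
import numpy as np

def two_contact_force_closure(p1, n1, p2, n2, mu=0.5):
    """Force closure for two frictional point contacts: the line joining the
    contacts must lie inside both friction cones (half-angle arctan(mu)).
    n1 and n2 are inward-pointing contact normals."""
    line = p2 - p1
    line = line / np.linalg.norm(line)
    half_angle = np.arctan(mu)
    ang1 = np.arccos(np.clip(np.dot(line, n1), -1.0, 1.0))
    ang2 = np.arccos(np.clip(np.dot(-line, n2), -1.0, 1.0))
    return ang1 <= half_angle and ang2 <= half_angle

def probability_of_force_closure(samples, mu=0.5):
    """samples: list of (p1, n1, p2, n2) tuples drawn from the regressed surface
    model (e.g. by perturbing contact points and normals with the regressed
    covariance). Returns the fraction of samples that achieve force closure."""
    hits = sum(two_contact_force_closure(p1, n1, p2, n2, mu)
               for p1, n1, p2, n2 in samples)
    return hits / len(samples)

# Toy usage: nominal antipodal contacts on a sphere, perturbed by shape uncertainty.
rng = np.random.default_rng(1)
def draw_sample(sigma=0.05):
    p1, p2 = np.array([-1.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0])
    n1, n2 = np.array([1.0, 0.0, 0.0]), np.array([-1.0, 0.0, 0.0])  # inward normals
    n1 = n1 + sigma * rng.normal(size=3); n1 /= np.linalg.norm(n1)
    n2 = n2 + sigma * rng.normal(size=3); n2 /= np.linalg.norm(n2)
    return p1, n1, p2, n2

print(probability_of_force_closure([draw_sample() for _ in range(1000)]))
```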