Dexterous Robotic-Hand Grasp Learning Using Piecewise Linear Dynamic Systems Model
Abstract Learning from sensor data plays an important role in robotic research, especially in dexterous robotic hand grasping. This manuscript focuses on learning from the tactile dynamic process during robotic hand grasping. A piecewise linear dynamic systems model and a group of models are presented; under their guidance, a proper gesture for different types of targets can be selected to facilitate stable and accurate grasping. The approach is evaluated on an experimental testbed and shows promising results.
W. Xiao (✉) · F. Sun · H. Liu · C. He
Tsinghua National Laboratory for Information Science and Technology, The State Key Laboratory of Intelligent Technology and Systems, Department of Computer Science and Technology, Tsinghua University, Beijing 100084, People's Republic of China
e-mail: [email protected]
F. Sun, e-mail: [email protected]
H. Liu, e-mail: [email protected]
C. He, e-mail: [email protected]

F. Sun et al. (eds.), Foundations and Practical Applications of Cognitive Systems and Information Processing, Advances in Intelligent Systems and Computing 215, DOI: 10.1007/978-3-642-37835-5_73, © Springer-Verlag Berlin Heidelberg 2014

1 Introduction
Research on robotic hands and their applications, long at the frontier of robotics, has occupied researchers for two decades. Under the guidance of artificial intelligence (AI), a robotic hand can perform more accurate and sophisticated manipulation tasks. Just as humans gain a good command of grasping after several attempts, a robotic hand can, as experience accumulates, also exhibit "intelligence" and capabilities to a certain extent
with the help of pattern analysis, machine learning, and other technologies within the AI domain.
Grasping with a dexterous robotic hand is a significant skill for robots in both industrial and home settings. As summarized in [1], identifying suitable contact locations, hand poses (both position and orientation), and force-exertion strategies for a robotic hand is confronted with three main sets of constraints: (a) constraints due to the limited capabilities of the dexterous hand, (b) constraints due to object geometry and material characteristics, and (c) constraints due to the task requirements. Analyzing, modeling, and integrating multi-sensor data can greatly alleviate all of these constraints. Many efforts have been made to achieve accurate and stable grasping with the help of various kinds of sensors; for example, visual sensing is widely used in [2], and tactile sensing is used in [3]. Our robotic hand system setup is shown in Fig. 1a. We learn from tactile data sampled by tactile sensors (see Fig. 1b) mounted on the surface of each finger and the palm of the robotic hand, and accomplish stable grasping of the target using our 7-DOF Schunk™ manipulator and 4-DOF BarrettHand™. After several training trials, the most probable object type of an unknown target can be inferred and the corresponding grasp gesture selected.
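To make the learning step concrete, the sketch below illustrates one simple way class-wise dynamic models can drive this inference: a linear dynamic model x_{t+1} ≈ A x_t + b is fitted to recorded tactile sequences of each object type, and an unknown target is assigned to the type whose model predicts its tactile trajectory best. This is only a minimal Python illustration, not the authors' implementation; it omits the piecewise (switching) structure of the paper's model, and the function names (fit_linear_model, classify_object) and data layout are assumptions.

    # Minimal illustration (assumed, simplified): one linear dynamic model per
    # object type, fitted by least squares to tactile feature sequences; an
    # unknown grasp is assigned to the type whose model predicts it best.
    # The piecewise/switching structure of the paper's model is omitted here.
    import numpy as np

    def fit_linear_model(sequences):
        # Fit x_{t+1} ~ A x_t + b over all training sequences of one object type.
        # Each sequence is a (T, d) array of tactile features sampled during a grasp.
        X = np.vstack([np.hstack([s[:-1], np.ones((len(s) - 1, 1))]) for s in sequences])
        Y = np.vstack([s[1:] for s in sequences])
        W, *_ = np.linalg.lstsq(X, Y, rcond=None)   # W stacks A and b, shape (d+1, d)
        return W

    def prediction_error(W, seq):
        # Mean one-step prediction error of a fitted model on one tactile sequence.
        X = np.hstack([seq[:-1], np.ones((len(seq) - 1, 1))])
        return float(np.mean(np.linalg.norm(X @ W - seq[1:], axis=1)))

    def classify_object(models, seq):
        # Return the object type whose model explains the new sequence best.
        return min(models, key=lambda label: prediction_error(models[label], seq))

    # Usage sketch (hypothetical data layout):
    # models = {label: fit_linear_model(train_seqs[label]) for label in train_seqs}
    # object_type = classify_object(models, new_tactile_sequence)

In practice, the inferred object type would index into a set of pre-learned grasp gestures, so the hand pose and force strategy can be chosen before contact is fully established.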