High Level Sensor Data Fusion of Radar and Lidar for Car-Following on Highways
Abstract We present a real-time algorithm that enables an autonomous car to comfortably follow other cars at various speeds while keeping a safe distance. We focus on highway scenarios. A velocity and distance regulation approach is presented that depends on the position as well as the velocity of the followed car. Radar sensors provide reliable information on straight lanes, but fail in curves due to their restricted field of view. Lidar sensors, on the other hand, cover the regions of interest in almost all situations, but do not provide precise speed information. We combine the advantages of both sensors with a sensor fusion approach in order to provide permanent and precise spatial and dynamical data. We describe in detail our results from highway experiments in real traffic.
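The distance regulation described in the abstract depends on both the gap to the followed car and its relative velocity. As a minimal illustration of such a policy, the sketch below implements a generic PD-style gap controller with a constant-time-headway spacing rule; the gains, the headway policy, and the function name are illustrative assumptions, not the controller used by the authors.

```python
def following_acceleration(gap, gap_rate, ego_speed, time_headway=1.8,
                           standstill_gap=5.0, kp=0.3, kd=0.8):
    """PD-style car-following controller (illustrative, not the authors').

    gap       -- measured longitudinal distance to the lead car [m]
    gap_rate  -- rate of change of the gap, i.e. relative velocity [m/s]
    ego_speed -- own speed [m/s]
    Returns a commanded acceleration [m/s^2].
    """
    # Speed-dependent desired gap: constant-time-headway spacing policy.
    desired_gap = standstill_gap + time_headway * ego_speed
    # Proportional term closes the gap error; derivative term damps
    # the approach using the measured relative velocity.
    return kp * (gap - desired_gap) + kd * gap_rate

# Example: 60 m gap while driving 25 m/s, closing at 2 m/s.
a = following_acceleration(gap=60.0, gap_rate=-2.0, ego_speed=25.0)
```

With these illustrative gains the desired gap at 25 m/s is 50 m, so the 10 m surplus yields a mild positive acceleration, damped by the closing rate.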
1 Introduction

The interest in autonomous cars has grown in recent years, as they provide new insights for general robotic systems in areas like safety, machine learning and environmental perception [1]. One key aspect of driving autonomous cars is the detection of obstacles in order to ensure safety and to prevent collisions with other road users. The main challenge here is to keep track of the followed vehicle even at large distances, on winding roads, and at high speeds. The application also demands a precise estimation of the other cars' position and velocity. The approach presented here fuses data from lidar scanners and from a long-range radar.

There exists a broad spectrum of sensors (including radar, ultrasonic, lidar, camera, etc.) that are suitable for object detection. Unfortunately, data from one specific sensor is usually not enough to infer all the information needed for fully autonomous driving under all possible conditions. For that reason, sensor data fusion has been widely used to construct more reliable and robust systems. A review of state-of-the-art techniques and problems in multisensor data fusion is given in [2]. The fusion of camera and lidar data is one of the most popular approaches and has been studied, e.g., in [3]. An approach for obstacle detection and tracking combining stereo vision and radar is presented in [4]. The fusion of lidar and radar data has also been studied earlier.

This work is based on "Radar/Lidar Sensor Fusion for Car-Following on Highways" by Daniel Göhring, Miao Wang, Michael Schnürmacher, and Tinosch Ganjineh, which appeared in the Proceedings of the 5th International Conference on Automation, Robotics and Applications (ICARA 2011). © 2011 IEEE.

M. Schnürmacher, D. Göhring, M. Wang, T. Ganjineh: Freie Universität Berlin, Institut für Informatik, Arnimallee 7, 14195 Berlin, Germany. e-mail: [email protected], [email protected], [email protected], [email protected]

G. Sen Gupta et al. (eds.), Recent Advances in Robotics and Automation, Studies in Computational Intelligence 480, DOI: 10.1007/978-3-642-37387-9_17, © Springer-Verlag Berlin Heidelberg 2013
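The core idea of combining two sensors whose strengths complement each other can be illustrated in its simplest static form: fusing two independent Gaussian estimates of the same quantity by inverse-variance weighting, which is the stationary special case of the Kalman update. The function below is a minimal sketch under that assumption; the variances and measurements are illustrative numbers, not sensor specifications from this work.

```python
def fuse_estimates(z_radar, var_radar, z_lidar, var_lidar):
    """Fuse two independent Gaussian estimates of the same scalar
    (e.g. the longitudinal distance to the followed car) by
    inverse-variance weighting. Illustrative sketch only."""
    w_r = 1.0 / var_radar  # weight = inverse variance (precision)
    w_l = 1.0 / var_lidar
    fused = (w_r * z_radar + w_l * z_lidar) / (w_r + w_l)
    fused_var = 1.0 / (w_r + w_l)  # fused estimate is always at least
    return fused, fused_var        # as precise as the better sensor

# Example: radar range 52 m with variance 4, lidar range 50 m with
# variance 1 -- the fused estimate leans toward the more precise lidar.
pos, var = fuse_estimates(52.0, 4.0, 50.0, 1.0)
```

In the full system a recursive filter would propagate these estimates over time and also fuse the radar's precise range-rate with the lidar's precise position, but the weighting principle is the same.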