Optical Self Diagnostics for Camera Based Driver Assistance




Abstract An increasing number of vehicles are equipped with cameras. As perception sensors, they scan the surroundings and supply Advanced Driver Assistance Systems (ADAS), which build up an environmental model through the use of computer vision techniques. While the cameras perform well under good weather conditions, their efficiency suffers under adverse environmental influences such as rain, fog and occlusion through dirt. As a consequence, the vision-based ADAS receives poor-quality information and the environmental model becomes faulty. This paper deals with methods to estimate the information quality of cameras in order to warn the assistance system of possibly degraded operating conditions. In particular, soiling or occlusion of the windshield or camera lens, as well as foggy weather, are taken into account. For occlusion, total, partial and transparent occlusions have to be recognized and distinguished; this paper therefore proposes an approach based on edge analysis of consecutive frames and presents initial experimental results of its implementation. For fog detection, a method based on the Logarithmic Image Processing (LIP) model is described and its results are shown.

Keywords Optical self-diagnostics · Camera · Fog detection · Occlusion
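The occlusion-detection idea summarized above, analyzing edges over consecutive frames, can be illustrated with a minimal sketch: in a moving vehicle, unoccluded image regions show constantly changing edge content, while a dirt spot or sticker suppresses edges in the same place in every frame. The grid size, gradient threshold and density threshold below are illustrative assumptions, not values from the paper, and the simple gradient-magnitude edge map stands in for whatever edge detector the authors actually use.

```python
import numpy as np

def edge_map(frame, thresh=30.0):
    # Simple gradient-magnitude edge map (a stand-in for e.g. Sobel/Canny).
    gy, gx = np.gradient(frame.astype(float))
    return np.hypot(gx, gy) > thresh

def occlusion_mask(frames, grid=(4, 4), min_density=0.01):
    """Flag grid cells whose edge density stays low across *all* frames.

    A cell that never reaches min_density edge pixels in any frame is a
    candidate occluded region (total or opaque partial occlusion).
    """
    rows, cols = grid
    h, w = frames[0].shape
    gh, gw = h // rows, w // cols
    flagged = np.ones(grid, dtype=bool)
    for frame in frames:
        edges = edge_map(frame)
        for i in range(rows):
            for j in range(cols):
                cell = edges[i * gh:(i + 1) * gh, j * gw:(j + 1) * gw]
                if cell.mean() >= min_density:
                    # Enough edge activity in this frame: cell is not occluded.
                    flagged[i, j] = False
    return flagged
```

A transparent occlusion (water film, semi-translucent dirt) would weaken rather than remove edges, so distinguishing the total, partial and transparent cases the paper names would additionally require comparing edge strengths, not just their presence.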

F2012-F07-008

H. Tadjine · D. Anoushirvan · D. Eugen · K. Schulze, IAV GmbH, Berlin, Germany

SAE-China and FISITA (eds.), Proceedings of the FISITA 2012 World Automotive Congress, Lecture Notes in Electrical Engineering 197, DOI: 10.1007/978-3-642-33805-2_40, © Springer-Verlag Berlin Heidelberg 2013


1 Introduction

Through the use of camera-based driver assistance systems, versatile possibilities have evolved to support the driver. Both in-vehicle multipurpose cameras and exterior fish-eye parking cameras help to extend the driver's perception in various situations: detecting pedestrians, road signs and other vehicles, or generating a bird's-eye view, are just a few examples. Like human visual perception, a camera's ability to see is constrained by visibility conditions. Unfortunately, in the real world there are plenty of situations and influences which can degrade camera sight, such as adverse weather conditions, lens occlusion through ice or dirt, blurring effects through water or translucent occlusions, and cracks in the lens or windshield. Furthermore, there can be other failures caused by sensor defects or disruptions during data transfer.

To avoid malfunction of the ADAS, it is therefore necessary to detect all kinds of failures and degradations of image quality and to warn or deactivate the affected modules. The ADAS can then be transferred to a fail-safe state, which increases vehicle safety. For this reason, IAV is working on a self-diagnostics module which provides camera-based ADAS with extensive image-quality information regarding the current conditions.

In the field of image degradation due to bad weather, many approaches for detection or restoration have been introduced. Grmer et al. [1–3] dea
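The fog-detection method named in the abstract builds on the Logarithmic Image Processing (LIP) model, in which gray tones live in the range [0, M) and arithmetic is redefined to stay in that range; in particular, LIP subtraction is f ⊖ g = M(f − g)/(M − g). A common use of this model is measuring contrast between neighboring pixels, which fog strongly reduces. The sketch below is a minimal illustration of that idea only: the horizontal-neighbor contrast measure and the threshold value are assumptions made here, not the paper's actual criterion.

```python
import numpy as np

M = 256.0  # upper bound of the LIP gray-tone range

def lip_sub(f, g):
    # LIP subtraction: f ⊖ g = M(f − g)/(M − g), defined for g < M.
    return M * (f - g) / (M - g)

def mean_lip_contrast(img):
    """Mean LIP contrast between horizontally adjacent pixel pairs."""
    a = img[:, :-1].astype(float)
    b = img[:, 1:].astype(float)
    hi = np.maximum(a, b)
    lo = np.minimum(a, b)
    return float(np.mean(lip_sub(hi, lo)))

def looks_foggy(img, thresh=5.0):
    # thresh is an illustrative value chosen here, not taken from the paper.
    return mean_lip_contrast(img) < thresh
```

Because LIP subtraction divides by M − g, the same absolute gray-level difference yields a larger contrast near the bright end of the range, which roughly matches how fog washes scenes toward uniform brightness.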