ORIGINAL PAPER
Automated cars meet human drivers: responsible human-robot coordination and the ethics of mixed traffic

Sven Nyholm1 · Jilles Smids1

1 Eindhoven University of Technology, Eindhoven, The Netherlands
© The Author(s) 2018. This article is an open access publication
Abstract

In this paper, we discuss the ethics of automated driving. More specifically, we discuss responsible human-robot coordination within mixed traffic: i.e. traffic involving both automated cars and conventional human-driven cars. We do three main things. First, we explain key differences in robotic and human agency and expectation-forming mechanisms that are likely to give rise to compatibility-problems in mixed traffic, which may lead to crashes and accidents. Second, we identify three possible solution-strategies for achieving better human-robot coordination within mixed traffic. Third, we identify important ethical challenges raised by each of these three possible strategies for achieving optimized human-robot coordination in this domain. Among other things, we argue that we should not just explore ways of making robotic driving more like human driving. Rather, we ought also to take seriously potential ways (e.g. technological means) of making human driving more like robotic driving. Nor should we assume that complete automation is always the ideal to aim for; in some traffic-situations, the best results may be achieved through human-robot collaboration. Ultimately, our main aim in this paper is to argue that the new field of the ethics of automated driving needs to take seriously the ethics of mixed traffic and responsible human-robot coordination.

Keywords Human-robot coordination · Automated driving · Ethics · Responsible robotics · Agency
Introduction

Before 2015, discussions of crashes involving automated vehicles were largely hypothetical. However, with increased road-testing of automated vehicles, real-world crashes soon started happening, with just under 20 cases in 2015. The initial crashes were primarily instances of conventional cars rear-ending slow-moving automated vehicles. And there was little damage done (Schoettle and Sivak 2015a). However, in 2016 there were some more dramatic developments. On Valentine’s day (February 14), there was a not very romantic encounter between a “self-driving” Google-car and a bus. The former crashed into the latter. And on this occasion, Google had to assume responsibility for the collision, which was the first time that happened (Urmson 2016). More tragically, the first person was killed in a crash with a vehicle operating in automated mode in May. A Tesla Model S in “autopilot” mode collided with a truck that the car’s sensors had not detected (Tesla 2016). What all these crashes so far—both those in 2015 and 2016—have in common is that they were collisions between automated cars and conventional cars. They were crashes in “mixed traffic.”

This paper is a contribution to the new field of the ethics of automated driving (e.g. Goodall 2014a, b; Lin 2