A Human-Machine Interface for Cooperative Highly Automated Driving




Abstract Cooperative perception of the traffic environment will enable Highly Automated Driving (HAD) functions to provide timelier and more complex Take-Over Requests (TOR) than is possible with vehicle-localized perception alone. Furthermore, cooperative perception will extend automated vehicles' capability of performing tactical and strategic maneuvers independently of any driver intervention (e.g., avoiding obstacles). In this paper, the resulting challenges for the design of the Human-Machine Interface (HMI) are discussed and a prototypical HMI is presented. The prototype is evaluated by experts from the field of cognitive ergonomics in a small-scale simulator study.

Keywords Automated driving interface · Human-systems integration · Human-machine

F. Naujoks (✉) · A. Neukum
Center for Traffic Sciences, University of Würzburg, Röntgenring 11, 97070 Würzburg, Germany
e-mail: [email protected]
A. Neukum
e-mail: [email protected]
Y. Forster · K. Wiedemann
Würzburg Institute for Traffic Sciences, Robert-Bosch-Straße 4, 97209 Veitshöchheim, Germany
e-mail: [email protected]
K. Wiedemann
e-mail: [email protected]
© Springer International Publishing Switzerland 2017
N.A. Stanton et al. (eds.), Advances in Human Aspects of Transportation, Advances in Intelligent Systems and Computing 484, DOI 10.1007/978-3-319-41682-3_49

1 Introduction

Highly automated driving (HAD) functions are expected to be ready for market introduction in the near future. By fusing vehicle-localized environment perception with information provided by other road users or infrastructure, so-called cooperative perception [1, 2], the capabilities of these systems can be greatly enhanced. For example, it may be possible to provide drivers with advance information about upcoming system limits to improve driving performance during take-over situations [3]. At the same time, cooperative perception will enable highly automated vehicles to handle some driving situations independently of any driver intervention by reacting strategically (e.g., changing lanes because of an upcoming lane merge) or tactically (e.g., avoiding obstacles on the road). Thus, the HAD function eventually becomes an autonomous agent, as it performs driving maneuvers such as lane changes (e.g., because of merging lanes) or adaptations of the host vehicle's speed (e.g., because of speed limits) independently. Humans tend to attribute rationality and intentionality to such autonomous agents, which may become "team members" [4]. Interacting with autonomous agents places new demands on the human operator, such as understanding the system's current behavior as well as its intentions [5, 6]. In view of these challenges, this paper presents the development and evaluation of an HMI for cooperative HAD. The focus of the paper is the visual component of the HMI, which could be shown on an in-vehicle display. The HMI design is based on a framework of shared control between driver and automation.
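To make the decision logic described above concrete, the following is a minimal illustrative sketch, not taken from the paper: a hypothetical cooperative HAD function receives an event via cooperative perception and decides whether to execute a maneuver autonomously (and announce it via the HMI) or to issue an early or urgent Take-Over Request. The event types, the set of autonomously handled maneuvers, and the time-budget threshold are all assumptions made for illustration.

```python
# Illustrative sketch (hypothetical names and thresholds, not from the paper):
# decide whether a cooperatively perceived event is handled autonomously
# or triggers a Take-Over Request (TOR) with an early or urgent urgency level.
from dataclasses import dataclass


@dataclass
class PerceivedEvent:
    kind: str             # e.g., "lane_merge", "obstacle", "system_limit"
    distance_m: float     # distance from the host vehicle to the event
    ego_speed_mps: float  # current speed of the host vehicle


# Assumed set of situations the HAD function can resolve without the driver.
AUTONOMOUS_MANEUVERS = {"lane_merge", "obstacle"}

# Assumed minimum time budget (seconds) for a non-urgent, early TOR.
MIN_TOR_BUDGET_S = 7.0


def decide(event: PerceivedEvent) -> str:
    """Return the HMI/automation action for a cooperatively perceived event."""
    # Simple time budget: time until the vehicle reaches the event.
    time_budget_s = event.distance_m / max(event.ego_speed_mps, 0.1)
    if event.kind in AUTONOMOUS_MANEUVERS:
        # The function acts as an autonomous agent: it performs the maneuver
        # itself and announces its intention on the in-vehicle display.
        return "announce_and_execute_maneuver"
    if time_budget_s >= MIN_TOR_BUDGET_S:
        return "issue_early_TOR"
    return "issue_urgent_TOR"
```

With this sketch, a lane merge 500 m ahead would be announced and handled autonomously, while a system limit 500 m ahead at 30 m/s (about a 16.7 s budget) would produce an early TOR, and the same limit only 100 m ahead would produce an urgent TOR. The point of the sketch is the split the paper motivates: cooperative perception buys the time budget that separates early announcements from urgent requests.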