Shaping Trust Through Transparent Design: Theoretical and Experimental Guidelines
Abstract: The current research discusses transparency as a means of enabling trust in automated systems. Commercial pilots (N = 13) interacted with an automated aid for emergency landings. The automated aid provided decision support during a complex task where pilots […]
J.B. Lyons, Air Force Research Laboratory, Wright-Patterson AFB, Dayton, OH 45433, USA. e-mail: [email protected]
G.G. Sadler, K. Koltai, H. Battiste, N.T. Ho, and L.C. Hoffmann, NVH Human Systems Integration, Canoga Park, Los Angeles, CA, USA. e-mail: [email protected], [email protected], [email protected], [email protected], [email protected]
D. Smith, W. Johnson, and R. Shively, NASA Ames Research Center, Moffett Field, CA, USA. e-mail: [email protected], [email protected], [email protected]
© Springer International Publishing Switzerland 2017. P. Savage-Knepshield and J. Chen (eds.), Advances in Human Factors in Robots and Unmanned Systems, Advances in Intelligent Systems and Computing 499, DOI 10.1007/978-3-319-41959-6_11
[…] consistent with prior studies in this area. Implications for design are discussed in terms of promoting understanding of the rationale for automated recommendations.

Keywords: Trust · Transparency · Automation
1 Introduction

Advanced technology holds great promise for improving task performance across a variety of domains. Yet advances in technologies such as automation, while beneficial to performance in stable (high-reliability) states, can have detrimental effects when they fail [1]. One paradoxical reason why automation failures can be devastating is that humans may form inappropriate reliance strategies when working with automation [2, 3]. Thus, trust in automation has emerged as an important topic for human factors researchers [4, 5]. Trust is a critical process to understand because it has implications for reliance behavior, i.e., using or "relying" on a system when that reliance matters most.

The trust process as it relates to automation is complex because the factors that influence trust range from human-centric factors, such as dispositional influences (e.g., a predisposition to trust) and experiential influences (learned aspects of trust), to situational features [see 5 for a recent review]. Failure to establish appropriate trust can result in performance errors: a human may over-trust technology by placing unwarranted reliance on it, or under-trust technology by failing to use it when that reliance is warranted. One key task for researchers is to identify the set of variables that influence the trust process and to provide humans with the information needed to support appropriate reliance decisions.

The current paper examines one such variable, transparency, and its influence on the trust process by presenting experimental data on different transparency manipulations in a high-fidelity, immersive commercial-aviation task environment involving automation support to a pilot. Transparency represents a method for establishing shared awareness and shared intent between humans and machines [6]. Transparency is essentially a