Trust in Robots: Challenges and Opportunities

SERVICE AND INTERACTIVE ROBOTICS (A TAPUS, SECTION EDITOR)

Bing Cai Kok 1 & Harold Soh 1

1 Dept. of Computer Science, School of Computing, National University of Singapore, 13 Computing Drive, Singapore 119077, Singapore

© Springer Nature Switzerland AG 2020

Abstract

Purpose of Review To assess the state of the art in research on trust in robots and to examine whether recent methodological advances can aid in the development of trustworthy robots.

Recent Findings While traditional work in trustworthy robotics has focused on studying the antecedents and consequences of trust in robots, recent work has gravitated towards developing strategies for robots to actively gain, calibrate, and maintain the human user's trust. Among these works, there is an emphasis on endowing robotic agents with reasoning capabilities (e.g., via probabilistic modeling).

Summary The state of the art in trust research provides roboticists with a large trove of tools for developing trustworthy robots. However, challenges remain for trust in real-world human-robot interaction (HRI) settings: there are outstanding issues in trust measurement, in guarantees on robot behavior (e.g., with respect to user privacy), and in handling rich multidimensional data. We examine how recent advances in psychometrics, trustworthy systems, robot ethics, and deep learning can help resolve each of these issues. In conclusion, we are of the opinion that these methodological advances could pave the way for truly autonomous, trustworthy social robots.

Keywords Trust · Human-robot interaction · Probabilistic models · Measurement · Formal methods
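
To make the idea of probabilistic trust modeling concrete, the following is a minimal illustrative sketch in Python; it is not drawn from any of the works reviewed here, and the class name, priors, and toy interaction history are assumptions for illustration only. The sketch maintains a beta-Bernoulli posterior over a user's propensity to rely on the robot and updates it after each interaction:

from dataclasses import dataclass


@dataclass
class BetaTrustModel:
    # Uniform Beta(1, 1) prior over the user's propensity to rely on the robot.
    alpha: float = 1.0  # pseudo-count of observed reliance
    beta: float = 1.0   # pseudo-count of observed disuse

    def update(self, relied_on_robot: bool) -> None:
        # Bayesian update after observing whether the user relied on the robot.
        if relied_on_robot:
            self.alpha += 1.0
        else:
            self.beta += 1.0

    def mean_trust(self) -> float:
        # Posterior mean estimate of the user's propensity to rely on the robot.
        return self.alpha / (self.alpha + self.beta)


if __name__ == "__main__":
    model = BetaTrustModel()
    for outcome in [True, True, False, True]:  # toy interaction history
        model.update(outcome)
    print(f"Estimated trust: {model.mean_trust():.2f}")  # 0.67 under the uniform prior

A robot equipped with such an estimate could, for instance, offer additional explanations or behave more conservatively when estimated trust is low; the richer models surveyed in this review reason over latent trust jointly with task state rather than over reliance counts alone.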

Introduction

On July 2, 1994, USAir Flight 1016 was scheduled to land at Douglas International Airport in Charlotte, NC. Upon nearing the airport, the plane ran into inclement weather and was affected by wind shear (a sudden change in wind velocity that can destabilize an aircraft). On the ground, a wind shear alert system installed at the airport issued a total of three warnings to the air traffic controller. But owing to a lack of trust in the alert system, the air traffic controller transmitted only one of the alarms, and that one was, unfortunately, never received by the plane. Unaware of the presence of wind shear, the aircrew failed to react appropriately and the plane crashed, killing 37 people [1] (see Fig. 1). This tragedy vividly brings into focus the critical role of trust in automation (and, by extension, robots): a lack of trust can lead to disuse, with potentially dire consequences. Had the air traffic controller trusted the alert system and transmitted all three warnings, the tragedy might have been averted.

Human-robot trust is crucial in today's world, where modern social robots are increasingly being deployed. In healthcare, robots are used for patient rehabilitation [2] and to provide f