Operator-Autonomy Teaming Interfaces to Support Multi-Unmanned Vehicle Missions

Abstract

Advances in automation technology are leading to the development of operational concepts in which a single operator teams with multiple autonomous vehicles. This requires the design and evaluation of interfaces that support operator-autonomy collaborations. This paper describes interfaces designed to support a base defense mission performed by a human operator and heterogeneous unmanned vehicles. Flexible operator-autonomy teamwork is facilitated with interfaces that highlight the tradeoffs of autonomy-generated plans, support allocation of assets to tasks, and communicate mission progress. The interfaces include glyphs and video gaming-type icons that present information in a concise, integrated manner, and multi-modal controls that augment an adaptable architecture to enable seamless transitions across control levels, from manual to fully autonomous. Examples of prototype displays and controls are provided, as well as usability data collected from multi-task simulation evaluations.
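The abstract's notion of seamless transitions across control levels can be made concrete with a brief sketch. The level names and the small VehicleControl class below are illustrative assumptions only, not the paper's or the IMPACT system's actual design:

```python
# Illustrative sketch only: level names and API are assumptions,
# not the IMPACT system's actual control architecture.
from enum import IntEnum


class ControlLevel(IntEnum):
    MANUAL = 1            # operator steers the vehicle directly
    WAYPOINT = 2          # operator assigns waypoints; autonomy flies them
    PLAY_CALLING = 3      # operator calls high-level "plays" (tasks)
    FULLY_AUTONOMOUS = 4  # autonomy selects and executes tasks itself


class VehicleControl:
    """Tracks the active control level for one unmanned vehicle."""

    def __init__(self, vehicle_id: str,
                 level: ControlLevel = ControlLevel.PLAY_CALLING):
        self.vehicle_id = vehicle_id
        self.level = level

    def set_level(self, new_level: ControlLevel) -> None:
        # A "seamless" transition here simply means any level can be
        # entered directly, whether raising or lowering autonomy.
        old = self.level
        self.level = new_level
        print(f"{self.vehicle_id}: control level {old.name} -> {new_level.name}")


if __name__ == "__main__":
    uav = VehicleControl("UAV-1")
    uav.set_level(ControlLevel.MANUAL)            # operator takes direct control
    uav.set_level(ControlLevel.FULLY_AUTONOMOUS)  # hands the task back to autonomy
```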



Keywords Flexible automation · Adaptable automation · Autonomous vehicles · Operator-autonomy interfaces · Display symbology · Human factors · Unmanned systems · Multi-modal control

G.L. Calhoun (✉)
Air Force Research Laboratory, 711 HPW/RHCI, Dayton, OH, USA
e-mail: [email protected]

H.A. Ruff · K.J. Behymer
Infoscitex, Dayton, OH, USA
e-mail: [email protected]

K.J. Behymer
e-mail: [email protected]

E.M. Mersch
Wright State Research Institute, Dayton, OH, USA
e-mail: [email protected]

© Springer International Publishing Switzerland 2017
P. Savage-Knepshield and J. Chen (eds.), Advances in Human Factors in Robots and Unmanned Systems, Advances in Intelligent Systems and Computing 499, DOI 10.1007/978-3-319-41959-6_10


1 Introduction

Agility in tactical decision-making, mission management, and control is the key attribute for enabling human and heterogeneous unmanned vehicle (UV) teams to successfully manage the “fog of war” with its inherent complex, ambiguous, and time-pressured conditions. In support of an Assistant Secretary of Defense for Research and Engineering (ASD(R&E)) Autonomy Research Pilot Initiative (ARPI), a tri-service team led by the Air Force Research Laboratory (AFRL) is developing and evaluating an “Intelligent Multi-UV Planner with Adaptive Collaborative/Control Technologies” (IMPACT) system. Additionally, AFRL leads development of a new interface paradigm by which the operator teams with autonomous technologies. This effort involves designing intuitive human-autonomy interfaces that will enable: (a) operators to monitor and instruct autonomy in response to dynamic environments and missions, and (b) the autonomy to make suggestions to the operator and provide rationale for generated plans. Facilitating operator-autonomy interaction is a key challenge for achieving trusted, bi-directional collaboration.

To guide the design of human-autonomy interfaces for multi-UV missions, defense mission scenarios for a military base were g
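As a rough illustration of the bi-directional collaboration targeted by goals (a) and (b), the sketch below pairs an autonomy-to-operator plan suggestion (carrying rationale and tradeoffs) with an operator-to-autonomy instruction. All message fields, names, and the review helper are hypothetical, not drawn from the IMPACT architecture:

```python
# Hypothetical sketch of the bi-directional operator-autonomy exchange
# described above; message fields and names are assumptions for illustration.
from dataclasses import dataclass, field
from typing import List


@dataclass
class PlanSuggestion:
    """Autonomy -> operator: a candidate plan plus the reasoning behind it."""
    plan_id: str
    assigned_vehicles: List[str]
    rationale: str                                       # why this plan is proposed
    tradeoffs: List[str] = field(default_factory=list)   # e.g., fuel vs. response time


@dataclass
class OperatorInstruction:
    """Operator -> autonomy: accept, reject, or amend a suggested plan."""
    plan_id: str
    decision: str                                        # "approve" | "reject" | "modify"
    note: str = ""                                       # optional guidance for re-planning


def review(suggestion: PlanSuggestion, approve: bool, note: str = "") -> OperatorInstruction:
    """Minimal operator-side handler for an autonomy-generated suggestion."""
    decision = "approve" if approve else "modify"
    return OperatorInstruction(suggestion.plan_id, decision, note)


if __name__ == "__main__":
    s = PlanSuggestion(
        plan_id="inspect-gate-3",
        assigned_vehicles=["UGV-2", "UAV-1"],
        rationale="UGV-2 is closest; UAV-1 provides overwatch en route.",
        tradeoffs=["UAV-1 fuel margin drops to 20 min"],
    )
    print(review(s, approve=False, note="Keep UAV-1 on patrol; use UAV-3 instead."))
```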