Plug-and-play supervisory control using muscle and brain signals for real-time gesture and error detection



Joseph DelPreto1 · Andres F. Salazar-Gomez1,2 · Stephanie Gil1,3 · Ramin Hasani1,4 · Frank H. Guenther5 · Daniela Rus1

Received: 5 December 2018 / Accepted: 9 June 2020 © The Author(s) 2020

Abstract
Effective human supervision of robots can be key for ensuring correct robot operation in a variety of potentially safety-critical scenarios. This paper takes a step towards fast and reliable human intervention in supervisory control tasks by combining two streams of human biosignals: muscle and brain activity acquired via EMG and EEG, respectively. It presents continuous classification of left and right hand gestures using muscle signals, time-locked classification of error-related potentials using brain signals (unconsciously produced when observing an error), and a framework that combines these pipelines to detect and correct robot mistakes during multiple-choice tasks. The resulting hybrid system is evaluated in a "plug-and-play" fashion with 7 untrained subjects supervising an autonomous robot performing a target selection task. Offline analysis further explores the EMG classification performance, and investigates methods to select subsets of training data that may facilitate generalizable plug-and-play classifiers.

Keywords Human–robot interaction · EMG control · EEG control · Hybrid control · Gesture detection · Error-related potentials · Plug-and-play supervisory control

1 Introduction

As robots become more prevalent in homes, factories, and other safety-critical settings, detecting and correcting robot errors becomes increasingly important. A fast, reliable, and intuitive framework for supervising robots could help avoid errors that would otherwise lead to costly hardware damage or safety risks. If a robot could be taught to detect nonverbal cues such as distress signals and hand gestures as reliably as a collaborating human partner, then interactions with robots would become more efficient and supervision or collaboration would become more effective. Using biosignals such as muscle or brain activity via electromyography (EMG) or electroencephalography (EEG),

This is one of several papers published in Autonomous Robots comprising the Special Issue on Robotics: Science and Systems.

This work was funded in part by the Boeing Company, for which the authors express gratitude.

Electronic supplementary material The online version of this article (https://doi.org/10.1007/s10514-020-09916-x) contains supplementary material, which is available to authorized users.

✉ Daniela Rus
  [email protected]

Andres F. Salazar-Gomez
  [email protected]

Stephanie Gil
  [email protected]

Ramin Hasani
  ramin.hasani@tu

Frank H. Guenther
  [email protected]

1 Massachusetts Institute of Technology, Distributed Robotics Lab, Cambridge, MA 02139, USA

2 Present Address: Massachusetts Institute of Technology, Open Learning, Cambridge, MA 02139, USA

3 Present Address: Harvard University, REACT Lab, Cambridge, MA 02138, USA