REVIEW ARTICLE

Comprehensive review on brain-controlled mobile robots and robotic arms based on electroencephalography signals

Majid Aljalal¹ · Sutrisno Ibrahim² · Ridha Djemal¹ · Wonsuk Ko¹

Received: 26 October 2018 / Accepted: 27 May 2020
© Springer-Verlag GmbH Germany, part of Springer Nature 2020

Abstract
Significant progress has been made in recent years in the development of brain-controlled mobile robots and robotic arms. New advances in electroencephalography (EEG) technology have made it possible to control external devices, such as robots, directly via the brain. The development of brain-controlled robotic devices has allowed people with physical disabilities to improve their mobility, independence, and ability to take part in many types of activity. This paper provides a comprehensive review of EEG signal processing for robot control, covering both mobile robots and robotic arms, with an emphasis on noninvasive brain–computer interface systems. Various filtering approaches, feature extraction techniques, and machine learning algorithms for EEG classification are discussed and summarized. Finally, the types of robots used and the conditions of the environments in which they operate are also discussed.

Keywords Brain–computer interface (BCI) · Brain-controlled robotic systems · EEG · ERD/ERS · Intelligent system · P300 · SSVEP

Majid Aljalal (corresponding author)
[email protected]

Sutrisno Ibrahim
[email protected]

Ridha Djemal
[email protected]

Wonsuk Ko
[email protected]

¹ Electrical Engineering Department, College of Engineering, King Saud University, Riyadh, Saudi Arabia

² Electrical Engineering Department, College of Engineering, Sebelas Maret University, Surakarta, Indonesia

1 Introduction

In addition to their use in industry, robots are increasingly used in daily human life, especially as assistants for people with disabilities. A healthy person can control a robot with the help of an input device, such as a mouse or a keyboard. However, these input interfaces are not practical for people with physical disabilities, such as patients with multiple sclerosis (MS) or amyotrophic lateral sclerosis (ALS). In most cases, these patients cannot walk, use their hands and arms, or even speak, and thus cannot easily convey their thoughts or intended actions to robots through such conventional interfaces. The development of brain-controlled robots, which can be commanded directly from the brain, would be very useful in such cases. For this purpose, a brain–computer interface (BCI) system can provide an alternative means of interaction between the human brain and external devices such as a robot [1]. BCI systems can in general be classified into two types according to how the brain signal is captured: invasive and noninvasive [2]. In invasive BCIs, brain signals are captured inside the brain (using electrodes implanted under the skull), whereas in noninvasive BCIs, signals are captured from the scalp, outside the head. The signals captured via an invasive BCI are stronger; however, this approach requires surgery [3]. For this reason, noninvasive BCIs are more commonly used.
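To make this noninvasive processing chain more concrete, the following minimal sketch (illustrative only, not taken from this paper or from any specific study it reviews) shows the filter–feature–classifier structure shared by such systems: band-pass filtering in the mu/beta band, log-variance band-power features as a crude proxy for ERD/ERS, and a linear discriminant analysis (LDA) classifier. The sampling rate, band edges, array shapes, and synthetic data are all assumptions made for demonstration.

```python
# Illustrative sketch only (not from this paper): a minimal EEG classification
# chain -- band-pass filtering, log-variance band-power features, and an LDA
# classifier. The sampling rate, band edges, shapes, and synthetic data are
# assumptions chosen purely for demonstration.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

FS = 250  # assumed EEG sampling rate in Hz

def bandpass(trials, lo=8.0, hi=30.0, order=4):
    """Zero-phase band-pass filter (mu/beta band) along the time axis."""
    b, a = butter(order, [lo, hi], btype="band", fs=FS)
    return filtfilt(b, a, trials, axis=-1)

def log_bandpower(trials):
    """Log-variance of each channel after filtering: a crude ERD/ERS-style feature."""
    return np.log(bandpass(trials).var(axis=-1))  # shape: (n_trials, n_channels)

# Synthetic stand-in for recorded EEG epochs: 100 trials, 8 channels, 2 s each
rng = np.random.default_rng(seed=0)
X = rng.standard_normal((100, 8, 2 * FS))
y = rng.integers(0, 2, size=100)  # e.g. left- vs right-hand motor imagery labels

clf = LinearDiscriminantAnalysis().fit(log_bandpower(X), y)
print("Training accuracy:", clf.score(log_bandpower(X), y))
```

The approaches surveyed in this review substitute more elaborate choices at each stage (for example, spatial filtering, wavelet or autoregressive features, and SVM or neural-network classifiers), but they retain this basic filter–feature–classifier structure.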