Use Your Brain (and Light) for Innovative Human-Machine Interfaces


Abstract The human-machine interface (HMI) system of a vehicle is made up of a number of input and output devices that work in harmony to give the driver access to its features and functions. The HMI of motor vehicles evolved slowly until the late 1990s, when the first automation systems (e.g., Cruise Control) and touch screens were introduced. Since then, the number of technologies being introduced into the car, especially over the last few years, has sky-rocketed. Of these technologies, none can be considered more challenging than the move towards vehicle automation. Given this push, new ways of interacting with the vehicle will be needed. In this paper we present two innovative HMI techniques that will be key to addressing this challenge, namely brain-computer interfaces (BCIs) and ambient display technology, both aimed at making driver (or passenger) interaction less demanding and more intuitive.





Keywords Human-machine interface · Automotive · Brain-computer interfaces · Ambient displays · User interfaces · Autonomous vehicles · Human factors









F. Biondi (✉) · L. Skrypchuk
Jaguar Land Rover Limited, International Digital Laboratory, University of Warwick, University Road, Coventry, UK
e-mail: [email protected]
L. Skrypchuk
e-mail: [email protected]

© Springer International Publishing Switzerland 2017
I.L. Nunes (ed.), Advances in Human Factors and System Interactions, Advances in Intelligent Systems and Computing 497, DOI 10.1007/978-3-319-41956-5_10

1 Introduction

Since the late 1800s, when motor vehicles were first introduced, human operators have interacted with them by means of pedals, a steering wheel, buttons and dials. Since the late 1990s, and over the last five years in particular, the automotive industry and the driving research community have witnessed exponential growth in the number of technologies that, over time, have become part of the Human-Machine Interface (HMI) of many vehicles on the market. The introduction
of touch screens for in-vehicle infotainment systems [1, 2], voice-recognition interfaces, Apple CarPlay© [3], Android Auto© [4] and gesture command systems [5] represents only a few of the many innovations that are part of the current in-vehicle driving experience. Autonomous, self-driving vehicles are fast becoming a reality. The National Highway Traffic Safety Administration (NHTSA), along with other international institutions in the transportation domain (e.g., the Society of Automotive Engineers), has created taxonomies for autonomous vehicles [6, 7]. In this progression towards autonomy, many automotive Original Equipment Manufacturers (OEMs) have made public their intention to release NHTSA level-2 autonomous vehicles within the next three years and, along with other consumer-electronics companies, NHTSA level-4 fully-autonomous vehicles by 2025. There are even NHTSA level-2 vehicles already available on the public market. In 2015, Tesla Motors© released its Tesla Autopilot© s