Reducing Visual Dependency with Surface Haptic Touchscreens
Abstract. Interactions with current touchscreens are highly dependent on a pattern of visual feedback. Recently, researchers have developed surface haptic technology that provides haptic feedback on flat touchscreens. This presents an opportunity to add tactile res…
1 Introduction
The iPhone’s success has resulted in the widespread application of touchscreens in many devices. Touchscreens are now a fundamental part of the tools that facilitate daily tasks such as e-mail, navigation and web browsing. Compared to traditional input devices such as keyboards and mice, touchscreens provide a more direct way to interact with the digital world, co-locating touch input with graphic display.

However, one drawback of touch interactions in most current devices is that they are still ‘two-dimensional’ – the interface feels like flat glass. In other words, visual output is responsive (active) while tactile output is passive, i.e. the screen produces no tactile feedback. In terms of usability, interaction is still highly dependent on a pattern of visual feedback: a user first looks at the content displayed on the screen, then moves their finger to the target based on the visual cue, and finally confirms the target selection based on the subsequent animation. One potential drawback of relying on visual feedback is that users will find it hard to complete tasks in eyes-busy environments, such as when driving, running or in conditions of sun glare.

Recently, researchers have developed surface haptic displays which introduce electroadhesion to produce an adhesive friction force on the finger [12]. By dynamically adjusting the adhesive force, the touchscreen can provide the feeling of different surface textures in response to finger movements.

(Y.-J. Lin and S. O’Modhrain. In: F. Bello et al. (Eds.): EuroHaptics 2016, Part II, LNCS 9775, pp. 263–272. © Springer International Publishing Switzerland 2016. DOI: 10.1007/978-3-319-42324-1_26)

In this paper, we examine
how adding this surface haptic feedback to a bullseye menu can improve the performance of touchscreen interaction when visual feedback is reduced or unavailable. Our results show that, in such conditions, haptic feedback improves both accuracy and task completion time.
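The friction modulation described above can be sketched as a texture function: the display samples the finger's position and looks up a target friction level, which the electroadhesive actuation then renders. The following minimal sketch renders virtual ridges; the function name, parameter values and the normalized friction scale are illustrative assumptions, not the paper's parameters or any device's API.

```python
def virtual_ridges(x_mm, period_mm=8.0, low=0.2, high=1.0):
    """Illustrative texture function for an electroadhesive display:
    return a normalized friction level that alternates between low and
    high as the finger crosses virtual ridges spaced period_mm apart.
    All parameter values are assumptions for this sketch.
    """
    # Even-numbered ridge bands feel "sticky" (high friction),
    # odd-numbered bands feel slippery (low friction).
    return high if int(x_mm // period_mm) % 2 == 0 else low
```

In a real device this lookup would run inside the haptic controller's update loop, at a rate fast enough to track finger velocity.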
2 Background

2.1 The Bullseye Menu for Non-Visual Interaction
A bullseye menu is essentially a pie menu modified to permit selection from a larger number of options (Fig. 1). It is a series of concentric circles divided into sectors. The region between two concentric circles is called a ring, and each sector of a ring represents a different menu item. The user touches the touchscreen to trigger the menu, and the center of the menu aligns to the user’s finger or pointer, so that the starting point is always the center of the menu. The direction and distance in which a user moves a pointing device (in our case, their finger) from this floating origin determine the selected menu item. For example, if a user operates the menu of a music player, they could select the upward sector of the inner ring to pause, and the rightward sector of the outer ring to shuffle (Fig. 1).

Previous studies suggest that a bullseye menu might serve as a good non-visual menu paradigm for two reasons [3]. First, the menu item is always drawn relative t…
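The selection rule above — direction and distance from a floating origin determine the item — can be sketched as a small hit-test. The ring radii and sector counts below are illustrative assumptions, not the paper's parameters, and coordinates assume y increases upward.

```python
import math

def bullseye_hit(dx, dy, ring_radii=(60, 120), sectors_per_ring=(4, 8)):
    """Map a finger displacement (dx, dy) from the menu's floating center
    to a (ring, sector) item, or None if the touch is beyond the menu.

    ring_radii gives the outer radius of each ring (innermost first);
    sectors_per_ring gives how many sectors each ring is divided into.
    """
    r = math.hypot(dx, dy)
    inner = 0.0
    for ring, outer in enumerate(ring_radii):
        if inner <= r < outer:
            # Angle measured clockwise from straight up, in [0, 2*pi),
            # so sector 0 is the upward sector of each ring.
            theta = math.atan2(dx, dy) % (2 * math.pi)
            n = sectors_per_ring[ring]
            sector = int(theta / (2 * math.pi / n)) % n
            return ring, sector
        inner = outer
    return None  # beyond the outermost ring
```

With these assumed parameters, the paper's music-player example maps an upward drag within the inner ring to item (0, 0) (pause) and a rightward drag into the outer ring to its rightward sector (shuffle).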