A3D: A Device for Studying Gaze in 3D
Abstract. A wearable device for capturing 3D gaze information in indoor and outdoor environments is proposed. The hardware and software architecture of the device provides a quasi-real-time estimate of 2.5D points of regard (PORs) and then lifts these estimates to 3D by projecting them into the 3D reconstructed scene. The estimation procedure requires no external device and can be used both indoors and outdoors while the wearer is moving, though some smoothness constraints on the motion are required. To ensure great flexibility with respect to depth, a novel calibration method is proposed, which provides an eye-scene calibration that explicitly takes depth information into account, thus ensuring a fairly accurate estimation of the PORs. The experimental evaluation demonstrates that both the 2.5D and the 3D PORs are accurately estimated.

Keywords: Wearable device · 3D gaze estimation · Point of regard in 3D scene

1 Introduction
Eye tracking has developed in the context of studying human visual selection mechanisms and attention (see [1–3] for a review of eye detection and gaze tracking in video-oculography). It is indeed well known that the points toward which humans direct their gaze are crucial for studying human perception and the ability to select regions of interest out of a massive amount of visual information [4]. In the last few years the use of head-mounted eye tracking has spread to several research areas such as driving [5,6], learning [7], marketing [8], training [9], cultural heritage [10] and, prevalently, human-computer interfaces [11,12], to cite just a few of an increasing number of applications where gaze direction is studied. All these applications highlight the need to move beyond prior models of computational attention and saliency [13–16] toward a deeper experimental analysis of gaze direction and eye-head motion, collecting data to better understand the relation between the point of regard (POR) and visual behavior [17], as well as strategies of search [18] and detection [19] in natural scenes. However, only quite recently have models for head-mounted eye tracking been extended, first to include head motion tracking [20,21] and then to 3D, so as to be employed in real-life experiments in unstructured settings [22–25].

© Springer International Publishing Switzerland 2016. G. Hua and H. Jégou (Eds.): ECCV 2016 Workshops, Part I, LNCS 9913, pp. 572–588, 2016. DOI: 10.1007/978-3-319-46604-0_41
[Fig. 1 schema: Pupil Tracking → Eye-Target Calibration → Disparity Map and Filtering → 2.5D POR Estimation → POR Correction → 3D POR Projection, fed by 3D Scene Reconstruction]
Fig. 1. On the left: a schema of the methods involved for projecting the PORs in the 3D reconstructed scene. On the right: the head-mounted eye-tracker with all its components.
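The lifting step sketched in Fig. 1 — turning a 2.5D POR (a pixel location plus a disparity value) into a 3D point — can be illustrated with the standard stereo pinhole model. The snippet below is a minimal sketch under assumed rectified-stereo geometry; the function name `por_to_3d` and all parameter names are hypothetical and do not come from the paper, whose actual pipeline also involves eye-scene calibration and POR correction.

```python
import numpy as np

def por_to_3d(u, v, disparity, f, baseline, cx, cy):
    """Lift a 2.5D point of regard (pixel (u, v) plus stereo disparity)
    to a 3D point in the camera frame, assuming a rectified stereo pair
    with focal length f (pixels), baseline (meters), and principal
    point (cx, cy). Hypothetical sketch, not the paper's method."""
    if disparity <= 0:
        raise ValueError("disparity must be positive")
    z = f * baseline / disparity   # depth from disparity: Z = f * B / d
    x = (u - cx) * z / f           # back-project pixel column to metric X
    y = (v - cy) * z / f           # back-project pixel row to metric Y
    return np.array([x, y, z])
```

For example, a POR at the principal point with disparity 7 px, f = 700 px and a 6 cm baseline maps to a point 6 m straight ahead. The full system then expresses this camera-frame point in the coordinates of the 3D reconstructed scene using the estimated head pose.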
In [26], Paletta and colleagues propose an interesting solution based on POR projection into an already reconstructed environment, exploiting an RGB-D sensor and the approach of [27]. To localize the PORs in the reconstructed