A Framework for Outdoor Mobile Augmented Reality and Its Application to Mountain Peak Detection
Abstract. Outdoor augmented reality applications project information of interest onto views of the world in real time. Their core challenge is recognizing the meaningful objects present in the current view and retrieving and overlaying pertinent information onto such objects. In this paper we report on the development of a framework for mobile outdoor augmented reality applications, applied to the overlay of peak information onto views of mountain landscapes. The resulting app operates by estimating the virtual panorama visible from the viewpoint of the user, using an online Digital Elevation Model (DEM), and by matching this panorama to the actual image framed by the camera. When a good match is found, meta-data from the DEM (e.g., peak name, altitude, distance) are projected in real time onto the view. The application, besides providing a pleasant experience to the user, can be employed to crowdsource the collection of annotated mountain images for environmental applications.

Keywords: Outdoor augmented reality · Mobile · Real-time · Mountain peak identification · Environment monitoring · Computer vision
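As a rough illustration of the pipeline summarized above, the following sketch (not the authors' implementation) renders a coarse virtual skyline from a DEM grid around the observer and aligns it to a skyline extracted from the camera image by searching over azimuth shifts. The skyline representation, function names, and all parameters are assumptions made for this example.

```python
# Minimal sketch of DEM-based panorama matching, under simplifying assumptions:
# the DEM is a 2D numpy array of elevations on a regular grid with known cell
# size, rows increase southwards, and Earth curvature/refraction are ignored.
import numpy as np

def virtual_skyline(dem, cell_size, obs_row, obs_col, obs_alt,
                    n_azimuths=360, max_range_cells=500):
    """Maximum elevation angle (radians) visible along each azimuth."""
    angles = np.full(n_azimuths, -np.pi / 2)
    for i, az in enumerate(np.linspace(0, 2 * np.pi, n_azimuths, endpoint=False)):
        dr, dc = -np.cos(az), np.sin(az)          # azimuth 0 = north, clockwise
        for step in range(1, max_range_cells):
            r = int(round(obs_row + dr * step))
            c = int(round(obs_col + dc * step))
            if not (0 <= r < dem.shape[0] and 0 <= c < dem.shape[1]):
                break
            dist = step * cell_size
            elev_angle = np.arctan2(dem[r, c] - obs_alt, dist)
            if elev_angle > angles[i]:
                angles[i] = elev_angle
    return angles

def best_azimuth_offset(virtual, observed):
    """Circularly shift the virtual skyline to best fit the observed one.
    `observed` has the same angular resolution as `virtual`; NaNs mark
    azimuths outside the camera field of view."""
    valid = ~np.isnan(observed)
    best_shift, best_err = 0, np.inf
    for shift in range(len(virtual)):
        err = np.mean((np.roll(virtual, shift)[valid] - observed[valid]) ** 2)
        if err < best_err:
            best_shift, best_err = shift, err
    return best_shift, best_err
```

In practice the observed skyline could be obtained with an edge detector applied to the camera frame, and the residual error of the best alignment can serve as the "good match" test mentioned in the abstract.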
1 Introduction
Outdoor augmented reality applications exploit the position and orientation sensors of mobile devices to estimate the location of the user and her field of view, so as to overlay that view with information pertinent to the user's inferred interest. These solutions are finding promising application in the tourism sector, where they replace traditional map-based interfaces with a more sophisticated user experience in which the user automatically receives information about what she is looking at, without the need for manual search. Examples of such AR apps include Metro AR and Lonely Planet's Compass Guides (http://www.lonelyplanet.com/guides). The main challenge of such applications is providing an accurate estimation of the user's current interest, adapted in real time to the changing view. Most commercial applications simplify the problem by estimating the user's interest based only
on the information provided by the device's position and orientation sensors, irrespective of the content actually in view. Examples are sky maps, which show the names of constellations, planets and stars based on the GPS position and compass signal. An obvious limit of these approaches is that they may provide information that does not match well what the user is actually seeing, due to errors in the position and orientation estimation or to the presence of objects partially occluding the view. These limitations, besides jeopardizing the user's experience, prevent the AR application from creating augmented content: if the overlay of the meta-data onto the view is imprecise, the user cannot save a copy of the augmented view, e.g., in the form of an image with captions associated with the objects in view.
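As a concrete illustration of the sensor-only approach discussed above, the sketch below places a labelled peak on the screen using nothing but a GPS fix, a compass heading, and the camera's field of view. All names, parameters, and the flat-Earth vertical approximation are illustrative assumptions, not part of the paper.

```python
# Hedged sketch of sensor-only AR placement: project a peak onto the screen
# from position/orientation data alone, with no analysis of the camera image.
import math

EARTH_RADIUS_M = 6_371_000

def bearing_and_distance(lat1, lon1, lat2, lon2):
    """Initial bearing (degrees from north) and great-circle distance (m)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(p2)
    y = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlon)
    bearing = math.degrees(math.atan2(x, y)) % 360
    a = (math.sin((p2 - p1) / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2)
    dist = 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))
    return bearing, dist

def screen_position(peak, user, heading_deg, pitch_deg,
                    hfov_deg, vfov_deg, width_px, height_px):
    """Return (x, y) pixel coordinates for the peak label, or None if the
    peak falls outside the camera frame. `peak` and `user` are dicts with
    'lat', 'lon', 'alt' (illustrative data layout)."""
    bearing, dist = bearing_and_distance(user["lat"], user["lon"],
                                         peak["lat"], peak["lon"])
    d_az = (bearing - heading_deg + 180) % 360 - 180   # signed azimuth offset
    elev = math.degrees(math.atan2(peak["alt"] - user["alt"], dist)) - pitch_deg
    if abs(d_az) > hfov_deg / 2 or abs(elev) > vfov_deg / 2:
        return None
    x = (0.5 + d_az / hfov_deg) * width_px
    y = (0.5 - elev / vfov_deg) * height_px
    return x, y
```

Any compass error or occluding foreground object shifts or invalidates the computed label position, which is precisely the limitation that the vision-based matching described in this paper is meant to overcome.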