Abstract

Location-based augmented reality (AR) enables in situ visualization of geolocated media in the real world, usually referenced by latitude and longitude only. 6DoF georeferenced photographs, by contrast, are positioned along all three rotational axes (pitch, yaw, roll) in addition to the three perpendicular translation axes (surge, heave, sway), yielding data registered on all six mechanical degrees of freedom of a rigid body in three-dimensional space. In this paper, we present an integration test of 6DoF georeferenced data from the Smapshot open API into BiodivAR, an open authoring tool for location-based AR. While both projects are open source, their data formats are not interoperable, as is the case for most citizen science (CS) projects. After mapping the data and importing it into a new augmented environment, it can be visualized in the mobile AR interface at the locations specified by the georeferenced images. Viewing historical photographs in the AR interface from the locations where they were originally taken aligns them with their original context, resulting in an eye-catching and impactful visualization. This contextual visualization informs and enriches the meaning of historical photographs, and allows viewers to detect patterns and anomalies at a glance. We perform a basic qualitative evaluation, grounded in our own observations, of the visualization made possible by combining 6DoF georeferenced data with our authoring system. We discuss the potential of this proof of concept to foster participation and understanding in citizen science and education, especially with regard to biodiversity monitoring and education.
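To make the data-mapping step concrete, the following is a minimal sketch of converting Smapshot-style 6DoF image records into BiodivAR-style points of interest. The endpoint URL, the response envelope, and all field names on both sides (e.g. `azimuth`, `tilt`, `roll`, `media_url`) are assumptions for illustration only; neither project's actual schema is reproduced here, and the real APIs should be consulted before use.

```python
import json
import urllib.request

# Placeholder endpoint: consult the Smapshot API documentation for the
# actual URL and query parameters (both are assumed here).
SMAPSHOT_ENDPOINT = "https://example.org/smapshot/api/images?georeferenced=true"


def fetch_images(url: str) -> list[dict]:
    """Download a batch of georeferenced image records as JSON."""
    with urllib.request.urlopen(url) as response:
        # "rows" is an assumed response envelope, not the documented one.
        return json.load(response)["rows"]


def to_biodivar_poi(record: dict) -> dict:
    """Map one Smapshot-style record onto a BiodivAR-style POI.

    Field names on both sides are illustrative: the point is that a
    6DoF record carries three translational coordinates (lat/lon/alt)
    plus three rotations (yaw, pitch, roll).
    """
    return {
        "media": record["media_url"],  # assumed field for the image URL
        "position": {                  # three translational DoF
            "latitude": record["latitude"],
            "longitude": record["longitude"],
            "altitude": record.get("altitude", 0.0),
        },
        "rotation": {                  # three rotational DoF
            "yaw": record["azimuth"],  # camera heading, degrees
            "pitch": record["tilt"],
            "roll": record["roll"],
        },
    }


if __name__ == "__main__":
    pois = [to_biodivar_poi(r) for r in fetch_images(SMAPSHOT_ENDPOINT)]
    print(json.dumps(pois, indent=2))
```

Keeping the mapping in a single pure function makes it easy to adapt when either schema changes, and the resulting POI list can be imported into an authoring environment as one JSON document.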
