Abstract
Indirect augmented reality (IAR) employs a unique approach to achieve high-quality synthesis of the real world and the virtual world, unlike traditional augmented reality (AR), which superimposes virtual objects in real time. IAR uses pre-captured omnidirectional images and offline superimposition of virtual objects to achieve jitter- and drift-free geometric registration as well as high-quality photometric registration. However, one drawback of IAR is the inconsistency between the real world and the pre-captured image. In this paper, we present a new classification of IAR inconsistencies and analyze their effect on the IAR experience. Accordingly, we propose a novel IAR system that reflects real-world illumination changes by selecting an appropriate image from among multiple pre-captured images obtained under various illumination conditions. The results of experiments conducted at an actual historical site show that considering real-world illumination changes improves the realism of the IAR experience.
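The selection step described above can be sketched as follows. This is a minimal illustration, not the paper's actual method: it assumes illumination is summarized by a coarse luminance histogram and that the pre-captured image with the smallest histogram distance to the live camera frame is chosen; the function names `illumination_descriptor` and `select_best_image` are hypothetical.

```python
import numpy as np

def illumination_descriptor(image, bins=16):
    """Coarse luminance histogram of an RGB image (H x W x 3, values in [0, 1])."""
    # Rec. 709 luma weights convert RGB to a scalar luminance per pixel.
    luminance = image @ np.array([0.2126, 0.7152, 0.0722])
    hist, _ = np.histogram(luminance, bins=bins, range=(0.0, 1.0), density=True)
    return hist

def select_best_image(live_frame, captured_images):
    """Return the index of the pre-captured omnidirectional image whose
    illumination descriptor is closest (L1 distance) to the live frame."""
    target = illumination_descriptor(live_frame)
    distances = [np.abs(illumination_descriptor(img) - target).sum()
                 for img in captured_images]
    return int(np.argmin(distances))
```

In a real system the descriptor would likely be more robust (e.g., restricted to sky regions or combined with time-of-day metadata), but the structure — precomputed descriptors for the image set, nearest-descriptor lookup at runtime — stays the same.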
Notes
These models were created with the cooperation of the Graduate School of Media Design, Keio University.
The English translation in the earlier paper [1] did not accurately preserve the meaning of the original questions (which were written in Japanese); we have therefore revised the English sentences to match the original meaning.
These models were created with reference to the miniature model of Todaiji with architectural validation by Yumiko Fukuda, Hiroshima Institute of Technology, and patina expression technology validation by Takeaki Nakajima, Hiroshima City University.
References
Akaguma T, Okura F, Sato T, Yokoya N (2013) Mobile AR using pre-captured omnidirectional images. In: Proc. ACM SIGGRAPH Asia’13 Symp. on Mobile Graphics and Interactive Applications, pp 26:1–26:4
Anguelov D, Dulong C, Filip D, Frueh C, Lafon S, Lyon R, Ogale A, Vincent L, Weaver J (2010) Google street view: Capturing the world at street level. IEEE Computer Magazine 43(6):32–38
Arai I, Hori M, Kawai N, Abe Y, Ichikawa M, Satonaka Y, Nitta T, Nitta T, Fujii H, Mukai M, Hiromi S, Makita K, Kanbara M, Nishio N, Yokoya N (2010) Pano UMECHIKA: A crowded underground city panoramic view system. In: Proc. Int’l Symp. on Distributed Computing and Artificial Intelligence (DCAI’10), pp 173–180
Aydin T, Smolic A, Gross M (2015) Automated aesthetic analysis of photographic images. IEEE Trans Vis Comput Graph 21(1):31–42
Azuma R (1997) A survey of augmented reality. Presence: Teleoperators and Virtual Environments 6(4):355–385
Chaurasia G, Duchene S, Sorkine-Hornung O, Drettakis G (2013) Depth synthesis and local warps for plausible image-based navigation. ACM Trans Graph 32(3):30:1–30:12
Chen SE (1995) Quicktime VR: An image-based approach to virtual environment navigation. In: Proc. ACM SIGGRAPH’95, pp 29–38
Côté S, Trudel P, Desbiens M, Giguère M, Snyder R (2013) Live mobile panoramic high accuracy augmented reality for engineering and construction. In: Proc. 13th Int’l Conf. on Construction Applications of Virtual Reality (CONVR’13), pp 262–271
Debevec P (1998) Rendering synthetic objects into real scenes: Bridging traditional and image-based graphics with global illumination and high dynamic range photography. In: Proc. ACM SIGGRAPH’98, pp 189–198
Grosch T (2005) PanoAR: Interactive augmentation of omnidirectional images with consistent lighting. In: Proc. Computer Vision/Computer Graphics Collaboration Techniques and Applications (Mirage’05), pp 25–34
Gruber L, Kalkofen D, Schmalstieg D (2010) Color harmonization for augmented reality. In: Proc. 9th IEEE Int’l Symp. on Mixed and Augmented Reality (ISMAR’10), pp 227–228
Gruber L, Richter-Trummer T, Schmalstieg D (2012) Real-time photometric registration from arbitrary geometry. In: Proc. 11th IEEE Int’l Symp. on Mixed and Augmented Reality (ISMAR’12), pp 119–128
Hollerer T, Feiner S, Pavlik J (1999) Situated documentaries: Embedding multimedia presentations in the real world. In: Proc. 3rd IEEE Int’l Symp. on Wearable Computers (ISWC’99), pp 79–86
Kán P, Kaufmann H (2012) High-quality reflections, refractions, and caustics in augmented reality and their contribution to visual coherence. In: Proc. 11th IEEE Int’l Symp. on Mixed and Augmented Reality (ISMAR’12), pp 99–108
Kanbara M, Yokoya N (2002) Geometric and photometric registration for real-time augmented reality. In: Proc. First Int’l Symp. on Mixed and Augmented Reality (ISMAR’02), pp 279–280
Kawai N, Sato T, Yokoya N (2009) Image inpainting considering brightness change and spatial locality of textures and its evaluation. In: Proc. Third Pacific-Rim Symp. on Image and Video Technology (PSIVT’09), pp 271–282
Klein G, Murray D (2009) Parallel tracking and mapping on a camera phone. In: Proc. 8th IEEE Int’l Symp. on Mixed and Augmented Reality (ISMAR’09), pp 83–86
Laffont P Y, Bousseau A, Drettakis G (2013) Rich intrinsic image decomposition of outdoor scenes from multiple views. IEEE Trans Vis Comput Graph 19(2):210–224
Langlotz T, Degendorfer C, Mulloni A, Schall G, Reitmayr G, Schmalstieg D (2011) Robust detection and tracking of annotations for outdoor augmented reality browsing. Computers & Graphics 35(4):831–840
Lensing P, Broll W (2012) Instant indirect illumination for dynamic mixed reality scenes. In: Proc. 11th IEEE Int’l Symp. on Mixed and Augmented Reality (ISMAR’12), pp 109–118
Liestol G, Morrison A (2013) Views, alignment and incongruity in indirect augmented reality. In: Proc. 12th IEEE Int’l Symp. on Mixed and Augmented Reality (ISMAR’13)—Arts, Media, and Humanities, pp 23–28
Madsen JB, Stenholt R (2014) How wrong can you be: Perception of static orientation errors in mixed reality. In: Proc. IEEE Symp. on 3D User Interfaces (3DUI’14), pp 83–90
Mori M, MacDorman K, Kageki N (2012) The uncanny valley [from the field]. IEEE Robot Autom Mag 19(2):98–100
Okura F, Akaguma T, Sato T, Yokoya N (2014) Indirect augmented reality considering real-world illumination change. In: Proc. IEEE Int’l Symp. on Mixed and Augmented Reality (ISMAR’14), pp 287–288
Okura F, Kanbara M, Yokoya N (2015) Mixed-reality world exploration using image-based rendering. ACM J Comput Cult Herit 8(2):9:1–9:26
Schops T, Enge J, Cremers D (2014) Semi-dense visual odometry for AR on a smartphone. In: Proc. 2014 IEEE Int’l Symp. on Mixed and Augmented Reality (ISMAR’14), pp 145–150
Tenmoku R, Kanbara M, Yokoya N (2003) A wearable augmented reality system using positioning infrastructures and a pedometer. In: Proc. 7th IEEE Int’l Symp. on Wearable Computers (ISWC’03), pp 110–117
Uyttendaele M, Criminisi A, Kang S B, Winder S, Szeliski R, Hartley R (2004) Image-based interactive exploration of real-world environments. IEEE Comput Graph Appl 24(3):52–63
Ventura J, Hollerer T (2012) Wide-area scene mapping for mobile visual tracking. In: Proc. 11th IEEE Int’l Symp. on Mixed and Augmented Reality (ISMAR’12), pp 3–12
Waegel K (2014) A reconstructive see-through display. In: Proc. IEEE Int’l Symp. on Mixed and Augmented Reality (ISMAR’14), pp 379–380
Wither J, Tsai Y T, Azuma R (2011) Indirect augmented reality. Comput Graph 35(4):810–822
Yamamoto G, Lübke AIW, Taketomi T, Kato H (2014) A see-through vision with handheld augmented reality for sightseeing. In: Proc. 16th Int’l Conf. on Human-Computer Interaction (HCI International’14), pp 392–399
Zhou F, Duh HBL, Billinghurst M (2008) Trends in augmented reality tracking, interaction and display: A review of ten years of ISMAR. In: Proc. 7th IEEE/ACM Int’l Symp. on Mixed and Augmented Reality (ISMAR’08), pp 193–202
Zoellner M, Keil J, Drevensek T, Wuest H (2009) Cultural heritage layers: Integrating historic media in augmented reality. In: Proc. 15th Int’l Conf. on Virtual Systems and Multimedia (VSMM’09), pp 193–196
Acknowledgments
We first thank the anonymous reviewers for their constructive comments and suggestions. We thank the members of the NAIST-Keio joint research group. We also thank Todaiji for giving us the opportunity to conduct public experiments.
Additional information
This research was partially supported by JSPS KAKENHI 23240024, 26330193, 15H06362, 25-7448, and by the NAIST Advanced Research Partnership Project.
Cite this article
Okura, F., Akaguma, T., Sato, T. et al. Addressing temporal inconsistency in indirect augmented reality. Multimed Tools Appl 76, 2671–2695 (2017). https://doi.org/10.1007/s11042-015-3222-0