ABSTRACT
Mixed reality applications provide enhanced interaction experiences by integrating virtual and real-world objects in a shared environment. Compared with traditional keyboard-and-mouse input, a mixed reality interface offers a more natural and immersive style of control. The interface proposed in this paper uses a stereo camera to track the user's hands and fingers robustly and accurately in 3D space. To make the interaction physically realistic, a physics engine simulates the manipulation of virtual objects: objects can be picked up and tossed subject to real-world physical effects such as gravity and collisions. Detection and interaction in our system are fully vision based, requiring no markers or additional sensors. We demonstrate this gesture-based interface with two mixed reality games: finger fishing, in which a player fishes for virtual objects with his/her fingers as in a real environment, and Jenga, a simulation of the well-known tower-building game. A user study is reported to evaluate the accuracy, effectiveness, and comfort of the interface.
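The abstract notes that a physics engine handles gravity and collisions when a virtual object is picked up and tossed. The paper does not give its implementation; the sketch below is only an illustrative stand-in for what such an engine computes for a single tossed object, using simple Euler integration against a ground plane with a restitution coefficient (all constants here are assumed values, not taken from the paper).

```python
# Illustrative sketch (not the paper's implementation) of the physics an
# engine such as Bullet or ODE would simulate when a tracked finger
# releases a virtual object: the object follows gravity, collides with
# the ground plane, and bounces with energy loss.

GRAVITY = -9.81      # m/s^2, downward (assumed)
RESTITUTION = 0.5    # fraction of speed kept after a ground collision (assumed)
DT = 0.01            # integration time step in seconds (assumed)

def toss(height, vy, steps):
    """Integrate a 1D toss; returns the final (height, velocity)."""
    y, v = height, vy
    for _ in range(steps):
        v += GRAVITY * DT          # gravity accelerates the object
        y += v * DT                # advance the position
        if y <= 0.0:               # collision with the ground plane
            y = 0.0
            v = -v * RESTITUTION   # bounce, damped by restitution
    return y, v
```

A full engine generalizes this loop to 3D rigid bodies with arbitrary collision shapes, which is why the authors delegate it rather than integrating the dynamics by hand.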