DOI: 10.1145/1477862.1477871
Research Article

Vision-based 3D finger interactions for mixed reality games with physics simulation

Published: 08 December 2008

ABSTRACT

Mixed reality applications can provide users with enhanced interaction experiences by integrating virtual and real-world objects in a mixed environment. Through the mixed reality interface, a more realistic and immersive control style is achieved than with traditional keyboard and mouse input devices. The interface proposed in this paper consists of a stereo camera that tracks the user's hands and fingers robustly and accurately in 3D space. To enable a physically realistic interaction experience, a physics engine is adopted for simulating the physics of virtual object manipulation. Objects can be picked up and tossed with real-world physical behavior, such as gravity and collisions. Detection and interaction in our system are based entirely on computer vision, without any markers or additional sensors. We demonstrate this gesture-based interface with two mixed reality game implementations: finger fishing, in which a player fishes for virtual objects with his/her fingers as in a real environment, and Jenga, a simulation of the well-known tower-building game. A user study is conducted and reported to demonstrate the accuracy, effectiveness, and comfort of this interactive interface.
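The geometric core of a stereo-camera fingertip tracker like the one the abstract describes is depth recovery from disparity. The sketch below is an illustrative reconstruction, not the paper's implementation: the function name, focal length, baseline, and pixel coordinates are all assumed values, and image coordinates are assumed to be rectified and centered on the principal point.

```python
def triangulate_fingertip(xl, yl, xr, f=700.0, baseline=0.12):
    """Recover a fingertip's 3D position (metres) from a rectified stereo pair.

    xl, yl : fingertip pixel coordinates in the left image (principal-point
             centered); xr : fingertip column in the right image.
    f      : focal length in pixels; baseline : camera separation in metres.
    (All parameter values here are illustrative assumptions.)
    """
    d = xl - xr  # disparity; larger disparity means a closer point
    if d <= 0:
        raise ValueError("non-positive disparity: point at or beyond infinity")
    Z = f * baseline / d       # depth from similar triangles: Z = f*B/d
    X = xl * Z / f             # back-project through the left camera
    Y = yl * Z / f
    return X, Y, Z

# A fingertip seen 20 px apart in the two views lies 4.2 m from the cameras:
X, Y, Z = triangulate_fingertip(xl=100.0, yl=50.0, xr=80.0)
```

Once fingertip positions are known in 3D, they can be fed as kinematic constraints into a physics engine (the paper's references survey Bullet, Havok, ODE, Newton, and Tokamak) so that grasped virtual objects respond with gravity and collisions.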


Supplemental Material

a7-song.wmv (WMV, 30.7 MB)




• Published in

    VRCAI '08: Proceedings of the 7th ACM SIGGRAPH International Conference on Virtual-Reality Continuum and Its Applications in Industry
    December 2008
    223 pages
    ISBN: 9781605583358
    DOI: 10.1145/1477862

    Copyright © 2008 ACM

    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    Publisher

    Association for Computing Machinery

    New York, NY, United States




Acceptance Rates

Overall Acceptance Rate: 51 of 107 submissions, 48%

