Research Article
DOI: 10.1145/3225153.3225157

A comparison of eye-head coordination between virtual and physical realities

Published: 10 August 2018

ABSTRACT

Past research has shown that humans exhibit characteristic eye-head responses to the appearance of visual stimuli, and that these natural reactions change across different activities. Our work builds on these observations by offering new insight into how humans behave in Virtual Reality (VR) compared to Physical Reality (PR). Using eye- and head-tracking technology, and by conducting a study with two groups of users - participants in either VR or PR - we identify how often these natural responses are observed in each environment. We find that users move their heads significantly more often when viewing stimuli in VR than in PR, and that VR users also move their heads more in the presence of text. We open a discussion on identifying the HWD factors that cause this difference, as it may affect not only predictive models that use eye movements as features, but also the VR user experience overall.


Published in
SAP '18: Proceedings of the 15th ACM Symposium on Applied Perception
August 2018, 162 pages
ISBN: 9781450358941
DOI: 10.1145/3225153

        Copyright © 2018 ACM


        Publisher

        Association for Computing Machinery

        New York, NY, United States


        Acceptance Rates

Overall Acceptance Rate: 43 of 94 submissions, 46%
