ABSTRACT
Past research has shown that humans exhibit characteristic eye-head responses to the appearance of visual stimuli, and that these natural reactions change across activities. Our work builds on these observations by offering new insight into how humans behave in Virtual Reality (VR) compared to Physical Reality (PR). Using eye- and head-tracking technology, and by conducting a study on two groups of users, one in VR and one in PR, we identify how often these natural responses are observed in each environment. We find that users move their heads significantly more often when viewing stimuli in VR than in PR, and that VR users also move their heads more in the presence of text. We open a discussion on identifying the head-worn display (HWD) factors that cause this difference, as it may affect not only predictive models that use eye movements as features, but also the VR user experience overall.
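The abstract does not state which statistical test underlies the between-groups comparison. As a minimal, purely illustrative sketch (the participant counts and values below are made up, and Welch's t-test is an assumption, not the paper's stated method), a comparison of head-movement frequency between independent VR and PR groups could look like:

```python
# Hypothetical sketch: comparing head-movement counts between two
# independent groups (VR vs. PR) using Welch's t-test.
# All data below are invented for illustration only.
from statistics import mean, variance
from math import sqrt

def welch_t(a, b):
    """Welch's t statistic for two independent samples
    (does not assume equal variances)."""
    va, vb = variance(a), variance(b)
    na, nb = len(a), len(b)
    return (mean(a) - mean(b)) / sqrt(va / na + vb / nb)

# Head movements per trial, one value per participant (illustrative).
vr = [14, 17, 12, 19, 15, 18, 16, 20]
pr = [9, 11, 8, 12, 10, 7, 13, 10]

t = welch_t(vr, pr)
print(f"Welch t = {t:.2f}")  # positive t: VR group moved heads more often
```

In practice one would also report degrees of freedom and a p-value (e.g. via `scipy.stats.ttest_ind(vr, pr, equal_var=False)`); the pure-stdlib version above only shows the shape of the comparison.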
A comparison of eye-head coordination between virtual and physical realities