DOI: 10.1145/3290605.3300842
CHI Conference Proceedings · Research Article

Resolving Target Ambiguity in 3D Gaze Interaction through VOR Depth Estimation

Published: 02 May 2019

ABSTRACT

Target disambiguation is a common problem in gaze interfaces, as eye tracking has limited accuracy and precision. In 3D environments this is compounded by objects overlapping in the field of view as a result of their positioning at different depths with partial occlusion. We introduce VOR depth estimation, a method based on the vestibulo-ocular reflex of the eyes in compensation for head movement, and explore its application to resolving target ambiguity. The method estimates gaze depth by comparing the rotations of the eye and the head while users look at a target and deliberately rotate their head. We show that VOR eye movement presents an alternative to vergence for gaze depth estimation, one that is also feasible with monocular tracking. In an evaluation of its use for target disambiguation, our method outperforms vergence for targets presented at greater depth.
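The core idea, comparing eye and head rotations during a deliberate head turn, can be illustrated with a small-angle geometric sketch. This is our illustration of the principle, not the paper's exact algorithm: because the eyes sit some distance r in front of the head's rotation axis (a value we assume below), the compensatory eye rotation for a target at depth d exceeds the head rotation by roughly a factor 1 + r/d, so inverting the measured VOR gain g yields d = r / (g - 1).

```python
import numpy as np

# Assumed offset from the head's rotation axis to the eye centre
# (hypothetical value chosen for illustration).
EYE_AXIS_OFFSET_M = 0.10

def estimate_gaze_depth(head_yaw, eye_yaw):
    """Estimate gaze depth from paired head/eye yaw samples (radians).

    Small-angle VOR model: compensatory eye-in-head rotation exceeds
    head rotation by a factor (1 + r/d), so the VOR gain g = eye/head
    gives d = r / (g - 1).
    """
    # Least-squares slope of eye-in-head rotation against head rotation.
    gain = np.polyfit(head_yaw, eye_yaw, 1)[0]
    if gain <= 1.0:
        # Gain at or below 1 means no measurable extra rotation:
        # the target is effectively at optical infinity.
        return float('inf')
    return EYE_AXIS_OFFSET_M / (gain - 1.0)
```

For example, a recorded VOR gain of 1.2 with a 10 cm eye offset would place the target at 0.10 / 0.2 = 0.5 m. Note this sketch is monocular, which mirrors the paper's point that VOR-based depth, unlike vergence, does not require binocular tracking.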


Supplemental Material: pn4657.mp4 (mp4, 21.8 MB)


Published in: CHI '19: Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, May 2019, 9077 pages
ISBN: 9781450359702
DOI: 10.1145/3290605
Copyright © 2019 ACM
Publisher: Association for Computing Machinery, New York, NY, United States

Acceptance Rates: CHI '19 paper acceptance rate: 703 of 2,958 submissions (24%). Overall acceptance rate: 6,199 of 26,314 submissions (24%).
