DOI: 10.1145/3173574.3173655
CHI Conference Proceedings · Research Article · Best Paper

Pinpointing: Precise Head- and Eye-Based Target Selection for Augmented Reality

Published: 19 April 2018

ABSTRACT

Head and eye movement can be leveraged to improve the user's interaction repertoire for wearable displays. Head movements are deliberate and accurate, and provide the current state-of-the-art pointing technique. Eye gaze can potentially be faster and more ergonomic, but suffers from low accuracy due to calibration errors and drift of wearable eye-tracking sensors. This work investigates precise, multimodal selection techniques using head motion and eye gaze. A comparison of speed and pointing accuracy reveals the relative merits of each method, including the achievable target size for robust selection. We demonstrate and discuss example applications for augmented reality, including compact menus with deep structure, and a proof-of-concept method for on-line correction of calibration drift.
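The selection scheme the abstract describes — a fast but drift-prone gaze estimate refined by deliberate, accurate head motion, with successful selections feeding back into calibration — can be illustrated with a short sketch. The class below is a hypothetical illustration, not the authors' implementation: the class name, the exponential-moving-average drift update, and the smoothing and gain parameters are all assumptions made for the example.

```python
import numpy as np

class GazeHeadPointer:
    """Hypothetical gaze-first, head-refined pointer with on-line drift
    correction. All names and parameters are illustrative assumptions,
    not the method from the paper."""

    def __init__(self, smoothing=0.8):
        self.drift_offset = np.zeros(2)  # running calibration-drift estimate (deg)
        self.smoothing = smoothing       # EMA weight: higher = slower adaptation
        self.cursor = np.zeros(2)        # cursor position in visual angle (deg)

    def warp_to_gaze(self, gaze_xy):
        # Coarse phase: jump the cursor to the drift-corrected gaze estimate.
        self.cursor = np.asarray(gaze_xy, dtype=float) - self.drift_offset

    def refine_with_head(self, head_delta_xy, gain=1.0):
        # Fine phase: deliberate head rotation nudges the cursor precisely.
        self.cursor = self.cursor + gain * np.asarray(head_delta_xy, dtype=float)

    def confirm_selection(self, gaze_xy):
        # The head-refined cursor marks the true target, so the residual
        # between the raw gaze sample and the cursor is a drift observation;
        # fold it into the running offset with an exponential moving average.
        residual = np.asarray(gaze_xy, dtype=float) - self.cursor
        self.drift_offset = (self.smoothing * self.drift_offset
                             + (1.0 - self.smoothing) * residual)
        return self.cursor

# Example: warp to a noisy gaze sample, correct with head motion, confirm.
pointer = GazeHeadPointer()
pointer.warp_to_gaze([10.2, -3.1])     # gaze sample in degrees, off-target
pointer.refine_with_head([-0.4, 0.2])  # small corrective head rotation
target = pointer.confirm_selection([10.2, -3.1])
```

In this sketch every confirmed selection doubles as an implicit recalibration sample, loosely mirroring the proof-of-concept on-line drift correction mentioned in the abstract.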

Supplemental Material

pn1537-file5.mp4 (mp4, 10 MB)
pn1537.mp4 (mp4, 273.3 MB)


Published in

CHI '18: Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems
April 2018, 8489 pages
ISBN: 9781450356206
DOI: 10.1145/3173574
Copyright © 2018 ACM

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher: Association for Computing Machinery, New York, NY, United States

Acceptance Rates

CHI '18 paper acceptance rate: 666 of 2,590 submissions (26%)
CHI overall acceptance rate: 6,199 of 26,314 submissions (24%)
