ABSTRACT
Head and eye movement can be leveraged to expand the user's interaction repertoire on wearable displays. Head movements are deliberate and accurate, and currently provide the state-of-the-art pointing technique. Eye gaze is potentially faster and more ergonomic, but suffers from low accuracy due to calibration errors and drift of wearable eye-tracking sensors. This work investigates precise, multimodal selection techniques that combine head motion and eye gaze. A comparison of speed and pointing accuracy reveals the relative merits of each method, including the achievable target size for robust selection. We demonstrate and discuss example applications for augmented reality, including compact menus with deep structure, and a proof-of-concept method for on-line correction of calibration drift.
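The two ideas the abstract names, refining a coarse eye-gaze point with precise head motion and correcting calibration drift on-line from confirmed selections, can be sketched in code. The following is a minimal illustration only, not the paper's implementation: the `GazeHeadPointer` class, the fixed `head_gain`, and the exponential-moving-average drift corrector are all assumptions made for the sketch.

```python
import numpy as np


class GazeHeadPointer:
    """Sketch of gaze-coarse / head-fine target selection with on-line
    drift correction. Hypothetical design: head rotation refines the
    cursor through a fixed gain, and the offset observed at each
    confirmed selection is folded into a drift estimate via an
    exponential moving average. None of this is taken from the paper."""

    def __init__(self, head_gain=0.3, drift_alpha=0.1):
        self.head_gain = head_gain      # scales head rotation to cursor offset
        self.drift_alpha = drift_alpha  # EMA weight for drift updates
        self.drift = np.zeros(2)        # estimated gaze calibration offset (deg)
        self.refine = np.zeros(2)       # accumulated head refinement (deg)

    def begin_refinement(self):
        """Call when the user dwells near a target: from here on,
        head motion nudges the cursor around the gaze point."""
        self.refine = np.zeros(2)

    def update(self, gaze_deg, head_delta_deg):
        """Return the cursor position: drift-corrected gaze plus the
        head-motion refinement accumulated since begin_refinement()."""
        self.refine += self.head_gain * np.asarray(head_delta_deg, dtype=float)
        return np.asarray(gaze_deg, dtype=float) - self.drift + self.refine

    def confirm_selection(self, gaze_deg, target_deg):
        """On a confirmed selection, the gap between the raw gaze point
        and the actual target is evidence of drift; blend it in."""
        error = np.asarray(gaze_deg, dtype=float) - np.asarray(target_deg, dtype=float)
        self.drift = (1 - self.drift_alpha) * self.drift + self.drift_alpha * error


if __name__ == "__main__":
    pointer = GazeHeadPointer()
    pointer.begin_refinement()
    # Gaze lands ~1 deg off the target; a small head turn nudges the cursor in.
    cursor = pointer.update(gaze_deg=(10.0, -2.0), head_delta_deg=(-0.5, 0.2))
    pointer.confirm_selection(gaze_deg=(10.0, -2.0), target_deg=(9.2, -1.8))
    print(cursor, pointer.drift)
```

Under this scheme, repeated selections gradually pull the drift estimate toward the true calibration offset, so the head-motion corrections the user must make shrink over time.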