
Assessing hands-free interactions for VR using eye gaze and electromyography

  • Original Article
  • Published in Virtual Reality 23, 119–131 (2019)

Abstract

With the increasing popularity of virtual reality (VR) technologies, growing effort has gone into developing new input methods. While physical controllers are widely used, novel techniques such as eye tracking are now commercially available. In our work, we investigate the use of physiological signals as input to enhance VR experiences. We present a system that uses gaze tracking together with electromyography (EMG) on a user’s forearm to make selection tasks in virtual spaces more efficient. In a study with 16 participants, we compared five input techniques in a Fitts’ law task: using gaze tracking for cursor movement in combination with forearm contractions for making selections was superior to using an HTC Vive controller, an Xbox gamepad, dwelling time, and eye-gaze dwelling time. To explore application scenarios and collect qualitative feedback, we further developed and evaluated a game based on our input technique. Our findings inform the design of applications that use eye-gaze tracking and forearm muscle movements for effective user input in VR.
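To make the interaction concrete: in the described technique, eye gaze drives the cursor while a deliberate forearm contraction confirms the selection. The Python sketch below approximates such a trigger by thresholding the RMS of a sliding window of EMG samples. It is a minimal illustration only; the class name, window length, threshold, and hysteresis factor are our own assumptions, not parameters reported in the paper.

```python
from collections import deque

class EmgSelectTrigger:
    """Fires once when the RMS of a sliding EMG window crosses a threshold.

    Window length, threshold, and hysteresis factor are illustrative
    values, not parameters taken from the paper.
    """

    def __init__(self, window: int = 50, threshold: float = 0.30):
        self.samples = deque(maxlen=window)
        self.threshold = threshold
        self.armed = True  # require relaxation before the next trigger

    def update(self, sample: float) -> bool:
        """Feed one normalized EMG sample; return True on contraction onset."""
        self.samples.append(sample)
        rms = (sum(s * s for s in self.samples) / len(self.samples)) ** 0.5
        if self.armed and rms > self.threshold:
            self.armed = False
            return True  # confirm selection at the current gaze cursor
        if not self.armed and rms < 0.8 * self.threshold:
            self.armed = True  # hysteresis: re-arm after the muscle relaxes
        return False
```

Likewise, since the comparison of the five techniques rests on a Fitts’ law task, the following minimal sketch shows how such tasks are commonly scored: the Shannon formulation of the index of difficulty, divided by movement time, yields per-trial throughput in bits per second. The sample trial data are hypothetical.

```python
import math

def index_of_difficulty(distance: float, width: float) -> float:
    """Shannon formulation of Fitts' index of difficulty, in bits."""
    return math.log2(distance / width + 1)

def throughput(distance: float, width: float, movement_time_s: float) -> float:
    """Throughput in bits per second for a single trial."""
    return index_of_difficulty(distance, width) / movement_time_s

# Hypothetical trials: (target distance, target width, movement time in seconds)
trials = [(0.60, 0.05, 1.10), (0.30, 0.10, 0.62), (0.90, 0.05, 1.45)]
mean_tp = sum(throughput(d, w, t) for d, w, t in trials) / len(trials)
print(f"mean throughput: {mean_tp:.2f} bit/s")
```

Averaging throughput per technique across participants is the usual basis for the kind of between-technique comparison the abstract describes.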




Acknowledgements

This work was supported by JSPS KAKENHI Grant Number 18H03278.

Author information


Corresponding author

Correspondence to Yun Suen Pai.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary material 1 (MP4 37,248 KB)


About this article


Cite this article

Pai, Y.S., Dingler, T. & Kunze, K. Assessing hands-free interactions for VR using eye gaze and electromyography. Virtual Reality 23, 119–131 (2019). https://doi.org/10.1007/s10055-018-0371-2

