ABSTRACT
Eye tracking is a promising input modality for interactive tabletops. However, issues such as eyelid occlusion and oblique viewing angles at distant positions pose significant challenges for remote gaze tracking in this setting. We present the results of two studies that explore how gaze interaction can be enabled on tabletops. Our first study contributes an empirical investigation of gaze accuracy on a large horizontal surface, finding gaze to be unusable close to the user (due to eyelid occlusion), accurate at arm's length, and precise only horizontally at large distances. In light of these results, we propose two solutions for the design of interactive systems that utilise remote gaze tracking on the tabletop: multimodal segmentation, and the use of X-Gaze, our novel technique for interacting with out-of-reach objects. Our second study evaluates and validates both solutions in a Video-on-Demand application, presenting immediate opportunities for remote-gaze interaction on horizontal surfaces.
Index Terms
- Multimodal Segmentation on a Large Interactive Tabletop: Extending Interaction on Horizontal Surfaces with Gaze