research-article

Understanding pointing for workspace tasks on large high-resolution displays

Published: 26 November 2019

ABSTRACT

Navigating on large high-resolution displays (LHRDs) using devices built for traditional desktop computers can be strenuous and can negatively impact user experience. As LHRDs transition to everyday use, new user-friendly interaction techniques need to be designed to capitalise on the potential offered by the abundant screen space on LHRDs. We conducted a study that compared mouse pointing and eye-tracker-assisted pointing (MAGIC pointing) on LHRDs. In a controlled experiment with 35 participants, we investigated user performance in a one-dimensional pointing task and a map-based search task. We found that MAGIC pointing had a lower throughput, yet participants perceived it as the higher-performing technique. Our work contributes insights for the design of pointing techniques for LHRDs. The results indicate that the choice of technique is scenario-dependent, in contrast to desktop computers.
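The abstract compares the two techniques by throughput, the standard Fitts'-law performance measure defined in ISO 9241-411 for pointing-device evaluation. As a minimal sketch of how that metric is conventionally computed from trial data (the function name and the sample values below are illustrative, not taken from the paper):

```python
import math
import statistics

def effective_throughput(movement_distances, endpoint_deviations, movement_times):
    """ISO 9241-411 effective throughput in bits/s.

    movement_distances: actual cursor travel per trial (same units as deviations)
    endpoint_deviations: signed selection offsets from the target centre
    movement_times: trial durations in seconds
    """
    # Effective width: 4.133 x standard deviation of endpoint deviations,
    # i.e. the width covering ~96% of selections for a normal distribution.
    we = 4.133 * statistics.stdev(endpoint_deviations)
    # Effective amplitude: mean distance actually moved.
    ae = statistics.mean(movement_distances)
    # Effective index of difficulty (Shannon formulation), in bits.
    ide = math.log2(ae / we + 1.0)
    # Throughput = IDe over mean movement time.
    return ide / statistics.mean(movement_times)
```

With hypothetical trials of 500 px nominal distance, small endpoint scatter, and 0.8 s movement times, this yields a throughput of roughly 5 bits/s, in the range typically reported for mice.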


Published in

MUM '19: Proceedings of the 18th International Conference on Mobile and Ubiquitous Multimedia
November 2019, 462 pages
ISBN: 9781450376242
DOI: 10.1145/3365610
Copyright © 2019 ACM


Publisher: Association for Computing Machinery, New York, NY, United States


Acceptance Rates

Overall Acceptance Rate: 190 of 465 submissions, 41%
