
A dataset of head and eye movements for 360° videos

Published: 12 June 2018
DOI: 10.1145/3204949.3208139

ABSTRACT

Research on visual attention in 360° content is crucial to understand how people perceive and interact with this immersive type of content, to develop efficient techniques for processing, encoding, delivering and rendering it, and to offer a high quality of experience to end users. The availability of public datasets is essential to support and facilitate the research activities of the community. Recently, some studies analyzing the exploration behavior of people watching 360° videos have been presented, and a few datasets have been published. However, most of these works consider only head movements as a proxy for gaze data, despite the importance of eye movements in the exploration of omnidirectional content. This paper therefore presents a novel dataset of 360° videos with associated eye and head movement data, as a follow-up to our previous dataset for still images [14]. Head and eye tracking data was obtained from 57 participants during a free-viewing experiment with 19 videos. In addition, guidelines on how to obtain saliency maps and scanpaths from the raw data are provided, and some statistics on exploration behavior are presented; in particular, the impact of the longitudinal starting position when watching omnidirectional videos was investigated. The dataset and its associated code are made publicly available to support research on visual attention for 360° content.
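As a rough illustration of the kind of processing such guidelines describe, the sketch below shows one plausible way to turn raw gaze samples into a scanpath and a saliency map: fixations are detected with a simple velocity threshold in the spirit of Salvucci and Goldberg [18], then accumulated on an equirectangular grid and blurred with a Gaussian. This is a minimal sketch, not the dataset's released code; the function names, the 80°/s threshold, the 2048×1024 map resolution and the 2° blur width are all illustrative assumptions.

```python
# Minimal sketch (not the authors' released code) of turning raw gaze samples
# into a scanpath and a saliency map. Fixations are detected with a simple
# velocity threshold in the spirit of Salvucci & Goldberg [18]; all names and
# parameter values below are illustrative assumptions.
import numpy as np
from scipy.ndimage import gaussian_filter


def angular_speed(lon, lat, sample_rate_hz):
    """Angular speed (deg/s) between consecutive gaze samples on the sphere."""
    lon_r, lat_r = np.radians(lon), np.radians(lat)
    # Great-circle distance between consecutive samples; using cos(dlon)
    # makes the 0/360 degree wrap-around harmless here.
    cos_d = (np.sin(lat_r[:-1]) * np.sin(lat_r[1:])
             + np.cos(lat_r[:-1]) * np.cos(lat_r[1:]) * np.cos(np.diff(lon_r)))
    return np.degrees(np.arccos(np.clip(cos_d, -1.0, 1.0))) * sample_rate_hz


def detect_fixations(lon, lat, sample_rate_hz, vel_thresh=80.0):
    """I-VT-style grouping: runs of samples below the velocity threshold
    become fixations. Returns the scanpath as (mean_lon, mean_lat) pairs.
    (The naive mean ignores the 0/360 wrap; fine for illustration.)"""
    slow = np.concatenate(
        [[True], angular_speed(lon, lat, sample_rate_hz) < vel_thresh])
    fixations, start = [], None
    for i, s in enumerate(slow):
        if s and start is None:
            start = i
        elif not s and start is not None:
            fixations.append((lon[start:i].mean(), lat[start:i].mean()))
            start = None
    if start is not None:
        fixations.append((lon[start:].mean(), lat[start:].mean()))
    return fixations


def saliency_map(fixations, width=2048, height=1024, sigma_deg=2.0):
    """Accumulate fixations on an equirectangular grid and blur with a
    Gaussian of roughly 2 degrees of visual angle (a common choice; the
    paper's exact parameters may differ)."""
    sal = np.zeros((height, width))
    for lon, lat in fixations:
        x = int((lon % 360.0) / 360.0 * width) % width
        y = int(np.clip((90.0 - lat) / 180.0 * height, 0, height - 1))
        sal[y, x] += 1.0
    sigma_px = sigma_deg / 360.0 * width
    # Wrap horizontally (longitude), pad vertically (latitude).
    sal = gaussian_filter(sal, sigma=sigma_px, mode=("constant", "wrap"))
    return sal / sal.max() if sal.max() > 0 else sal
```

Note that a fixed-width Gaussian on the equirectangular plane over-represents the poles, so the guidelines distributed with the dataset should be preferred when metric-grade saliency maps are needed.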

References

  1. Zoya Bylinskii, Tilke Judd, Aude Oliva, Antonio Torralba, and Frédo Durand. 2016. What do different evaluation metrics tell us about saliency models? arXiv preprint arXiv:1604.03605 (2016).
  2. Xavier Corbillon, Francesca De Simone, and Gwendal Simon. 2017. 360-Degree Video Head Movement Dataset. In Proceedings of the 8th ACM Multimedia Systems Conference (MMSys'17) (June 2017), 199–204.
  3. Xavier Corbillon, Gwendal Simon, Alisa Devlic, and Jacob Chakareski. 2017. Viewport-adaptive navigable 360-degree video delivery. In IEEE International Conference on Communications (2017).
  4. Yu Fang, Ryoichi Nakashima, Kazumichi Matsumiya, Ichiro Kuriki, and Satoshi Shioiri. 2015. Eye-head coordination for visual cognitive processing. PLoS ONE 10, 3 (2015), e0121035.
  5. Brian Hu, Ishmael Johnson-Bey, Mansi Sharma, and Ernst Niebur. 2017. Head movements during visual exploration of natural images in virtual reality. In 2017 51st Annual Conference on Information Sciences and Systems (CISS). 1–6.
  6. ITU-T. 2008. Recommendation ITU-T P.910: Subjective video quality assessment methods for multimedia applications. (April 2008).
  7. Halszka Jarodzka, Kenneth Holmqvist, and Marcus Nyström. 2010. A vector-based, multidimensional scanpath similarity measure. In Proceedings of the 2010 Symposium on Eye-Tracking Research & Applications (ETRA). ACM, 211–218.
  8. Eileen Kowler. 2011. Eye movements: The past 25 years. Vision Research 51, 13 (2011), 1457–1483.
  9. Benjamin J. Li, Jeremy N. Bailenson, Adam Pines, Walter J. Greenleaf, and Leanne M. Williams. 2017. A Public Database of Immersive VR Videos with Corresponding Ratings of Arousal, Valence, and Correlations between Head Movements and Self Report Measures. Frontiers in Psychology 8 (Dec. 2017).
  10. Wen-Chih Lo, Ching-Ling Fan, Jean Lee, Chun-Ying Huang, Kuan-Ta Chen, and Cheng-Hsin Hsu. 2017. 360° Video Viewing Dataset in Head-Mounted Virtual Reality. In Proceedings of the 8th ACM Multimedia Systems Conference (MMSys'17) (June 2017), 211–216.
  11. Gerd Marmitt and Andrew T. Duchowski. 2002. Modeling visual attention in VR: Measuring the accuracy of predicted scanpaths. In Eurographics 2002, Short Presentations. Saarbrücken, Germany, 217–226.
  12. Gerd Marmitt and Andrew T. Duchowski. 2002. Modeling visual attention in VR: Measuring the accuracy of predicted scanpaths. Ph.D. Dissertation. Clemson University.
  13. Margaret H. Pinson, Lark Kwon Choi, and Alan Conrad Bovik. 2014. Temporal Video Quality Model Accounting for Variable Frame Delay Distortions. IEEE Transactions on Broadcasting 60, 4 (Dec. 2014), 637–649.
  14. Yashas Rai, Jesús Gutiérrez, and Patrick Le Callet. 2017. A dataset of head and eye movements for 360 degree images. In Proceedings of the 8th ACM Multimedia Systems Conference (MMSys'17).
  15. Yashas Rai, Patrick Le Callet, and Gene Cheung. 2016. Quantifying the relation between perceived interest and visual salience during free viewing using trellis based optimization. In 2016 IEEE 12th Image, Video, and Multidimensional Signal Processing Workshop (IVMSP). IEEE, 1–5.
  16. Yashas Rai, Patrick Le Callet, and Philippe Guillotel. 2017. Which saliency weighting for omni directional image quality assessment? In 2017 Ninth International Conference on Quality of Multimedia Experience (QoMEX). IEEE, 1–6.
  17. Salient360. 2018. Special Issue. Signal Processing: Image Communication (2018). To appear.
  18. Dario D. Salvucci and Joseph H. Goldberg. 2000. Identifying fixations and saccades in eye-tracking protocols. In Proceedings of the 2000 Symposium on Eye Tracking Research & Applications (ETRA). ACM, 71–78.
  19. Ana Serrano, Vincent Sitzmann, Jaime Ruiz-Borau, Gordon Wetzstein, Diego Gutierrez, and Belen Masia. 2017. Movie editing and cognitive event segmentation in virtual reality video. ACM Transactions on Graphics 36, 4 (2017), 47.
  20. Vincent Sitzmann, Ana Serrano, Amy Pavel, Maneesh Agrawala, Diego Gutierrez, Belen Masia, and Gordon Wetzstein. 2018. Saliency in VR: How do people explore virtual environments? IEEE Transactions on Visualization and Computer Graphics (2018).
  21. Yu-Chuan Su, Dinesh Jayaraman, and Kristen Grauman. 2016. Pano2Vid: Automatic Cinematography for Watching 360° Videos. In Asian Conference on Computer Vision (ACCV) (Dec. 2016).
  22. Evgeniy Upenik and Touradj Ebrahimi. 2017. A simple method to obtain visual attention data in head mounted virtual reality. In 2017 IEEE International Conference on Multimedia & Expo Workshops (ICMEW). 73–78.
  23. Chenglei Wu, Zhihao Tan, Zhi Wang, and Shiqiang Yang. 2017. A Dataset for Exploring User Behaviors in VR Spherical Video Streaming. In Proceedings of the 8th ACM Multimedia Systems Conference (MMSys'17) (June 2017), 193–198.
  24. Matt Yu, Haricharan Lakshman, and Bernd Girod. 2015. A Framework to Evaluate Omnidirectional Video Coding Schemes. In 2015 IEEE International Symposium on Mixed and Augmented Reality (ISMAR). 31–36.

Published in

MMSys '18: Proceedings of the 9th ACM Multimedia Systems Conference
June 2018, 604 pages
ISBN: 9781450351928
DOI: 10.1145/3204949
General Chair: Pablo Cesar. Program Chairs: Michael Zink and Niall Murray.

Copyright © 2018 ACM

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher: Association for Computing Machinery, New York, NY, United States


Acceptance Rates

Overall acceptance rate: 176 of 530 submissions, 33%
