ABSTRACT
As social virtual reality (VR) becomes more popular, avatars are being designed with increasingly realistic behaviors, incorporating non-verbal cues such as eye contact. However, perceiving eye contact during a conversation can be challenging for people with visual impairments. VR offers an opportunity to present eye contact cues in alternative ways that make them perceivable for people with visual impairments. We conducted an exploratory study to gain initial insights into designing such cues, beginning with a focus group to develop a deeper understanding of the topic. We then implemented eye contact cues in the visual, auditory, and tactile sensory modalities in VR, tested these approaches with eleven participants with visual impairments, and collected qualitative feedback. The results show that visual cues indicating gaze direction were preferred, but auditory and tactile cues were also valued because they do not superimpose additional visual information.