Abstract
Tactile patterns are a means to convey navigation instructions to pedestrians and are especially helpful for people with visual impairments. This article presents a concept for providing precise micro-navigation instructions through a tactile around-the-head display. Our system presents four tactile patterns for fundamental navigation instructions in conjunction with continuous directional guidance. We followed an iterative, user-centered approach to design the patterns for the fundamental navigation instructions, combined them with a continuous directional guidance stimulus, and tested our system with 13 sighted (blindfolded) and 2 blind participants in an obstacle course that included stairs. We optimized the patterns and validated the final prototype with another five blind participants in a follow-up study. The system steered our participants successfully, with a 5.7 cm average absolute deviation from the optimal path. Our guidance is only slightly less precise than the natural shoulder sway during normal walking and an order of magnitude more precise than previous tactile navigation systems. Our system enables various new micro-navigation use cases for people with visual impairments, e.g., preventing collisions on a sidewalk or serving as an anti-veering tool. It also has applications in other areas, such as for personnel working in low-vision environments (e.g., firefighters).
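To make the notion of "continuous directional guidance" concrete, the sketch below shows one plausible way such guidance could be computed: the signed deviation between the walker's current heading and the target bearing is mapped to the nearest tactor in a ring of actuators around the head, with vibration intensity growing with the deviation. This is a hypothetical illustration under assumed parameters (eight tactors, intensity saturating at 90°), not the algorithm described in the article.

```python
def tactor_activation(heading_deg, target_deg, n_tactors=8):
    """Hypothetical mapping from heading deviation to a head-worn tactor.

    Returns (tactor_index, intensity), where tactor 0 points straight
    ahead, indices increase clockwise, and intensity lies in [0, 1].
    """
    # Signed deviation in (-180, 180]: positive means the target is to the right.
    error = (target_deg - heading_deg + 180.0) % 360.0 - 180.0
    # Pick the tactor whose direction is closest to the target bearing.
    sector = 360.0 / n_tactors
    index = round(error / sector) % n_tactors
    # Scale intensity with the deviation, saturating at 90 degrees (assumed).
    intensity = min(abs(error) / 90.0, 1.0)
    return index, intensity
```

With eight tactors, a 45° deviation to the right activates tactor 1 at half intensity, while walking exactly on the bearing leaves the frontal tactor silent; an actual system would additionally smooth the signal over time to avoid jitter.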