
Around-the-Head Tactile System for Supporting Micro Navigation of People with Visual Impairments

Published: 23 July 2021

Abstract

Tactile patterns are a means to convey navigation instructions to pedestrians and are especially helpful for people with visual impairments. This article presents a concept for providing precise micro-navigation instructions through a tactile around-the-head display. Our system presents four tactile patterns for fundamental navigation instructions in conjunction with continuous directional guidance. We followed an iterative, user-centered approach to design the patterns for the fundamental navigation instructions, combined them with a continuous directional guidance stimulus, and tested our system with 13 sighted (blindfolded) and 2 blind participants in an obstacle course that included stairs. We optimized the patterns and validated the final prototype with another five blind participants in a follow-up study. The system steered our participants successfully, with an average absolute deviation of 5.7 cm from the optimal path. Our guidance is only slightly less precise than the natural shoulder sway during normal walking and an order of magnitude more precise than previous tactile navigation systems. Our system enables various new micro-navigation use cases for people with visual impairments, such as preventing collisions on a sidewalk or counteracting veering. It also has applications in other areas, such as personnel working in low-vision environments (e.g., firefighters).
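The continuous directional guidance described above can be illustrated with a minimal sketch. It assumes a hypothetical ring of eight vibration motors around the head and a simple proportional control scheme; the actual actuator count, layout, and stimulus design used in the article are not specified here, so all names and parameters (`NUM_MOTORS`, `select_motor`, `intensity`, the dead-zone and ramp values) are illustrative assumptions, not the authors' implementation.

```python
NUM_MOTORS = 8  # hypothetical ring of vibration motors around the head


def select_motor(target_bearing_deg, head_yaw_deg, num_motors=NUM_MOTORS):
    """Map the direction toward the next waypoint, relative to the
    wearer's current head yaw, onto the nearest motor in the ring.
    Motor 0 sits at the forehead; indices increase clockwise."""
    relative = (target_bearing_deg - head_yaw_deg) % 360.0
    sector = 360.0 / num_motors
    return int((relative + sector / 2) // sector) % num_motors


def intensity(relative_deg, dead_zone_deg=5.0, max_deg=90.0):
    """Illustrative proportional drive level: silent inside a small
    dead zone, ramping to full strength at max_deg of heading error."""
    err = abs((relative_deg + 180.0) % 360.0 - 180.0)  # fold into [0, 180]
    if err <= dead_zone_deg:
        return 0.0
    return min(1.0, (err - dead_zone_deg) / (max_deg - dead_zone_deg))
```

For example, a waypoint 90° to the wearer's right activates the motor above the right ear at full strength, while a small heading error inside the dead zone keeps all motors silent, avoiding constant buzzing while walking roughly on course.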


          • Published in

ACM Transactions on Computer-Human Interaction, Volume 28, Issue 4, August 2021, 297 pages
ISSN: 1073-0516
EISSN: 1557-7325
DOI: 10.1145/3477419

            Copyright © 2021 ACM

            Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

            Publisher

            Association for Computing Machinery

            New York, NY, United States

Publication History

• Received: 1 July 2020
• Revised: 1 March 2021
• Accepted: 1 March 2021
• Published: 23 July 2021

            Qualifiers

            • research-article
            • Research
            • Refereed
