DOI: 10.1145/2493988.2494335

Research article

Ultrasound-based movement sensing, gesture-, and context-recognition

Published: 08 September 2013

ABSTRACT

We propose an activity and context recognition method in which the user wears a neck-worn receiver comprising a microphone and carries small speakers on the wrists that emit ultrasound. The system recognizes gestures on the basis of the volume of the received sound and the Doppler effect: the former indicates the distance between the neck and the wrists, and the latter indicates the speed of the motions. Our approach thus replaces the wired or wireless communication typically required in body-area motion-sensing networks with ultrasound. The system also recognizes the room the user is in and the people near the user from ID signals emitted by speakers placed in rooms and worn by people. A strength of the approach is that, for offline recognition, a simple audio recorder can serve as the receiver. We evaluate the approach in one scenario covering nine gestures/activities with 10 users. When there was no environmental sound generated by other people, the average recognition rate was 87%. When there was environmental sound generated by other people, we compared the proposed ultrasound-based recognition, which uses only ultrasound features, against a standard approach that uses both ultrasound and environmental-sound features: the proposed approach achieved 65%, versus 57% for the standard approach.
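To make the two feature ideas concrete, here is a minimal sketch (not the authors' implementation) of how one audio frame from the neck-worn microphone could yield both signals: band energy near the carrier as a volume/distance proxy, and the Doppler shift of the strongest in-band peak as a radial-speed estimate. The carrier frequency (20 kHz), sample rate, search bandwidth, and the function name `ultrasound_features` are all assumptions for illustration.

```python
import numpy as np

FS = 44100          # sample rate of the neck-worn recorder (assumed)
F_TX = 20000.0      # ultrasound tone emitted by a wrist speaker (assumed)
C = 343.0           # speed of sound in air, m/s

def ultrasound_features(frame: np.ndarray) -> tuple[float, float]:
    """Return (band_energy, radial_speed_m_s) for one audio frame.

    band_energy rises as the wrist nears the neck microphone
    (volume as a distance proxy); radial speed follows the Doppler
    relation f_rx ~= f_tx * (1 + v/c), i.e. v = c * (f_rx - f_tx) / f_tx.
    """
    windowed = frame * np.hanning(len(frame))
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / FS)

    # Look only near the carrier; +-500 Hz covers plausible arm speeds.
    band = (freqs > F_TX - 500) & (freqs < F_TX + 500)
    band_energy = float(np.sum(spectrum[band] ** 2))

    # Doppler shift: offset of the strongest in-band peak from the carrier.
    f_rx = freqs[band][np.argmax(spectrum[band])]
    radial_speed = C * (f_rx - F_TX) / F_TX
    return band_energy, radial_speed
```

Sequences of these per-frame pairs over a gesture window could then feed any standard classifier; the abstract does not name the classifier used, so that stage is left open here.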

