DOI: 10.1145/3372224.3418161
poster

What you wear know how you feel: an emotion inference system with multi-modal wearable devices

Published: 18 September 2020

ABSTRACT

Emotions have a significant impact on human health. Automatic emotion recognition is useful for monitoring psychological and mental disorders and for exploring behavioral mechanisms. Existing approaches rely on costly, bulky specialized hardware such as EEG/ECG headsets, pose privacy risks, or suffer from low accuracy and poor user experience. With the increasing popularity of wearables, people tend to carry multiple smart devices, which creates an opportunity for emotion perception. In this paper, we present MW-Emotion, a pervasive and portable system that recognizes common emotional states with multi-modal wearable devices. The challenge is that ubiquitous wearable devices capture only shallow information that is not obviously related to human emotions; MW-Emotion addresses this by learning the intrinsic mapping between emotions and the sensing data. Our experiments show that MW-Emotion recognizes different emotional states with a relatively high accuracy of 83.1%.
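
The abstract does not detail MW-Emotion's pipeline, so the sketch below is only an illustration of the general idea of mapping multi-modal wearable sensing data to emotion classes: per-window statistics are extracted from each (synthetic, placeholder) sensor stream, fused by concatenation, and fed to an off-the-shelf classifier. The signal names, window size, feature choices, and the random-forest classifier are all assumptions for illustration, not the authors' method.

    # Illustrative sketch only: generic multi-modal fusion for emotion
    # classification from wearable sensor features. Everything below
    # (signals, features, classifier) is a placeholder assumption.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    def window_features(signal, fs, win_s=5.0):
        # Split a 1-D sensor stream into fixed-length windows and compute
        # simple time-domain statistics per window.
        win = int(fs * win_s)
        n = len(signal) // win
        windows = signal[: n * win].reshape(n, win)
        return np.column_stack([
            windows.mean(axis=1),
            windows.std(axis=1),
            windows.min(axis=1),
            windows.max(axis=1),
        ])

    # Synthetic stand-ins for two hypothetical wearable streams sampled at
    # 50 Hz, e.g. wrist accelerometer magnitude and a PPG channel.
    rng = np.random.default_rng(0)
    fs = 50
    acc = rng.normal(size=fs * 600)
    ppg = rng.normal(size=fs * 600)

    # Early fusion: concatenate per-window features from each modality.
    X = np.hstack([window_features(acc, fs), window_features(ppg, fs)])
    y = rng.integers(0, 4, size=len(X))  # 4 placeholder emotion classes

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
    print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))

On real data, early fusion of per-modality features is only one option; the labels here are random, so the printed accuracy is meaningless and serves only to show the pipeline runs end to end.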

Published in

MobiCom '20: Proceedings of the 26th Annual International Conference on Mobile Computing and Networking
April 2020, 621 pages
ISBN: 9781450370851
DOI: 10.1145/3372224

      Copyright © 2020 Owner/Author

      Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      • Published: 18 September 2020

      Qualifiers

      • poster

      Acceptance Rates

Overall acceptance rate: 440 of 2,972 submissions, 15%
