Emotions on the Go: Mobile Emotion Assessment in Real-Time using Facial Expressions

ABSTRACT
Exploiting emotions for user interface evaluation has become an increasingly important research objective in Human-Computer Interaction. Emotions are usually assessed through surveys, which do not allow information to be collected in real time. In our work, we propose using smartphones for mobile emotion assessment, employing the front-facing camera as a tool for emotion detection based on facial expressions. Such information can be used to reflect on emotional states or to provide emotion-aware user interface adaptation. We collected facial expressions along with app usage data in a two-week field study consisting of a one-week training phase and a one-week testing phase. We built and evaluated a person-dependent classifier, yielding an average classification improvement of 33% compared to classifying facial expressions alone. Furthermore, we correlate the estimated emotions with concurrent app usage to gain insights into changes in mood. Our work is complemented by a discussion of the feasibility of probing emotions on the go and potential use cases for future emotion-aware applications.
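To illustrate the kind of person-dependent pipeline the abstract describes — a classifier trained on one user's data during the training week and applied to the same user's data in the testing week — the sketch below combines facial-expression scores with app-usage features in a minimal nearest-centroid classifier. This is an illustrative sketch only: the feature names (smile, brow-lowering, social-app share), the emotion labels, and the nearest-centroid method are assumptions for demonstration, not the study's actual feature set or model.

```python
import math

def centroid(vectors):
    """Component-wise mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def euclidean(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

class PersonDependentClassifier:
    """Nearest-centroid classifier fit on a single user's training-week
    samples, then applied to that same user's testing-week samples."""

    def fit(self, samples):
        # samples: list of (feature_vector, emotion_label) pairs
        by_label = {}
        for vec, label in samples:
            by_label.setdefault(label, []).append(vec)
        self.centroids = {lab: centroid(vs) for lab, vs in by_label.items()}
        return self

    def predict(self, vec):
        # Assign the label of the closest per-emotion centroid.
        return min(self.centroids,
                   key=lambda lab: euclidean(vec, self.centroids[lab]))

# Toy feature vectors (hypothetical): [smile score, brow-lowering score,
# share of time spent in social apps during the same window].
train = [
    ([0.9, 0.1, 0.7], "happy"),
    ([0.8, 0.2, 0.6], "happy"),
    ([0.1, 0.8, 0.2], "stressed"),
    ([0.2, 0.9, 0.1], "stressed"),
]
clf = PersonDependentClassifier().fit(train)
print(clf.predict([0.85, 0.15, 0.65]))  # → happy
```

In practice, the facial-expression scores would come from a toolkit such as OpenFace or the AFFDEX SDK (both cited in the references), and training one model per user is what makes the classifier person-dependent rather than person-independent.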