Comparing the Performance of Facial Emotion Recognition Systems on Real-Life Videos: Gender, Ethnicity and Age

  • Conference paper

In: Proceedings of the Future Technologies Conference (FTC) 2021, Volume 1 (FTC 2021)

Part of the book series: Lecture Notes in Networks and Systems (LNNS, volume 358)

Abstract

Dealing with non-verbal communication will be a key breakthrough for future technologies, since much of the effort of 21st-century technology has gone into handling numbers and verbal communication. The automatic recognition of facial expressions is of theoretical and commercial interest, and to this end there must exist video databases that incorporate the idiosyncrasies of human existence – ethnicity, gender and age. We compare the performance of three major emotion recognition software systems on real-life videos of politicians from across the world. Our sample of 45 videos (total length 2 h 26 min, comprising 219,150 frames) consists of male and female politicians ranging in age from 40 to 78, with well-defined differences related to gender and nationality/ethnicity. Our sample of images is partially posed and partially spontaneous – the demeanour of politicians when they engage in speech making. Our target systems, Microsoft Azure Cognitive Services Face API, Affectiva AFFDEX and Emotient FACET, have usually been trained on posed expressions, with limited testing on spontaneous images, so in effect we are operating at the edge of the performance of these systems. There are similarities in the performance of these systems on some emotions, especially joy, but there are differences on others, such as anger. There are also differences based on gender, age and race. This is an important issue: as more and more video data becomes available, video analytics that can deal with aspects of cognition, like emotion, accurately and across cultural/gender/ethnic divides will be a major component of future technologies.

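For readers who want to reproduce this kind of comparison, the sketch below shows one way to aggregate per-frame emotion scores from several recognition systems and compare their means across demographic groups. It is a minimal illustration, not the authors' pipeline: the folder names (`azure_face`, `affdex`, `facet`), the per-frame CSV columns (`frame`, `joy`, `anger`, …) and the `metadata.csv` file mapping `video_id` to `gender`, `age` and `ethnicity` are all assumptions made for the example.

```python
# Minimal sketch under assumed export layout (not the paper's actual pipeline):
# each system writes one CSV per video with per-frame emotion scores in [0, 1],
# and metadata.csv maps video_id -> gender, age, ethnicity.
from pathlib import Path
import pandas as pd

SYSTEMS = ["azure_face", "affdex", "facet"]   # hypothetical export folders
EMOTIONS = ["joy", "anger", "sadness", "fear", "surprise", "disgust"]

def load_scores(root: Path) -> pd.DataFrame:
    """Stack per-frame scores from every system and video into one long table."""
    rows = []
    for system in SYSTEMS:
        for csv_path in (root / system).glob("*.csv"):
            df = pd.read_csv(csv_path)          # assumed columns: frame, joy, anger, ...
            df["system"] = system
            df["video_id"] = csv_path.stem      # file name doubles as the video id
            rows.append(df)
    return pd.concat(rows, ignore_index=True)

def compare_by_group(scores: pd.DataFrame, meta: pd.DataFrame, group: str) -> pd.DataFrame:
    """Mean emotion score per system within each demographic group (e.g. gender)."""
    merged = scores.merge(meta, on="video_id")
    return merged.groupby(["system", group])[EMOTIONS].mean()

if __name__ == "__main__":
    root = Path("exports")                      # hypothetical directory layout
    scores = load_scores(root)
    meta = pd.read_csv(root / "metadata.csv")   # video_id, gender, age, ethnicity
    print(compare_by_group(scores, meta, "gender"))
```

Pairwise correlations of per-frame scores between systems are one straightforward way to quantify how closely they agree on a given emotion, which is the kind of agreement and disagreement (e.g. on joy versus anger) the abstract describes.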

Acknowledgment

Shirui Wang is grateful for the support of the China Scholarship Council (CSC) and to Trinity College Dublin, The University of Dublin, for hosting her stay in Dublin, Ireland.

Author information

Corresponding author

Correspondence to Khurshid Ahmad.

Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Ahmad, K., Wang, S., Vogel, C., Jain, P., O’Neill, O., Sufi, B.H. (2022). Comparing the Performance of Facial Emotion Recognition Systems on Real-Life Videos: Gender, Ethnicity and Age. In: Arai, K. (eds) Proceedings of the Future Technologies Conference (FTC) 2021, Volume 1. FTC 2021. Lecture Notes in Networks and Systems, vol 358. Springer, Cham. https://doi.org/10.1007/978-3-030-89906-6_14
