
Evaluating the Emotional Valence of Affective Sounds for Child-Robot Interaction

  • Conference paper
Social Robotics (ICSR 2019)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 11876)

Abstract

Socially Assistive Robots are increasingly used in paediatric health-care environments. In this domain, the development of effective strategies to keep children engaged during interaction with a social robot is still an open research area. On this subject, some approaches investigate combining distraction strategies, as used in human-human interaction, with the display of emotional behaviours. In this study, we present the results of a pilot study that evaluated, with children, the valence of emotional behaviours enhanced with non-verbal sounds. The objective is to endow the NAO robot with emotion-like sounds, selected from a set of para-linguistic behaviours validated for valence. Results show that children aged 3–8 years perceive the robot’s behaviours and the related semantic-free emotional sounds in terms of different degrees of arousal, valence and dominance: while valence and dominance are clearly perceived by the children, arousal is more difficult to distinguish.
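The evaluation described above rates each behaviour along the three pleasure–arousal–dominance (PAD) dimensions. As an illustrative sketch only (the ratings, scale, and participant counts below are invented, not the paper's data), aggregating such per-child SAM-style ratings per dimension might look like:

```python
# Hypothetical aggregation of SAM-style child ratings of one robot
# behaviour along the valence / arousal / dominance dimensions.
# All numbers here are made-up placeholders, not the study's data.
from statistics import mean, stdev

# Ratings on an assumed 1-5 pictorial SAM scale, one score per child.
ratings = {
    "valence":   [5, 4, 5, 4, 5, 4],
    "arousal":   [3, 2, 4, 3, 2, 4],  # wider spread: harder to distinguish
    "dominance": [4, 5, 4, 4, 5, 4],
}

# A higher standard deviation on a dimension suggests children agreed
# less on it, mirroring the paper's observation about arousal.
for dimension, scores in ratings.items():
    print(f"{dimension}: mean={mean(scores):.2f}, sd={stdev(scores):.2f}")
```
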


Notes

  1. The selected para-linguistic sounds are available for download at http://prisca.unina.it/demo/EmotionSounds.rar.


Author information


Corresponding author

Correspondence to Silvia Rossi.


Copyright information

© 2019 Springer Nature Switzerland AG

About this paper


Cite this paper

Rossi, S., Dell’Aquila, E., Bucci, B. (2019). Evaluating the Emotional Valence of Affective Sounds for Child-Robot Interaction. In: Salichs, M., et al. Social Robotics. ICSR 2019. Lecture Notes in Computer Science, vol 11876. Springer, Cham. https://doi.org/10.1007/978-3-030-35888-4_47


  • DOI: https://doi.org/10.1007/978-3-030-35888-4_47


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-35887-7

  • Online ISBN: 978-3-030-35888-4

  • eBook Packages: Computer Science (R0)
