Abstract
Socially Assistive Robots are starting to be widely used in paediatric healthcare environments. In this domain, the development of effective strategies to keep children engaged during interaction with a social robot is still an open research area. On this subject, some approaches investigate combining distraction strategies, as used in human-human interaction, with the display of emotional behaviours. In this study, we present the results of a pilot study aimed at evaluating with children the valence of emotional behaviours enhanced with non-verbal sounds. The objective is to endow the NAO robot with emotion-like sounds, selected from a set of para-linguistic behaviours validated for valence. Results show that children aged 3–8 years perceive the robot's behaviours and the associated semantic-free emotional sounds in terms of different degrees of arousal, valence and dominance: while valence and dominance are clearly perceived by the children, arousal is more difficult to distinguish.
Notes
1. The selected para-linguistic sounds are available for download at http://prisca.unina.it/demo/EmotionSounds.rar.
Copyright information
© 2019 Springer Nature Switzerland AG
Cite this paper
Rossi, S., Dell’Aquila, E., Bucci, B. (2019). Evaluating the Emotional Valence of Affective Sounds for Child-Robot Interaction. In: Salichs, M., et al. (eds.) Social Robotics. ICSR 2019. Lecture Notes in Computer Science, vol. 11876. Springer, Cham. https://doi.org/10.1007/978-3-030-35888-4_47
Print ISBN: 978-3-030-35887-7
Online ISBN: 978-3-030-35888-4