It’s in the Eyes: The Engaging Role of Eye Contact in HRI

Abstract

This paper reports a study examining how users evaluated a humanoid robot depending on whether it established eye contact with them. In two experiments, the robot was programmed either to establish eye contact with the user or to look elsewhere. Across the experiments, we varied how predictive the robot’s gaze direction was of the location of a subsequent target stimulus (in Exp. 1 the gaze direction was non-predictive; in Exp. 2 it was counter-predictive). Subjective reports showed that participants were sensitive to eye contact: they felt more engaged with the robot when it established eye contact, and the majority attributed a higher degree of human-likeness to the robot in the eye-contact condition than in the no-eye-contact condition. These effects were independent of the predictiveness of the gaze cue. Our results suggest that eye contact established by an embodied humanoid robot has a positive impact on the robot’s perceived socialness and on the quality of human–robot interaction (HRI). Establishing eye contact should therefore be considered when designing robot behaviors for social HRI.
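To make the predictiveness manipulation concrete, the sketch below generates trial lists for a gaze-cueing paradigm. It is a minimal illustration, not the authors' experimental code: the trial count and the 25% cue validity used for the counter-predictive condition are hypothetical placeholders, since the abstract does not report the exact proportions.

    import random

    def make_trials(n_trials, cue_validity):
        """Build a list of gaze-cueing trials.

        cue_validity = P(target appears at the gazed-at location):
          0.5  -> gaze is non-predictive of target location (as in Exp. 1)
          <0.5 -> gaze is counter-predictive (as in Exp. 2)
        """
        trials = []
        for _ in range(n_trials):
            gaze = random.choice(["left", "right"])
            valid = random.random() < cue_validity
            target = gaze if valid else ("right" if gaze == "left" else "left")
            trials.append({"gaze": gaze, "target": target, "valid": valid})
        random.shuffle(trials)
        return trials

    # Hypothetical parameters, for illustration only.
    exp1_trials = make_trials(n_trials=120, cue_validity=0.5)   # non-predictive
    exp2_trials = make_trials(n_trials=120, cue_validity=0.25)  # counter-predictive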


Notes

  1. https://github.com/robotology/iCub-main/tree/master/src/modules/iKinGazeCtrl.

  2. https://github.com/robotology/human-sensing.

  3. http://dlib.net.

  4. https://sourceforge.net/p/dclib/wiki/Known_users/.
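
The footnoted tools cover the two halves of the gaze pipeline: dlib (notes 2–4) for detecting the user's face, and iKinGazeCtrl (note 1) for driving the iCub's gaze. As a minimal sketch of the kind of face tracking these tools enable, the Python snippet below locates the midpoint between a user's eyes using dlib's standard 68-landmark model; it is an illustration of dlib's public API, not the pipeline used in the study.

    import dlib

    # Standard dlib components: a HOG-based frontal face detector and the
    # 68-point facial landmark predictor (model file available via dlib.net).
    detector = dlib.get_frontal_face_detector()
    predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

    def eye_midpoint(image):
        """Return the (x, y) midpoint between the user's eyes, or None.

        In the 68-point model, indices 36-47 outline the two eye regions;
        averaging them gives a rough fixation target in image coordinates
        that a gaze controller (e.g. iKinGazeCtrl, note 1) could then track.
        """
        faces = detector(image, 1)  # upsample once to help with small faces
        if not faces:
            return None
        shape = predictor(image, faces[0])
        pts = [shape.part(i) for i in range(36, 48)]
        x = sum(p.x for p in pts) / len(pts)
        y = sum(p.y for p in pts) / len(pts)
        return x, y

    # Example usage on a single saved frame:
    # frame = dlib.load_rgb_image("frame.png")
    # print(eye_midpoint(frame))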


Acknowledgements

This project has received funding from the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation program (grant awarded to A. Wykowska, titled “InStance: Intentional Stance for Social Attunement”, grant agreement No. 715058).

Author information

Corresponding author

Correspondence to Kyveli Kompatsiari.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary material 1 (MP4 8460 kb)

About this article

Cite this article

Kompatsiari, K., Ciardo, F., Tikhanoff, V. et al. It’s in the Eyes: The Engaging Role of Eye Contact in HRI. Int J of Soc Robotics 13, 525–535 (2021). https://doi.org/10.1007/s12369-019-00565-4
