ABSTRACT
MACH--My Automated Conversation coacH--is a novel system that provides ubiquitous access to social skills training. The system includes a virtual agent that reads facial expressions, speech, and prosody, and responds with verbal and nonverbal behaviors in real time. This paper presents an application of MACH in the context of training for job interviews. During training, MACH asks interview questions, automatically mirrors certain behaviors of the user, and exhibits appropriate nonverbal behaviors. Following the interaction, MACH provides visual feedback on the user's performance. The development of this application draws on data from 28 interview sessions involving employment-seeking students and career counselors. The effectiveness of MACH was assessed through a weeklong trial with 90 MIT undergraduates. Students who interacted with MACH were rated by human experts as having improved in overall interview performance, while the ratings of students in the control groups did not improve. Post-experiment interviews indicate that participants found the interview experience informative about their own behaviors and expressed interest in using MACH in the future.