Impression detection and management using an embodied conversational agent

Files
HCII2020.pdf (2.63 MB)
Accepted version
Date
2020-07-10
Authors
Wang, Chen
Biancardi, Beatrice
Mancini, Maurizio
Cafaro, Angelo
Pelachaud, Catherine
Pun, Thierry
Chanel, Guillaume
Publisher
Springer
Abstract
Embodied Conversational Agents (ECAs) are a promising medium for human-computer interaction, since they are capable of engaging users in real-time face-to-face interaction [1, 2]. The impressions users form of an ECA (e.g. favour or dislike) can be reflected in their behaviour [3, 4]. These impressions may affect the interaction and could even persist afterwards [5, 7]. Thus, when building an ECA intended to impress users, it is important to detect how users feel about the ECA. The impression the ECA leaves can then be adjusted by controlling its non-verbal behaviour [7]. Motivated by the role of ECAs in interpersonal interaction and the state of the art in affect recognition, we investigated three research questions: 1) which modality (facial expressions, eye movements, or physiological signals) reveals most about the formed impressions; 2) whether an ECA can leave a better impression by adapting its behaviour to maximize the impression it produces; 3) whether impression formation differs between human-human and human-agent interaction. First, our results showed the value of using different modalities to detect impressions: an ANOVA indicated that the facial expression modality outperformed the physiological modality (M = 1.27, p = 0.02). Second, our results demonstrated the feasibility of an adaptive ECA: compared with randomly selected ECA behaviour, participants' ratings tended to be higher when the ECA adapted its behaviour based on the detected impressions. Third, we found similar behaviour during human-human and human-agent interaction: people treated the ECA much like a human, spending more time observing the face area when forming an impression.
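To make the second finding concrete, the sketch below illustrates one simple way an agent could adapt its non-verbal behaviour to detected impressions, using an epsilon-greedy loop in which the detected impression serves as the reward for the behaviour just displayed. This is a minimal illustration only, not the authors' actual reinforcement-learning setup; the behaviour labels and the simulated impression detector are hypothetical placeholders for the paper's real models.

```python
import random

# Hypothetical, simplified set of non-verbal behaviour profiles the ECA can display
# (e.g. combinations of smiling, gestures, gaze); the labels are placeholders.
BEHAVIOURS = ["low_warmth", "medium_warmth", "high_warmth"]


def detect_impression(behaviour):
    """Stand-in for an impression detector.

    The system described in the paper infers the user's impression from facial
    expressions, eye movements and physiological signals; here we just simulate
    a noisy scalar score in [0, 1] so the sketch runs end to end.
    """
    base = {"low_warmth": 0.3, "medium_warmth": 0.5, "high_warmth": 0.7}[behaviour]
    return min(1.0, max(0.0, random.gauss(base, 0.1)))


def run_adaptive_agent(n_turns=50, epsilon=0.2):
    """Epsilon-greedy behaviour selection: after each turn, the detected
    impression is treated as the reward for the behaviour that was shown."""
    totals = {b: 0.0 for b in BEHAVIOURS}  # cumulative reward per behaviour
    counts = {b: 0 for b in BEHAVIOURS}    # how often each behaviour was shown

    for _ in range(n_turns):
        # Explore occasionally; otherwise exploit the best-rated behaviour so far.
        if random.random() < epsilon or 0 in counts.values():
            behaviour = random.choice(BEHAVIOURS)
        else:
            behaviour = max(BEHAVIOURS, key=lambda b: totals[b] / counts[b])

        reward = detect_impression(behaviour)
        totals[behaviour] += reward
        counts[behaviour] += 1

    return max(BEHAVIOURS, key=lambda b: totals[b] / max(counts[b], 1))


if __name__ == "__main__":
    print("behaviour selected after adaptation:", run_adaptive_agent())
```

In this toy loop the adaptive condition converges on the behaviour that elicits the highest simulated impression scores, which mirrors the paper's comparison between adapted and randomly selected behaviour.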
Keywords
Affective computing, Eye gaze, Impression detection, Impression management, Machine learning, Reinforcement learning, Virtual agent
Citation
Wang, C., Biancardi, B., Mancini, M., Cafaro, A., Pelachaud, C., Pun, T. and Chanel, G. (2020), 'Impression Detection and Management Using an Embodied Conversational Agent'. Human-Computer Interaction. Multimodal and Natural Interaction, Lecture Notes in Computer Science book series, LNCS, vol. 12182, pp. 260-278. doi: 10.1007/978-3-030-49062-1_18
Copyright
© Springer Nature Switzerland AG 2020. This is a post-peer-review, pre-copyedit version of an article published in Lecture Notes in Computer Science. The final authenticated version is available online at: http://dx.doi.org/10.1007/978-3-030-49062-1_18