Emotion Synchronization Method for Robot Facial Expression

Yushun Kajihara, Sripian Peeraya, Chen Feng, Midori Sugaya

Research output: Chapter in Book/Report/Conference proceeding (Conference contribution)

Abstract

Communication robots are becoming popular, as they are actively used both commercially and personally. Increasing empathy between a human and a robot can effectively enhance the positive impression the robot makes. Empathy can be created by synchronizing the robot's expression with the human's emotion. Emotion can be estimated by analyzing controllable expressions, such as facial expressions, or uncontrollable expressions, such as biological signals. In this work, we compare methods for synchronizing a robot's expression with emotion estimated from either facial expressions or biological signals. To determine which of the proposed methods yields the best impression, subjective impression ratings were collected in an experiment. The impression evaluation showed that synchronization based on the periodical emotion value performed best and was well suited to emotion estimated from both facial expressions and biological signals.
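The "periodical emotion value" synchronization described above can be illustrated with a short sketch. This is not the authors' implementation; the valence/arousal representation, the quadrant-based expression mapping, and all names (`EmotionEstimate`, `classify_expression`, `synchronize`) are illustrative assumptions: emotion estimates arrive as a stream (from facial expression or biological signals), are averaged over a fixed period, and one robot expression is selected per period.

```python
# Hypothetical sketch of periodic emotion-to-expression synchronization.
# Assumption: emotion is represented as valence/arousal values in [-1, 1],
# as in Russell's circumplex model; thresholds and labels are illustrative.

from dataclasses import dataclass


@dataclass
class EmotionEstimate:
    valence: float  # -1 (negative) .. 1 (positive)
    arousal: float  # -1 (calm) .. 1 (excited)


def classify_expression(e: EmotionEstimate) -> str:
    """Map a valence/arousal estimate to a discrete robot facial expression
    by circumplex quadrant (an assumed mapping, not the paper's)."""
    if e.valence >= 0 and e.arousal >= 0:
        return "happy"
    if e.valence < 0 and e.arousal >= 0:
        return "angry"
    if e.valence < 0 and e.arousal < 0:
        return "sad"
    return "relaxed"


def synchronize(samples: list[EmotionEstimate], period: int = 5) -> list[str]:
    """Periodic synchronization: average the emotion estimates over each
    fixed-length window, then set one expression per window."""
    expressions = []
    for i in range(0, len(samples), period):
        window = samples[i:i + period]
        mean = EmotionEstimate(
            valence=sum(s.valence for s in window) / len(window),
            arousal=sum(s.arousal for s in window) / len(window),
        )
        expressions.append(classify_expression(mean))
    return expressions
```

Averaging over a window rather than reacting to every sample smooths out noisy instantaneous estimates, which is one plausible reason a periodical value could yield a better impression than frame-by-frame mirroring.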

Original language: English
Title of host publication: Human-Computer Interaction. Multimodal and Natural Interaction - Thematic Area, HCI 2020, Held as Part of the 22nd International Conference, HCII 2020, Proceedings
Editors: Masaaki Kurosu
Publisher: Springer
Pages: 644-653
Number of pages: 10
ISBN (Print): 9783030490614
DOI: https://doi.org/10.1007/978-3-030-49062-1_44
Publication status: Published - 2020
Event: Thematic Area on Human Computer Interaction, HCI 2020, held as part of the 22nd International Conference on Human-Computer Interaction, HCII 2020 - Copenhagen, Denmark
Duration: 2020 Jul 19 to 2020 Jul 24

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 12182 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: Thematic Area on Human Computer Interaction, HCI 2020, held as part of the 22nd International Conference on Human-Computer Interaction, HCII 2020
Country: Denmark
City: Copenhagen
Period: 20/7/19 to 20/7/24

Keywords

  • Emotion estimation
  • Empathy
  • Robot facial expression

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computer Science (all)


Cite this

Kajihara, Y., Peeraya, S., Feng, C., & Sugaya, M. (2020). Emotion Synchronization Method for Robot Facial Expression. In M. Kurosu (Ed.), Human-Computer Interaction. Multimodal and Natural Interaction - Thematic Area, HCI 2020, Held as Part of the 22nd International Conference, HCII 2020, Proceedings (pp. 644-653). (Lecture Notes in Computer Science, Vol. 12182 LNCS). Springer. https://doi.org/10.1007/978-3-030-49062-1_44