A Preliminary Experiment on the Estimation of Emotion Using Facial Expression and Biological Signals

Yuya Kurono, Sripian Peeraya, Feng Chen, Midori Sugaya

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Imagine the day when a robot comforts you when you feel sad. In the fields of artificial intelligence and robot engineering, there is much research on the automatic classification of human emotion to enhance human-robot communication, especially for therapy. Generally, estimating a person's emotion relies on externally expressed information, such as facial expression, gaze direction, and behavior, which a robot can observe through a camera or similar sensors. However, some information remains invisible: it cannot be expressed, or it is deliberately suppressed. In such cases, it is difficult to estimate emotion even with sophisticated analysis technologies. The main idea of this research is to compare emotions classified from two different sources: controllable and uncontrollable expression. Preliminary experiments with our proposed method suggest that emotion classification from biological signals outperforms classification from facial expression.

Original language: English
Title of host publication: Human-Computer Interaction. Recognition and Interaction Technologies - Thematic Area, HCI 2019, Held as Part of the 21st HCI International Conference, HCII 2019, Proceedings
Editors: Masaaki Kurosu
Publisher: Springer Verlag
Pages: 133-142
Number of pages: 10
ISBN (Print): 9783030226428
DOI: 10.1007/978-3-030-22643-5_10
Publication status: Published - 2019 Jan 1
Event: Thematic Area on Human Computer Interaction, HCI 2019, held as part of the 21st International Conference on Human-Computer Interaction, HCI International 2019 - Orlando, United States
Duration: 2019 Jul 26 – 2019 Jul 31

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 11567 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: Thematic Area on Human Computer Interaction, HCI 2019, held as part of the 21st International Conference on Human-Computer Interaction, HCI International 2019
Country: United States
City: Orlando
Period: 19/7/26 – 19/7/31

Keywords

  • Biological signals
  • Emotion classification
  • Facial expression
  • Feeling
  • Robotics
  • Sympathy

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computer Science(all)

Cite this

Kurono, Y., Peeraya, S., Chen, F., & Sugaya, M. (2019). A Preliminary Experiment on the Estimation of Emotion Using Facial Expression and Biological Signals. In M. Kurosu (Ed.), Human-Computer Interaction. Recognition and Interaction Technologies - Thematic Area, HCI 2019, Held as Part of the 21st HCI International Conference, HCII 2019, Proceedings (pp. 133-142). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 11567 LNCS). Springer Verlag. https://doi.org/10.1007/978-3-030-22643-5_10
