A Preliminary Experiment on the Estimation of Emotion Using Facial Expression and Biological Signals

Yuya Kurono, Sripian Peeraya, Feng Chen, Midori Sugaya

Research output: Conference contribution

Abstract

Imagine the day when a robot comforts you when you feel sad. In the fields of artificial intelligence and robot engineering, there is much research on the automatic classification of human emotion to enhance human-robot communication, especially for therapy. Generally, the estimation of people's emotions is based on information such as facial expression, eye-gaze direction, and behavior that is expressed externally and that a robot can observe through a camera or similar sensors. However, some information cannot be expressed, or is deliberately suppressed. In such cases, it is difficult to estimate emotion even with sophisticated analysis technologies. The main idea of this research is to compare emotions classified from two different sources: controllable and uncontrollable expressions. Preliminary experiments with our proposed method suggest that the classification of emotion from biological signals outperforms classification from facial expression.

Original language: English
Title of host publication: Human-Computer Interaction. Recognition and Interaction Technologies - Thematic Area, HCI 2019, Held as Part of the 21st HCI International Conference, HCII 2019, Proceedings
Editors: Masaaki Kurosu
Publisher: Springer Verlag
Pages: 133-142
Number of pages: 10
ISBN (Print): 9783030226428
DOI: 10.1007/978-3-030-22643-5_10
Publication status: Published - 1 Jan 2019
Event: Thematic Area on Human Computer Interaction, HCI 2019, held as part of the 21st International Conference on Human-Computer Interaction, HCI International 2019 - Orlando, United States
Duration: 26 Jul 2019 - 31 Jul 2019

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 11567 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: Thematic Area on Human Computer Interaction, HCI 2019, held as part of the 21st International Conference on Human-Computer Interaction, HCI International 2019
Country: United States
City: Orlando
Period: 26/7/19 - 31/7/19

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computer Science (all)

Cite this

Kurono, Y., Peeraya, S., Chen, F., & Sugaya, M. (2019). A Preliminary Experiment on the Estimation of Emotion Using Facial Expression and Biological Signals. In M. Kurosu (Ed.), Human-Computer Interaction. Recognition and Interaction Technologies - Thematic Area, HCI 2019, Held as Part of the 21st HCI International Conference, HCII 2019, Proceedings (pp. 133-142). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 11567 LNCS). Springer Verlag. https://doi.org/10.1007/978-3-030-22643-5_10

@inproceedings{3626ccd8e89b4b65a1ffe9f3d99dec1d,
title = "A Preliminary Experiment on the Estimation of Emotion Using Facial Expression and Biological Signals",
abstract = "Imagine the day when a robot comforts you when you feel sad. In the fields of artificial intelligence and robot engineering, there is much research on the automatic classification of human emotion to enhance human-robot communication, especially for therapy. Generally, the estimation of people's emotions is based on information such as facial expression, eye-gaze direction, and behavior that is expressed externally and that a robot can observe through a camera or similar sensors. However, some information cannot be expressed, or is deliberately suppressed. In such cases, it is difficult to estimate emotion even with sophisticated analysis technologies. The main idea of this research is to compare emotions classified from two different sources: controllable and uncontrollable expressions. Preliminary experiments with our proposed method suggest that the classification of emotion from biological signals outperforms classification from facial expression.",
keywords = "Biological signals, Emotion classification, Facial expression, Feeling, Robotics, Sympathy",
author = "Yuya Kurono and Sripian Peeraya and Feng Chen and Midori Sugaya",
year = "2019",
month = "1",
day = "1",
doi = "10.1007/978-3-030-22643-5_10",
language = "English",
isbn = "9783030226428",
series = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",
publisher = "Springer Verlag",
pages = "133--142",
editor = "Masaaki Kurosu",
booktitle = "Human-Computer Interaction. Recognition and Interaction Technologies - Thematic Area, HCI 2019, Held as Part of the 21st HCI International Conference, HCII 2019, Proceedings",

}

TY - GEN

T1 - A Preliminary Experiment on the Estimation of Emotion Using Facial Expression and Biological Signals

AU - Kurono, Yuya

AU - Peeraya, Sripian

AU - Chen, Feng

AU - Sugaya, Midori

PY - 2019/1/1

Y1 - 2019/1/1

N2 - Imagine the day when a robot comforts you when you feel sad. In the fields of artificial intelligence and robot engineering, there is much research on the automatic classification of human emotion to enhance human-robot communication, especially for therapy. Generally, the estimation of people's emotions is based on information such as facial expression, eye-gaze direction, and behavior that is expressed externally and that a robot can observe through a camera or similar sensors. However, some information cannot be expressed, or is deliberately suppressed. In such cases, it is difficult to estimate emotion even with sophisticated analysis technologies. The main idea of this research is to compare emotions classified from two different sources: controllable and uncontrollable expressions. Preliminary experiments with our proposed method suggest that the classification of emotion from biological signals outperforms classification from facial expression.

AB - Imagine the day when a robot comforts you when you feel sad. In the fields of artificial intelligence and robot engineering, there is much research on the automatic classification of human emotion to enhance human-robot communication, especially for therapy. Generally, the estimation of people's emotions is based on information such as facial expression, eye-gaze direction, and behavior that is expressed externally and that a robot can observe through a camera or similar sensors. However, some information cannot be expressed, or is deliberately suppressed. In such cases, it is difficult to estimate emotion even with sophisticated analysis technologies. The main idea of this research is to compare emotions classified from two different sources: controllable and uncontrollable expressions. Preliminary experiments with our proposed method suggest that the classification of emotion from biological signals outperforms classification from facial expression.

KW - Biological signals

KW - Emotion classification

KW - Facial expression

KW - Feeling

KW - Robotics

KW - Sympathy

UR - http://www.scopus.com/inward/record.url?scp=85069749731&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85069749731&partnerID=8YFLogxK

U2 - 10.1007/978-3-030-22643-5_10

DO - 10.1007/978-3-030-22643-5_10

M3 - Conference contribution

AN - SCOPUS:85069749731

SN - 9783030226428

T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

SP - 133

EP - 142

BT - Human-Computer Interaction. Recognition and Interaction Technologies - Thematic Area, HCI 2019, Held as Part of the 21st HCI International Conference, HCII 2019, Proceedings

A2 - Kurosu, Masaaki

PB - Springer Verlag

ER -