Estimate emotion method to use biological, symbolic information preliminary experiment

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

Imagine the day a robot comforts you when you feel sad. To achieve the ability to estimate emotion and feeling, a great deal of work has been done in artificial intelligence [1] and in robot engineering focused on human-robot communication, especially as it applies to therapy [2, 3]. Generally, estimating people's emotions relies on expressed information such as facial expressions, gaze direction, and behaviors that are observable by the robot [4–6]. However, this information is not always suitable, because some people do not express their emotions in an observable way. In such cases it is difficult to estimate emotion, even with sophisticated analysis technologies. The main idea of our proposal is to use biological information to estimate people's actual emotions. Preliminary experiments suggest that our method outperforms the traditional approach for people who cannot express their emotions directly.
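This record reproduces only the abstract, so the specific biological signals and the estimation procedure are not given here. As a purely illustrative sketch of the general idea, mapping normalized biological features onto a valence-arousal plane and reading off a coarse emotion label, the following Python snippet assumes a pulse-derived arousal score and an EEG-derived valence score; the signal names, normalization, and quadrant labels are assumptions for illustration, not details taken from the paper.

```python
# Illustrative sketch only: the published method's exact signals and
# thresholds are not specified in this record. We assume two hypothetical
# biological inputs, a pulse-derived arousal score and an EEG-derived
# valence score, and map them onto the four quadrants of a
# valence-arousal (Russell-style) plane.

from dataclasses import dataclass


@dataclass
class BioSample:
    """One reading of (assumed) biological signals, already normalized to [-1, 1]."""
    valence: float   # e.g. derived from EEG features (assumption)
    arousal: float   # e.g. derived from pulse / heart-rate variability (assumption)


def estimate_emotion(sample: BioSample) -> str:
    """Map a valence-arousal point to a coarse emotion label (quadrant rule)."""
    if sample.arousal >= 0:
        return "happy/excited" if sample.valence >= 0 else "angry/stressed"
    return "relaxed/calm" if sample.valence >= 0 else "sad/bored"


if __name__ == "__main__":
    # A reading with low arousal and negative valence maps to "sad/bored",
    # the kind of state a companion robot might respond to.
    print(estimate_emotion(BioSample(valence=-0.4, arousal=-0.6)))
```

The quadrant rule stands in for whatever classifier the paper actually uses; the point of the sketch is only that the inputs are biological measurements rather than facial expressions or other externally observable cues.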

Original language: English
Title of host publication: Foundations of Augmented Cognition: Neuroergonomics and Operational Neuroscience - 10th International Conference, AC 2016 and Held as Part of HCI International 2016, Proceedings
Publisher: Springer Verlag
Pages: 332-340
Number of pages: 9
Volume: 9743
ISBN (Print): 9783319399546
DOIs: https://doi.org/10.1007/978-3-319-39955-3_31
Publication status: Published - 2016
Event: 10th International Conference on Foundations of Augmented Cognition: Neuroergonomics and Operational Neuroscience, AC 2016 and Held as Part of 18th International Conference on Human-Computer Interaction, HCI International 2016 - Toronto, Canada
Duration: 2016 Jul 17 - 2016 Jul 22

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 9743
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Other

Other: 10th International Conference on Foundations of Augmented Cognition: Neuroergonomics and Operational Neuroscience, AC 2016 and Held as Part of 18th International Conference on Human-Computer Interaction, HCI International 2016
Country: Canada
City: Toronto
Period: 16/7/17 - 16/7/22

Fingerprint

Robots
Experiments
Artificial intelligence
Communication

Keywords

  • Biological information
  • Estimate emotion
  • Estimation
  • Feeling
  • Robotics application

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computer Science (all)

Cite this

Ikeda, Y., Okada, Y., & Sugaya, M. (2016). Estimate emotion method to use biological, symbolic information preliminary experiment. In Foundations of Augmented Cognition: Neuroergonomics and Operational Neuroscience - 10th International Conference, AC 2016 and Held as Part of HCI International 2016, Proceedings (Vol. 9743, pp. 332-340). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 9743). Springer Verlag. https://doi.org/10.1007/978-3-319-39955-3_31

@inproceedings{5105d5558b264a098cb6457341c0757e,
title = "Estimate emotion method to use biological, symbolic information preliminary experiment",
abstract = "Imagine the day a robot comforts you when you feel sad. To achieve the ability to estimate emotion and feeling, a great deal of work has been done in artificial intelligence [1] and in robot engineering focused on human-robot communication, especially as it applies to therapy [2, 3]. Generally, estimating people's emotions relies on expressed information such as facial expressions, gaze direction, and behaviors that are observable by the robot [4–6]. However, this information is not always suitable, because some people do not express their emotions in an observable way. In such cases it is difficult to estimate emotion, even with sophisticated analysis technologies. The main idea of our proposal is to use biological information to estimate people's actual emotions. Preliminary experiments suggest that our method outperforms the traditional approach for people who cannot express their emotions directly.",
keywords = "Biological information, Estimate emotion, Estimation, Feeling, Robotics application",
author = "Yuhei Ikeda and Yoshiko Okada and Midori Sugaya",
year = "2016",
doi = "10.1007/978-3-319-39955-3_31",
language = "English",
isbn = "9783319399546",
volume = "9743",
series = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",
publisher = "Springer Verlag",
pages = "332--340",
booktitle = "Foundations of Augmented Cognition: Neuroergonomics and Operational Neuroscience - 10th International Conference, AC 2016 and Held as Part of HCI International 2016, Proceedings",

}

TY - GEN

T1 - Estimate emotion method to use biological, symbolic information preliminary experiment

AU - Ikeda, Yuhei

AU - Okada, Yoshiko

AU - Sugaya, Midori

PY - 2016

Y1 - 2016

N2 - Imagine the day a robot comforts you when you feel sad. To achieve the ability to estimate emotion and feeling, a great deal of work has been done in artificial intelligence [1] and in robot engineering focused on human-robot communication, especially as it applies to therapy [2, 3]. Generally, estimating people's emotions relies on expressed information such as facial expressions, gaze direction, and behaviors that are observable by the robot [4–6]. However, this information is not always suitable, because some people do not express their emotions in an observable way. In such cases it is difficult to estimate emotion, even with sophisticated analysis technologies. The main idea of our proposal is to use biological information to estimate people's actual emotions. Preliminary experiments suggest that our method outperforms the traditional approach for people who cannot express their emotions directly.

AB - Imagine the day a robot comforts you when you feel sad. To achieve the ability to estimate emotion and feeling, a great deal of work has been done in artificial intelligence [1] and in robot engineering focused on human-robot communication, especially as it applies to therapy [2, 3]. Generally, estimating people's emotions relies on expressed information such as facial expressions, gaze direction, and behaviors that are observable by the robot [4–6]. However, this information is not always suitable, because some people do not express their emotions in an observable way. In such cases it is difficult to estimate emotion, even with sophisticated analysis technologies. The main idea of our proposal is to use biological information to estimate people's actual emotions. Preliminary experiments suggest that our method outperforms the traditional approach for people who cannot express their emotions directly.

KW - Biological information

KW - Estimate emotion

KW - Estimation

KW - Feeling

KW - Robotics application

UR - http://www.scopus.com/inward/record.url?scp=84978909289&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84978909289&partnerID=8YFLogxK

U2 - 10.1007/978-3-319-39955-3_31

DO - 10.1007/978-3-319-39955-3_31

M3 - Conference contribution

AN - SCOPUS:84978909289

SN - 9783319399546

VL - 9743

T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

SP - 332

EP - 340

BT - Foundations of Augmented Cognition: Neuroergonomics and Operational Neuroscience - 10th International Conference, AC 2016 and Held as Part of HCI International 2016, Proceedings

PB - Springer Verlag

ER -