Emotional speech as an effective interface for people with special needs

Akemi Ishii, Nick Campbell, Michiaki Yasumura

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

6 Citations (Scopus)

Abstract

The paper describes an application concept for an affective communication system for people with disabilities and elderly people, summarizes the universal nature of emotion and its vocal expression, and reports on work on designing a corpus database of emotional speech for speech synthesis in the proposed system. Three corpora of emotional speech (joy, anger and sadness) were designed and tested for use with CHATR, the concatenative speech synthesis system developed at ATR. Each text corpus was designed to bring out the speaker's emotion. The results of the perceptual experiments proved significant, as did those for the CHATR-synthesized speech. This indicates that subjects successfully identified the emotion types of the synthesized speech from implicit phonetic information, and hence the study demonstrates the validity of using a corpus of emotional speech as a database for concatenative speech synthesis.
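As a rough illustration of how forced-choice emotion identification could be checked against chance, the sketch below runs an exact binomial test for a three-way choice (joy, anger, sadness). The listener and trial counts are invented for illustration only; the paper's actual data and test procedure are not reproduced here.

```python
# Hypothetical significance check for forced-choice emotion identification.
# The counts below are invented; only the 1/3 chance level follows from the
# three emotion categories described in the abstract.
from math import comb


def binomial_tail(k: int, n: int, p: float) -> float:
    """P(X >= k) for X ~ Binomial(n, p): probability of scoring at least
    this well by guessing alone."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1))


CHANCE = 1 / 3  # three response options: joy, anger, sadness

# Invented example: 60 forced-choice trials, 41 labelled with the intended emotion.
correct, trials = 41, 60
p_value = binomial_tail(correct, trials, CHANCE)
print(f"{correct}/{trials} correct, one-sided p = {p_value:.2e} vs. chance of 1/3")
```

A p-value well below 0.05 would indicate that listeners identified the intended emotions at better than chance level, which is the kind of result the abstract reports for both the recorded and the CHATR-synthesized stimuli.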

Original language: English
Title of host publication: Proceedings - 3rd Asia Pacific Computer Human Interaction, APCHI 1998
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 266-271
Number of pages: 6
ISBN (Print): 0818683473, 9780818683473
DOI: 10.1109/APCHI.1998.704336
Publication status: Published - 1998 Jan 1
Externally published: Yes
Event: 3rd Asia Pacific Computer Human Interaction, APCHI 1998 - Hayama-machi, Kanagawa, Japan
Duration: 1998 Jul 15 - 1998 Jul 17

Other

Other: 3rd Asia Pacific Computer Human Interaction, APCHI 1998
Country: Japan
City: Hayama-machi, Kanagawa
Period: 98/7/15 - 98/7/17

Fingerprint

Speech synthesis
Speech analysis
Communication systems
Experiments

ASJC Scopus subject areas

  • Human-Computer Interaction
  • Computer Graphics and Computer-Aided Design

Cite this

Ishii, A., Campbell, N., & Yasumura, M. (1998). Emotional speech as an effective interface for people with special needs. In Proceedings - 3rd Asia Pacific Computer Human Interaction, APCHI 1998 (pp. 266-271). [704336] Institute of Electrical and Electronics Engineers Inc.. https://doi.org/10.1109/APCHI.1998.704336
