TY - JOUR
T1 - Head pose estimation and motion analysis of public speaking videos
AU - Komiya, Rinko
AU - Saitoh, Takeshi
AU - Fuyuno, Miharu
AU - Yamashita, Yuko
AU - Nakajima, Yoshitaka
N1 - Publisher Copyright:
Copyright © 2017, IGI Global.
PY - 2017/1/1
Y1 - 2017/1/1
N2 - Public speaking is an essential skill in a wide variety of professions and in everyday life. However, it can be difficult to master. This paper focuses on the automatic assessment of nonverbal facial behavior during public speaking and proposes simple and efficient methods for head pose estimation and motion analysis. The authors collected nine speech videos from a recitation contest and six from an oration contest held at a Japanese high school and applied the proposed methods to evaluate the contestants' performance. The head pose estimation method produced results with an acceptable level of accuracy on the speech videos. The proposed motion analysis method can be used to calculate the frequencies and moving ranges of head motion. The authors found that the proposed parameters are strongly correlated with the eye-contact score and that the frequency and moving range parameters are suitable for evaluating public speaking. On the basis of these features, a teacher can provide accurate feedback to help a speaker improve.
AB - Public speaking is an essential skill in a wide variety of professions and in everyday life. However, it can be difficult to master. This paper focuses on the automatic assessment of nonverbal facial behavior during public speaking and proposes simple and efficient methods for head pose estimation and motion analysis. The authors collected nine speech videos from a recitation contest and six from an oration contest held at a Japanese high school and applied the proposed methods to evaluate the contestants' performance. The head pose estimation method produced results with an acceptable level of accuracy on the speech videos. The proposed motion analysis method can be used to calculate the frequencies and moving ranges of head motion. The authors found that the proposed parameters are strongly correlated with the eye-contact score and that the frequency and moving range parameters are suitable for evaluating public speaking. On the basis of these features, a teacher can provide accurate feedback to help a speaker improve.
KW - English Oration Contest
KW - English Recitation Contest
KW - Facial Feature Point
KW - Head Pose Estimation
KW - Image Processing
KW - Motion Analysis
KW - Speech Video
UR - http://www.scopus.com/inward/record.url?scp=85024120529&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85024120529&partnerID=8YFLogxK
U2 - 10.4018/IJSI.2017010105
DO - 10.4018/IJSI.2017010105
M3 - Article
AN - SCOPUS:85024120529
VL - 5
SP - 57
EP - 71
JO - International Journal of Software Innovation
JF - International Journal of Software Innovation
SN - 2166-7160
IS - 1
ER -