Tracking and detection of pointing gesture in 3D space

Nur Safwati Mohd Nor, Ngo Lam Trung, Yoshio Maeda, Makoto Mizukawa

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

4 Citations (Scopus)

Abstract

In this paper, we discuss pointing gesture estimation in the context of Human-Robot Interaction (HRI). Pointing gestures provide and utilize information in Kukanchi (Interactive Human Space Design and Intelligence) more naturally than conventional input methods such as a keyboard or mouse. We therefore design a user interface that adopts pointing gestures to manipulate objects and robots naturally in 3D space. First, human posture is tracked; this information is then used to detect dynamic pointing gestures with our proposed method, which is based on a Hidden Markov Model (HMM). A preliminary experiment shows that the average angular error of the pointing direction is less than seven degrees.
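
The abstract outlines a two-stage pipeline: track 3D body posture, then detect dynamic pointing gestures with an HMM and estimate the pointing direction. Below is a minimal illustrative sketch of such a pipeline, not the authors' implementation: the joint pair defining the pointing ray (elbow to hand), the discrete observation symbols, and the hand-picked HMM parameters are all assumptions made for demonstration.

```python
import numpy as np

def pointing_direction(elbow, hand):
    """Unit vector from elbow to hand, taken here as the pointing direction.
    (Assumption: the paper does not specify which joints define the ray.)"""
    v = np.asarray(hand, dtype=float) - np.asarray(elbow, dtype=float)
    return v / np.linalg.norm(v)

def angular_error_deg(estimated, ground_truth):
    """Angle in degrees between estimated and ground-truth directions."""
    e = np.asarray(estimated, dtype=float); e /= np.linalg.norm(e)
    g = np.asarray(ground_truth, dtype=float); g /= np.linalg.norm(g)
    return float(np.degrees(np.arccos(np.clip(np.dot(e, g), -1.0, 1.0))))

class SimpleHMM:
    """Discrete-observation HMM scored with the scaled forward algorithm.
    One model per gesture class can classify an observation sequence
    (e.g. quantized hand-motion features) by comparing log-likelihoods."""
    def __init__(self, start, trans, emit):
        self.start = np.asarray(start, dtype=float)  # (N,) initial state probs
        self.trans = np.asarray(trans, dtype=float)  # (N, N) transition matrix
        self.emit = np.asarray(emit, dtype=float)    # (N, M) emission matrix

    def log_likelihood(self, obs):
        # Forward pass with per-step normalization to avoid underflow.
        alpha = self.start * self.emit[:, obs[0]]
        log_p = np.log(alpha.sum())
        alpha /= alpha.sum()
        for o in obs[1:]:
            alpha = (alpha @ self.trans) * self.emit[:, o]
            log_p += np.log(alpha.sum())
            alpha /= alpha.sum()
        return log_p

if __name__ == "__main__":
    # Hypothetical 3D joint positions (metres) from a posture tracker.
    elbow, hand = [0.3, 1.2, 0.5], [0.6, 1.3, 0.9]
    direction = pointing_direction(elbow, hand)
    print("angular error (deg):", angular_error_deg(direction, [0.6, 0.2, 0.8]))

    # Toy two-state HMM over two symbols (0 = hand still, 1 = hand moving).
    hmm = SimpleHMM(start=[0.8, 0.2],
                    trans=[[0.7, 0.3], [0.2, 0.8]],
                    emit=[[0.9, 0.1], [0.2, 0.8]])
    print("log-likelihood of observed sequence:", hmm.log_likelihood([0, 1, 1, 1, 0]))
```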

Original language: English
Title of host publication: 2012 9th International Conference on Ubiquitous Robots and Ambient Intelligence, URAI 2012
Pages: 234-235
Number of pages: 2
DOIs: https://doi.org/10.1109/URAI.2012.6462983
Publication status: Published - 2012
Event: 2012 9th International Conference on Ubiquitous Robots and Ambient Intelligence, URAI 2012 - Daejeon
Duration: 2012 Nov 26 - 2012 Nov 29

Other

Other: 2012 9th International Conference on Ubiquitous Robots and Ambient Intelligence, URAI 2012
City: Daejeon
Period: 12/11/26 - 12/11/29

Keywords

  • Hidden Markov Model (HMM)
  • Natural user interface
  • Pointing gesture

ASJC Scopus subject areas

  • Artificial Intelligence
  • Human-Computer Interaction

Cite this

Nor, N. S. M., Trung, N. L., Maeda, Y., & Mizukawa, M. (2012). Tracking and detection of pointing gesture in 3D space. In 2012 9th International Conference on Ubiquitous Robots and Ambient Intelligence, URAI 2012 (pp. 234-235). [6462983] https://doi.org/10.1109/URAI.2012.6462983