Tracking and detection of pointing gesture in 3D space

Nur Safwati Mohd Nor, Ngo Lam Trung, Yoshio Maeda, Makoto Mizukawa

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

4 Citations (Scopus)

Abstract

In this paper, we discuss pointing gesture estimation in the context of Human-Robot Interaction (HRI). Pointing gestures provide and convey information in Kukanchi (Interactive Human Space Design and Intelligence) more naturally than conventional input methods such as a keyboard or mouse. We therefore design a user interface that adopts pointing gestures to manipulate objects and robots naturally in 3D space. Human posture is first tracked, and the tracked information is then used to detect the dynamic pointing gesture with our proposed method based on a Hidden Markov Model (HMM). A preliminary experiment shows that the average angular error of the pointing direction is less than seven degrees.
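The abstract gives no implementation details, so the following is only a minimal sketch of the general approach it outlines: a sequence of tracked joint features is scored by a Hidden Markov Model to decide whether a pointing gesture occurred, and the pointing direction is estimated from two tracked 3D joints. The joint choice (elbow and hand), the feature vector, and the use of the hmmlearn library are assumptions for illustration, not the authors' method.

```python
# Hypothetical sketch of HMM-based pointing-gesture detection and
# pointing-direction estimation; not the authors' implementation.
import numpy as np
from hmmlearn.hmm import GaussianHMM

def fit_gesture_hmm(sequences, n_states=4):
    """Fit a Gaussian HMM on training sequences of joint features.

    sequences: list of (T_i, D) arrays, e.g. the 3D hand position relative
    to the shoulder at each tracked frame (feature choice is an assumption).
    """
    X = np.vstack(sequences)
    lengths = [len(s) for s in sequences]
    model = GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=50)
    model.fit(X, lengths)
    return model

def is_pointing(model, sequence, threshold=-50.0):
    """Classify a candidate sequence by its per-frame log-likelihood."""
    score = model.score(sequence) / len(sequence)
    return score > threshold

def pointing_direction(elbow_xyz, hand_xyz):
    """Unit vector from elbow to hand, taken here as the pointing direction."""
    d = np.asarray(hand_xyz, dtype=float) - np.asarray(elbow_xyz, dtype=float)
    return d / np.linalg.norm(d)

def angular_error_deg(estimated_dir, true_dir):
    """Angle in degrees between estimated and ground-truth directions."""
    c = np.clip(np.dot(estimated_dir, true_dir), -1.0, 1.0)
    return np.degrees(np.arccos(c))
```

A detector along these lines would report the kind of metric quoted in the abstract, i.e. the mean of angular_error_deg over test pointing gestures.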

Original language: English
Title of host publication: 2012 9th International Conference on Ubiquitous Robots and Ambient Intelligence, URAI 2012
Pages: 234-235
Number of pages: 2
DOIs
Publication status: Published - 2012 Dec 1
Event: 2012 9th International Conference on Ubiquitous Robots and Ambient Intelligence, URAI 2012 - Daejeon, Korea, Republic of
Duration: 2012 Nov 26 - 2012 Nov 29

Publication series

Name: 2012 9th International Conference on Ubiquitous Robots and Ambient Intelligence, URAI 2012

Conference

Conference: 2012 9th International Conference on Ubiquitous Robots and Ambient Intelligence, URAI 2012
Country/Territory: Korea, Republic of
City: Daejeon
Period: 12/11/26 - 12/11/29

Keywords

  • Hidden Markov Model (HMM)
  • Natural user interface
  • Pointing gesture

ASJC Scopus subject areas

  • Artificial Intelligence
  • Human-Computer Interaction
