Adaptive learning of hand movement in human demonstration for robot action

Ngoc Hung Pham, Takashi Yoshimi

Research output: Contribution to journal › Article

1 Citation (Scopus)

Abstract

This paper describes a process for the adaptive learning of hand movements from human demonstration for robot manipulation actions, using the Dynamic Movement Primitives (DMPs) framework. The process includes 1) tracking the hand movement in a human demonstration, 2) segmenting the hand movement, and 3) adaptive learning with the DMPs framework. We implement an extended DMPs model with a modified formulation for the hand movement data observed from the human demonstration, including the hand's 3D position, orientation, and finger distance. We evaluate movements generated by the DMPs model, either reproduced without changes or adapted to a changed movement goal. The adapted movement data are used to control a robot arm with a parallel gripper through the spatial position and orientation of its end-effector.
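The abstract does not spell out the DMPs formulation itself. As a rough illustration only, a minimal one-dimensional discrete DMP in the standard Ijspeert-style form (a sketch, not the paper's extended multi-DOF model with orientation and finger distance) can learn a demonstrated trajectory and replay it toward a new goal:

```python
# Minimal sketch of a one-DOF discrete Dynamic Movement Primitive (DMP).
# Assumptions: standard transformation system tau^2*ydd = az*(bz*(g-y) - tau*yd) + f,
# exponential canonical phase, Gaussian basis functions, goal scaling (g - y0).
import numpy as np

class DMP1D:
    def __init__(self, n_basis=20, alpha_z=25.0, alpha_x=4.0):
        self.n = n_basis
        self.az = alpha_z
        self.bz = alpha_z / 4.0            # critically damped spring-damper
        self.ax = alpha_x
        # Gaussian basis centres spaced along the decaying canonical phase x
        self.c = np.exp(-self.ax * np.linspace(0, 1, n_basis))
        self.h = 1.0 / np.diff(self.c, append=self.c[-1] * 0.5) ** 2
        self.w = np.zeros(n_basis)

    def fit(self, y_demo, dt):
        """Learn forcing-term weights from one demonstrated trajectory."""
        y_demo = np.asarray(y_demo, dtype=float)
        T = len(y_demo)
        self.tau = (T - 1) * dt
        self.y0, self.g = y_demo[0], y_demo[-1]
        yd = np.gradient(y_demo, dt)
        ydd = np.gradient(yd, dt)
        x = np.exp(-self.ax * np.arange(T) * dt / self.tau)   # canonical phase
        # Invert the transformation system to get the target forcing term
        f_target = self.tau**2 * ydd - self.az * (self.bz * (self.g - y_demo) - self.tau * yd)
        psi = np.exp(-self.h * (x[:, None] - self.c) ** 2)    # (T, n_basis)
        s = x * (self.g - self.y0)
        # Locally weighted regression: one weight per basis function
        num = (s[:, None] * psi * f_target[:, None]).sum(axis=0)
        den = ((s ** 2)[:, None] * psi).sum(axis=0) + 1e-10
        self.w = num / den
        return self

    def rollout(self, g=None, dt=0.01):
        """Integrate the DMP forward, optionally toward a new goal g."""
        g = self.g if g is None else g
        T = int(round(self.tau / dt)) + 1
        y, z, x = self.y0, 0.0, 1.0
        traj = np.empty(T)
        for t in range(T):
            traj[t] = y
            psi = np.exp(-self.h * (x - self.c) ** 2)
            f = (psi @ self.w) / (psi.sum() + 1e-10) * x * (g - self.y0)
            z += dt * (self.az * (self.bz * (g - y) - z) + f) / self.tau
            y += dt * z / self.tau
            x += dt * (-self.ax * x) / self.tau
        return traj
```

A typical use: fit on a demonstrated 0-to-1 reach (e.g. a minimum-jerk profile) and then call `rollout(g=2.0)` to adapt the same movement shape to a new goal, which mirrors the goal-adaptation evaluation described in the abstract.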

Original language: English
Pages (from-to): 919-927
Number of pages: 9
Journal: Journal of Robotics and Mechatronics
Volume: 29
Issue number: 5
DOI: 10.20965/jrm.2017.p0919
Publication status: Published - 2017 Oct 1


Keywords

  • Dynamic movement primitives
  • Hand movements
  • Learning from demonstration
  • Robot actions

ASJC Scopus subject areas

  • Computer Science (all)
  • Electrical and Electronic Engineering

Cite this

Adaptive learning of hand movement in human demonstration for robot action. / Pham, Ngoc Hung; Yoshimi, Takashi.

In: Journal of Robotics and Mechatronics, Vol. 29, No. 5, 01.10.2017, p. 919-927.

Research output: Contribution to journal › Article

@article{2db7e9d8d1e848a4883bef8a6b2d6708,
title = "Adaptive learning of hand movement in human demonstration for robot action",
abstract = "This paper describes a process for the adaptive learning of hand movements from human demonstration for robot manipulation actions, using the Dynamic Movement Primitives (DMPs) framework. The process includes 1) tracking the hand movement in a human demonstration, 2) segmenting the hand movement, and 3) adaptive learning with the DMPs framework. We implement an extended DMPs model with a modified formulation for the hand movement data observed from the human demonstration, including the hand's 3D position, orientation, and finger distance. We evaluate movements generated by the DMPs model, either reproduced without changes or adapted to a changed movement goal. The adapted movement data are used to control a robot arm with a parallel gripper through the spatial position and orientation of its end-effector.",
keywords = "Dynamic movement primitives, Hand movements, Learning from demonstration, Robot actions",
author = "Pham, {Ngoc Hung} and Takashi Yoshimi",
year = "2017",
month = "10",
day = "1",
doi = "10.20965/jrm.2017.p0919",
language = "English",
volume = "29",
pages = "919--927",
journal = "Journal of Robotics and Mechatronics",
issn = "0915-3942",
publisher = "Fuji Technology Press",
number = "5",

}

TY - JOUR

T1 - Adaptive learning of hand movement in human demonstration for robot action

AU - Pham, Ngoc Hung

AU - Yoshimi, Takashi

PY - 2017/10/1

Y1 - 2017/10/1

N2 - This paper describes a process for the adaptive learning of hand movements from human demonstration for robot manipulation actions, using the Dynamic Movement Primitives (DMPs) framework. The process includes 1) tracking the hand movement in a human demonstration, 2) segmenting the hand movement, and 3) adaptive learning with the DMPs framework. We implement an extended DMPs model with a modified formulation for the hand movement data observed from the human demonstration, including the hand's 3D position, orientation, and finger distance. We evaluate movements generated by the DMPs model, either reproduced without changes or adapted to a changed movement goal. The adapted movement data are used to control a robot arm with a parallel gripper through the spatial position and orientation of its end-effector.

AB - This paper describes a process for the adaptive learning of hand movements from human demonstration for robot manipulation actions, using the Dynamic Movement Primitives (DMPs) framework. The process includes 1) tracking the hand movement in a human demonstration, 2) segmenting the hand movement, and 3) adaptive learning with the DMPs framework. We implement an extended DMPs model with a modified formulation for the hand movement data observed from the human demonstration, including the hand's 3D position, orientation, and finger distance. We evaluate movements generated by the DMPs model, either reproduced without changes or adapted to a changed movement goal. The adapted movement data are used to control a robot arm with a parallel gripper through the spatial position and orientation of its end-effector.

KW - Dynamic movement primitives

KW - Hand movements

KW - Learning from demonstration

KW - Robot actions

UR - http://www.scopus.com/inward/record.url?scp=85031935121&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85031935121&partnerID=8YFLogxK

U2 - 10.20965/jrm.2017.p0919

DO - 10.20965/jrm.2017.p0919

M3 - Article

VL - 29

SP - 919

EP - 927

JO - Journal of Robotics and Mechatronics

JF - Journal of Robotics and Mechatronics

SN - 0915-3942

IS - 5

ER -