Adaptive learning of hand movement in human demonstration for robot action

Ngoc Hung Pham, Takashi Yoshimi

Research output: Article

1 Citation (Scopus)

Abstract

This paper describes a process for adaptive learning of hand movements in human demonstration for manipulation actions by robots using the Dynamic Movement Primitives (DMPs) framework. The process includes 1) tracking hand movement from human demonstration, 2) segmenting the hand movement, and 3) adaptive learning with the DMPs framework. We implement an extended DMPs model with a modified formulation for hand movement data observed from human demonstration, including the hand's 3D position, orientation, and finger distance. We evaluate the movements generated by the DMPs model, which are either reproduced without changes or adapted to a change in the goal of the movement. The adapted movement data is used to control a robot arm by the spatial position and orientation of its end-effector with a parallel gripper.
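To illustrate the DMPs framework the abstract refers to, the sketch below shows a minimal one-dimensional discrete DMP in the standard Ijspeert-style formulation: weights of a Gaussian-basis forcing term are fit to a demonstrated trajectory by locally weighted regression, and the learned movement can then be reproduced or adapted to a new goal. This is an illustrative sketch, not the authors' implementation; the function names, parameter values, and basis-function heuristics are assumptions.

```python
import numpy as np

def learn_dmp(y_demo, dt, n_basis=20, alpha_z=25.0, beta_z=6.25, alpha_x=3.0):
    """Fit DMP forcing-term weights to one demonstrated 1-D trajectory."""
    T = len(y_demo)
    tau = (T - 1) * dt
    y0, g = y_demo[0], y_demo[-1]
    # Demonstrated velocity and acceleration (finite differences).
    yd = np.gradient(y_demo, dt)
    ydd = np.gradient(yd, dt)
    # Canonical system: phase x decays from 1 toward 0 over the movement.
    t = np.arange(T) * dt
    x = np.exp(-alpha_x * t / tau)
    # Forcing term that the demonstration implies:
    # tau^2 * ydd = alpha_z * (beta_z * (g - y) - tau * yd) + f
    f_target = tau**2 * ydd - alpha_z * (beta_z * (g - y_demo) - tau * yd)
    # Gaussian basis functions spaced in phase.
    c = np.exp(-alpha_x * np.linspace(0.0, 1.0, n_basis))
    h = n_basis / c**2
    psi = np.exp(-h[None, :] * (x[:, None] - c[None, :])**2)
    # Locally weighted regression for each basis weight.
    s = x * (g - y0)
    w = np.array([(s * psi[:, i] * f_target).sum()
                  / ((s**2 * psi[:, i]).sum() + 1e-10)
                  for i in range(n_basis)])
    return dict(w=w, c=c, h=h, y0=y0, tau=tau,
                alpha_z=alpha_z, beta_z=beta_z, alpha_x=alpha_x)

def rollout_dmp(p, g_new, dt, T):
    """Integrate the learned DMP toward a (possibly new) goal g_new."""
    y, yd, x = p['y0'], 0.0, 1.0
    scale = g_new - p['y0']   # goal adaptation rescales the forcing term
    traj = []
    for _ in range(T):
        psi = np.exp(-p['h'] * (x - p['c'])**2)
        f = (psi @ p['w']) / (psi.sum() + 1e-10) * x * scale
        ydd = (p['alpha_z'] * (p['beta_z'] * (g_new - y) - p['tau'] * yd)
               + f) / p['tau']**2
        yd += ydd * dt
        y += yd * dt
        x += (-p['alpha_x'] * x / p['tau']) * dt
        traj.append(y)
    return np.array(traj)

# Learn from a minimum-jerk demonstration (0 -> 1 in 1 s), adapt to goal 2.0.
s = np.linspace(0.0, 1.0, 101)
demo = 10 * s**3 - 15 * s**4 + 6 * s**5
p = learn_dmp(demo, dt=0.01)
adapted = rollout_dmp(p, g_new=2.0, dt=0.01, T=200)
```

Because the spring-damper term always pulls the state toward the goal and the forcing term vanishes with the phase variable, the adapted trajectory keeps the demonstrated shape while converging to the new goal; the paper's extended model applies this idea per dimension to hand position, orientation, and finger distance.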

Original language: English
Pages (from-to): 919-927
Number of pages: 9
Journal: Journal of Robotics and Mechatronics
Volume: 29
Issue number: 5
DOI: 10.20965/jrm.2017.p0919
Publication status: Published - Oct 1 2017

Fingerprint

  • End effectors
  • Demonstrations
  • Robots
  • Grippers

ASJC Scopus subject areas

  • Computer Science (all)
  • Electrical and Electronic Engineering

Cite this

Adaptive learning of hand movement in human demonstration for robot action. / Pham, Ngoc Hung; Yoshimi, Takashi.

In: Journal of Robotics and Mechatronics, Vol. 29, No. 5, 01.10.2017, p. 919-927.

Research output: Article

@article{2db7e9d8d1e848a4883bef8a6b2d6708,
title = "Adaptive learning of hand movement in human demonstration for robot action",
abstract = "This paper describes a process for adaptive learning of hand movements in human demonstration for manipulation actions by robots using the Dynamic Movement Primitives (DMPs) framework. The process includes 1) tracking hand movement from human demonstration, 2) segmenting the hand movement, and 3) adaptive learning with the DMPs framework. We implement an extended DMPs model with a modified formulation for hand movement data observed from human demonstration, including the hand's 3D position, orientation, and finger distance. We evaluate the movements generated by the DMPs model, which are either reproduced without changes or adapted to a change in the goal of the movement. The adapted movement data is used to control a robot arm by the spatial position and orientation of its end-effector with a parallel gripper.",
keywords = "Dynamic movement primitives, Hand movements, Learning from demonstration, Robot actions",
author = "Pham, {Ngoc Hung} and Takashi Yoshimi",
year = "2017",
month = "10",
day = "1",
doi = "10.20965/jrm.2017.p0919",
language = "English",
volume = "29",
pages = "919--927",
journal = "Journal of Robotics and Mechatronics",
issn = "0915-3942",
publisher = "Fuji Technology Press",
number = "5",

}

TY - JOUR

T1 - Adaptive learning of hand movement in human demonstration for robot action

AU - Pham, Ngoc Hung

AU - Yoshimi, Takashi

PY - 2017/10/1

Y1 - 2017/10/1

N2 - This paper describes a process for adaptive learning of hand movements in human demonstration for manipulation actions by robots using the Dynamic Movement Primitives (DMPs) framework. The process includes 1) tracking hand movement from human demonstration, 2) segmenting the hand movement, and 3) adaptive learning with the DMPs framework. We implement an extended DMPs model with a modified formulation for hand movement data observed from human demonstration, including the hand's 3D position, orientation, and finger distance. We evaluate the movements generated by the DMPs model, which are either reproduced without changes or adapted to a change in the goal of the movement. The adapted movement data is used to control a robot arm by the spatial position and orientation of its end-effector with a parallel gripper.

AB - This paper describes a process for adaptive learning of hand movements in human demonstration for manipulation actions by robots using the Dynamic Movement Primitives (DMPs) framework. The process includes 1) tracking hand movement from human demonstration, 2) segmenting the hand movement, and 3) adaptive learning with the DMPs framework. We implement an extended DMPs model with a modified formulation for hand movement data observed from human demonstration, including the hand's 3D position, orientation, and finger distance. We evaluate the movements generated by the DMPs model, which are either reproduced without changes or adapted to a change in the goal of the movement. The adapted movement data is used to control a robot arm by the spatial position and orientation of its end-effector with a parallel gripper.

KW - Dynamic movement primitives

KW - Hand movements

KW - Learning from demonstration

KW - Robot actions

UR - http://www.scopus.com/inward/record.url?scp=85031935121&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85031935121&partnerID=8YFLogxK

U2 - 10.20965/jrm.2017.p0919

DO - 10.20965/jrm.2017.p0919

M3 - Article

AN - SCOPUS:85031935121

VL - 29

SP - 919

EP - 927

JO - Journal of Robotics and Mechatronics

JF - Journal of Robotics and Mechatronics

SN - 0915-3942

IS - 5

ER -