Mobile robot self-localization based on tracked scale and rotation invariant feature points by using an omnidirectional camera

Tsuyoshi Tasaki, Seiji Tokura, Takafumi Sonoura, Fumio Ozaki, Nobuto Matsuhira

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

4 Citations (Scopus)

Abstract

Self-localization is important for mobile robots in order to move accurately, and many works use an omnidirectional camera for self-localization. However, it is difficult to realize fast and accurate self-localization using only one omnidirectional camera without any calibration. To achieve this, we use "tracked scale and rotation invariant feature points" as landmarks: feature points that can be tracked and that remain stable for a "long" time. In a landmark selection phase, robots detect the feature points by using both a fast tracking method and a slow "Speeded-Up Robust Features (SURF)" method. After detection, robots select landmarks from among the detected feature points by using a Support Vector Machine (SVM) trained on feature vectors based on observation positions. In a self-localization phase, robots detect landmarks while switching detection methods dynamically based on a tracking error criterion that is easy to compute even in the uncalibrated omnidirectional image. We performed experiments in an approximately 10 [m] × 10 [m] mock supermarket using a navigation robot, ApriTau™, that had an omnidirectional camera on its top. The results showed that ApriTau™ could localize 2.9 times faster and 4.2 times more accurately with the developed method than with the SURF method alone. The results also showed that ApriTau™ could arrive at a goal within a 3 [cm] error from various initial positions in the mock supermarket.
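The dynamic switching between the fast tracker and the slow SURF re-detection described in the abstract can be sketched as follows. This is a minimal illustration of the idea only, not the authors' implementation: the function name, the mean-error criterion, and the threshold value are assumptions.

```python
# Labels for the two detection methods described in the abstract.
FAST_TRACKING = "fast_tracking"        # cheap frame-to-frame tracking
SLOW_REDETECTION = "slow_redetection"  # full SURF-style re-detection

def choose_method(tracking_errors, threshold=2.0):
    """Pick the landmark detection method for the next frame.

    tracking_errors: per-landmark tracking errors in pixels, as reported
    by the fast tracker on the current frame.
    threshold: error (pixels) above which the tracker is distrusted and
    the robot falls back to slow re-detection; the value is illustrative.
    """
    if not tracking_errors:
        # No landmarks are currently tracked: re-detect from scratch.
        return SLOW_REDETECTION
    mean_error = sum(tracking_errors) / len(tracking_errors)
    return SLOW_REDETECTION if mean_error > threshold else FAST_TRACKING
```

With well-tracked landmarks (`choose_method([0.4, 0.9])`) the cheap tracker is kept; once drift accumulates (`choose_method([3.0, 5.0])`) the robot pays for a full re-detection, which matches the paper's goal of using the slow method only when necessary.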

Original language: English
Title of host publication: IEEE/RSJ 2010 International Conference on Intelligent Robots and Systems, IROS 2010 - Conference Proceedings
Pages: 5202-5207
Number of pages: 6
DOIs: https://doi.org/10.1109/IROS.2010.5649848
Publication status: Published - 2010
Externally published: Yes
Event: 23rd IEEE/RSJ 2010 International Conference on Intelligent Robots and Systems, IROS 2010 - Taipei, Taiwan, Province of China
Duration: 2010 Oct 18 - 2010 Oct 22

ASJC Scopus subject areas

  • Artificial Intelligence
  • Human-Computer Interaction
  • Control and Systems Engineering

Cite this

Tasaki, T., Tokura, S., Sonoura, T., Ozaki, F., & Matsuhira, N. (2010). Mobile robot self-localization based on tracked scale and rotation invariant feature points by using an omnidirectional camera. In IEEE/RSJ 2010 International Conference on Intelligent Robots and Systems, IROS 2010 - Conference Proceedings (pp. 5202-5207). [5649848] https://doi.org/10.1109/IROS.2010.5649848

@inproceedings{fd670c2b7fec47df8d36094c013f6099,
title = "Mobile robot self-localization based on tracked scale and rotation invariant feature points by using an omnidirectional camera",
author = "Tsuyoshi Tasaki and Seiji Tokura and Takafumi Sonoura and Fumio Ozaki and Nobuto Matsuhira",
year = "2010",
doi = "10.1109/IROS.2010.5649848",
language = "English",
isbn = "9781424466757",
pages = "5202--5207",
booktitle = "IEEE/RSJ 2010 International Conference on Intelligent Robots and Systems, IROS 2010 - Conference Proceedings",

}
