Indoor navigation for mobile robots using memorized omni-directional images and robot's motion

Lixin Tang, Shinichi Yuta

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

8 Citations (Scopus)

Abstract

We have proposed a navigation method for mobile robots in indoor environments based on a memorized sequence of images and the robot's motion. The method consists of two stages: route teaching and playback navigation. At the teaching stage, the robot is guided along a designated route in the environment while it records its motion and memorizes environmental images. During autonomous navigation, the robot plays back the recorded data: it compares the currently captured image with the memorized one to estimate its position, and then calculates, in real time, a trajectory that tracks the memorized route. We use an omni-directional camera to perceive environmental information, and extract the vertical edges in the environment as landmarks. In this paper, we report on the use of the colors on both sides of vertical edges to match these edges more robustly, and explain autonomous navigation in the backward direction using the forward-taught data.
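The abstract describes matching vertical-edge landmarks more robustly by using the colors on both sides of each edge. A minimal sketch of that matching idea might look as follows; this is not the authors' implementation, and all names, thresholds, and the choice of Euclidean color distance are assumptions for illustration only:

```python
# Hypothetical sketch: each vertical-edge landmark is described by its bearing
# in the omni-directional image plus the mean colors immediately to its left
# and right. A memorized edge matches an observed one when the bearings are
# close and both side colors agree within a tolerance.
from dataclasses import dataclass
import math


@dataclass
class EdgeLandmark:
    bearing_deg: float           # direction of the edge as seen from the robot
    left_rgb: tuple              # mean color just left of the edge
    right_rgb: tuple             # mean color just right of the edge


def match_edge(observed, memorized, bearing_tol=5.0, color_tol=40.0):
    """Return True if an observed edge plausibly matches a memorized one."""
    # Smallest angular difference, handling the 360-degree wrap-around.
    d_bearing = abs((observed.bearing_deg - memorized.bearing_deg + 180) % 360 - 180)
    if d_bearing > bearing_tol:
        return False
    # Require both side colors to agree, not just one, for a robust match.
    return (math.dist(observed.left_rgb, memorized.left_rgb) < color_tol and
            math.dist(observed.right_rgb, memorized.right_rgb) < color_tol)


# Example with made-up values: a door edge (light wall on the left, dark door
# on the right) observed slightly off its memorized bearing still matches.
print(match_edge(EdgeLandmark(10.0, (200, 200, 200), (120, 60, 20)),
                 EdgeLandmark(12.0, (205, 198, 201), (118, 65, 25))))  # prints True
```

Requiring agreement on both sides of the edge is what distinguishes this from plain edge matching: two edges at similar bearings but with different surroundings (e.g. a door frame versus a window frame) would be rejected.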

Original language: English
Title of host publication: IEEE International Conference on Intelligent Robots and Systems
Pages: 269-274
Number of pages: 6
Volume: 1
Publication status: Published - 2002
Externally published: Yes
Event: 2002 IEEE/RSJ International Conference on Intelligent Robots and Systems - Lausanne
Duration: 2002 Sep 30 - 2002 Oct 4

Other

Other: 2002 IEEE/RSJ International Conference on Intelligent Robots and Systems
City: Lausanne
Period: 02/9/30 - 02/10/4

Fingerprint

  • Mobile robots
  • Navigation
  • Robots
  • Teaching
  • Cameras
  • Trajectories
  • Color

ASJC Scopus subject areas

  • Control and Systems Engineering

Cite this

Tang, L., & Yuta, S. (2002). Indoor navigation for mobile robots using memorized omni-directional images and robot's motion. In IEEE International Conference on Intelligent Robots and Systems (Vol. 1, pp. 269-274).

@inproceedings{9c20b91533a14e919f7cdd8286f6cd5b,
title = "Indoor navigation for mobile robots using memorized omni-directional images and robot's motion",
abstract = "We have proposed a navigation method for mobile robots in indoor environments based on a memorized sequence of images and robot's motion. The method consists of two stages: route teaching and playback navigation. At the teaching stage, the robot is controlled to move along a designated route in an environment, records robot's motion, and memorizes environmental images. In the course of autonomous navigation, the robot plays back the recorded data. It compares the currently taken image with the memorized one so that it estimates its position, and then calculates a trajectory to track the memorized route in real-time. We use an omni-directional camera to perceive environmental information, and extract the vertical edges existing in environment as landmarks in this method. In this paper, we report on the use of the color on both sides of vertical edges so as to match these edges more robustly, and explain the autonomous navigation in backward direction using the forward taught data.",
author = "Lixin Tang and Shinichi Yuta",
year = "2002",
language = "English",
volume = "1",
pages = "269--274",
booktitle = "IEEE International Conference on Intelligent Robots and Systems",

}

TY - GEN

T1 - Indoor navigation for mobile robots using memorized omni-directional images and robot's motion

AU - Tang, Lixin

AU - Yuta, Shinichi

PY - 2002

Y1 - 2002

N2 - We have proposed a navigation method for mobile robots in indoor environments based on a memorized sequence of images and robot's motion. The method consists of two stages: route teaching and playback navigation. At the teaching stage, the robot is controlled to move along a designated route in an environment, records robot's motion, and memorizes environmental images. In the course of autonomous navigation, the robot plays back the recorded data. It compares the currently taken image with the memorized one so that it estimates its position, and then calculates a trajectory to track the memorized route in real-time. We use an omni-directional camera to perceive environmental information, and extract the vertical edges existing in environment as landmarks in this method. In this paper, we report on the use of the color on both sides of vertical edges so as to match these edges more robustly, and explain the autonomous navigation in backward direction using the forward taught data.

UR - http://www.scopus.com/inward/record.url?scp=0036448779&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0036448779&partnerID=8YFLogxK

M3 - Conference contribution

VL - 1

SP - 269

EP - 274

BT - IEEE International Conference on Intelligent Robots and Systems

ER -