Vision and laser sensor data fusion technique for target approaching by outdoor mobile robot

Aneesh Chand, Shinichi Yuta

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

7 Citations (Scopus)

Abstract

The authors have been developing an outdoor mobile robot intended to extend its traveling range by autonomously negotiating and crossing signalized road intersections while traveling along pedestrian sidewalks in an urban environment. This paper presents high-precision navigation of the mobile robot towards a pedestrian push-button box so that it can autonomously activate the button. We describe a dual-sensor fusion technique using a monocular camera and a laser range sensor with which the outdoor mobile robot can detect, localize and accurately navigate towards the button box in order to press the pedestrian push button and trigger the crossing sequence. The method first analyses the image formation of the target on the camera's image sensor to estimate the object's position in the real world, then uses data from the laser range sensor to acquire a precise location of the object relative to the robot, and finally performs path planning. A two-tiered validation scheme, with one check at the vision level and a second at the laser scan data level, rejects false detections and makes the system robust. The proposed method is applicable to any form of target approaching. Experimental results verify the efficacy of the system, and concluding remarks are given.
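The fusion pipeline described in the abstract — a coarse monocular estimate of the target, refined by laser range data and gated by a cross-sensor consistency check — can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the focal length, button-box width, tolerance values and all function names are assumptions, and the camera and laser are assumed to share the same angular frame on the robot.

```python
import math

# Illustrative sketch of camera/laser fusion for target approaching.
# All constants below are assumptions, not values from the paper.
FOCAL_LENGTH_PX = 600.0      # assumed camera focal length in pixels
IMAGE_WIDTH_PX = 640         # assumed image width in pixels
BUTTON_BOX_WIDTH_M = 0.12    # assumed physical width of the push-button box

def estimate_from_image(bbox_center_x_px, bbox_width_px):
    """Coarse target estimate from the monocular camera (pinhole model).

    Returns (bearing_rad, range_m): the target bearing relative to the
    optical axis and an approximate range from the apparent width.
    """
    offset_px = bbox_center_x_px - IMAGE_WIDTH_PX / 2.0
    bearing = math.atan2(offset_px, FOCAL_LENGTH_PX)
    approx_range = BUTTON_BOX_WIDTH_M * FOCAL_LENGTH_PX / bbox_width_px
    return bearing, approx_range

def refine_with_laser(scan, bearing, approx_range,
                      angle_tol=math.radians(10), range_tol=0.5):
    """Refine the camera estimate with laser range data.

    `scan` is a list of (angle_rad, range_m) returns.  The coarse camera
    estimate gates the scan; the return most consistent with it is taken
    as the precise target position.  Returning None corresponds to the
    laser-level validation tier rejecting the vision detection.
    """
    candidates = [(a, r) for a, r in scan
                  if abs(a - bearing) < angle_tol
                  and abs(r - approx_range) < range_tol]
    if not candidates:
        return None                              # vision and laser disagree
    a, r = min(candidates, key=lambda ar: abs(ar[1] - approx_range))
    return r * math.cos(a), r * math.sin(a)      # target (x, y) in robot frame

# Usage with synthetic data: a button box ~3 m ahead, slightly to one side.
scan = [(math.radians(d), 3.0 if 4 <= d <= 6 else 8.0) for d in range(-90, 91)]
bearing, approx_range = estimate_from_image(bbox_center_x_px=350.0,
                                            bbox_width_px=24.0)
target = refine_with_laser(scan, bearing, approx_range)
if target is not None:
    print("navigate towards target at", target)  # hand off to the path planner
```

Only the laser-level gating is shown here; in the two-tiered scheme a vision-level check (e.g. the detector's own confidence in the button-box detection) would already have filtered the candidate before the laser refinement step.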

Original language: English
Title of host publication: 2010 IEEE International Conference on Robotics and Biomimetics, ROBIO 2010
Pages: 1624-1629
Number of pages: 6
DOI: 10.1109/ROBIO.2010.5723573
Publication status: Published - 2010
Externally published: Yes
Event: 2010 IEEE International Conference on Robotics and Biomimetics, ROBIO 2010 - Tianjin
Duration: 2010 Dec 14 - 2010 Dec 18

Other

Other: 2010 IEEE International Conference on Robotics and Biomimetics, ROBIO 2010
City: Tianjin
Period: 10/12/14 - 10/12/18

Fingerprint

Sensor data fusion
Mobile robots
Lasers
Sensors
Cameras
Motion planning
Image sensors
Navigation
Image processing
Fusion reactions
Chemical activation
Robots

ASJC Scopus subject areas

  • Artificial Intelligence
  • Biotechnology
  • Human-Computer Interaction

Cite this

Chand, A., & Yuta, S. (2010). Vision and laser sensor data fusion technique for target approaching by outdoor mobile robot. In 2010 IEEE International Conference on Robotics and Biomimetics, ROBIO 2010 (pp. 1624-1629). [5723573] https://doi.org/10.1109/ROBIO.2010.5723573

@inproceedings{52ea82526965422bbcdc4eebe32712ba,
title = "Vision and laser sensor data fusion technique for target approaching by outdoor mobile robot",
abstract = "The authors have been developing an outdoor mobile robot intended to provide increased traveling distance by autonomously negotiating and crossing a road crossing intersection while traveling along pedestrian sidewalks in an urban environment. In this paper, high precision navigation towards a pedestrian push-button box by a mobile robot for the autonomous activation of the button is presented. We show a dual sensor fusion technique using a monocular camera and laser range sensor with which an outdoor mobile robot can detect, localize and then accurately navigate towards a button box so that it could autonomously press the pedestrian push button in order to trigger the crossing sequence. The method involves determining the image formation of the target on the image sensor of the camera, using it to estimate the object position in the real world and then using data from the laser range sensor to acquire a precise location of the object relative to the robot and finally perform the path planning. A two-tiered validation system, one at the vision level and the second at the laser scan data level, detects inaccurate detections and results in a robust system. The proposed method is also applicable for any form of target approaching. Experimental results verify the efficacy of the system and concluding remarks are also given.",
author = "Aneesh Chand and Shinichi Yuta",
year = "2010",
doi = "10.1109/ROBIO.2010.5723573",
language = "English",
isbn = "9781424493173",
pages = "1624--1629",
booktitle = "2010 IEEE International Conference on Robotics and Biomimetics, ROBIO 2010",

}
