Fusion of omni-directional sonar and omni-directional vision for environment recognition of mobile robots

Teruko Yata, Akihisa Ohya, Shin'ichi Yuta

Research output: Contribution to journal › Conference article

10 Citations (Scopus)

Abstract

This paper proposes a new method for fusing an omni-directional ultrasonic sensor and an omni-directional vision sensor. The new omni-directional sonar we developed can measure accurate distances and directions of reflecting points, while the omni-directional vision sensor gives the directions of segment edges. We propose a sensor fusion method, based on angle, that uses both the reflecting points measured by the sonar and the segment edges measured by the vision sensor. Because these data differ in character, they compensate for each other in the proposed method, yielding better information for environment recognition by mobile robots. We describe the proposed method and an experimental result demonstrating its potential.
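The abstract describes associating sonar reflecting points (which carry accurate range and bearing) with vision edge directions (bearing only) by comparing angles. The paper itself does not give the algorithm here, so the following is only a minimal sketch of that general angle-based association idea; the function and parameter names (`fuse_by_angle`, `tol_deg`) and the tolerance-matching rule are hypothetical, not taken from the paper.

```python
def fuse_by_angle(sonar_points, vision_edges, tol_deg=5.0):
    """Pair each sonar reflecting point (range, bearing) with the nearest
    vision edge bearing, if one lies within an angular tolerance.

    sonar_points: list of (range_m, bearing_deg) in the robot frame
    vision_edges: list of bearing_deg in the same frame
    Returns list of (range_m, refined_bearing_deg).
    Hypothetical sketch -- not the authors' published algorithm.
    """
    fused = []
    for rng, bearing in sonar_points:
        best, best_diff = None, tol_deg
        for edge in vision_edges:
            # angular difference with wrap-around at +/-180 degrees
            diff = abs((bearing - edge + 180.0) % 360.0 - 180.0)
            if diff <= best_diff:
                best, best_diff = edge, diff
        if best is not None:
            # vision refines the direction; sonar supplies the range
            fused.append((rng, best))
    return fused
```

Under this sketch, a sonar point whose bearing has no vision edge within the tolerance is simply not fused; the complementary idea (accurate sonar range plus accurate vision bearing) matches the abstract's claim that the two sensors compensate for each other.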

Original language: English
Pages (from-to): 3925-3930
Number of pages: 6
Journal: Proceedings - IEEE International Conference on Robotics and Automation
Volume: 4
Publication status: Published - 2000 Dec 3
Event: ICRA 2000: IEEE International Conference on Robotics and Automation - San Francisco, CA, USA
Duration: 2000 Apr 24 - 2000 Apr 28

ASJC Scopus subject areas

  • Software
  • Control and Systems Engineering
  • Artificial Intelligence
  • Electrical and Electronic Engineering