Fusion of omni-directional sonar and omni-directional vision for environment recognition of mobile robots

Teruko Yata, Akihisa Ohya, Shinichi Yuta

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

10 Citations (Scopus)

Abstract

This paper proposes a new method of sensor fusion of an omni-directional ultrasonic sensor and an omni-directional vision sensor. A new omni-directional sonar, which we developed, can measure the accurate distance and direction of reflecting points, and an omni-directional vision sensor can give the directions of segment edges. We propose a sensor fusion method, based on angles, that uses both the reflecting points measured by the sonar and the segment edges measured by the vision sensor. These data are different in character, so they compensate for each other in the proposed method, making it possible to obtain better information that is useful for environment recognition by mobile robots. We describe the proposed method and an experimental result that shows its potential.
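The fusion the abstract describes pairs each sonar reflecting point (accurate range and bearing) with the nearest vision edge direction by angle. A minimal illustrative sketch of such angle-based association is below; the function name, data layout, tolerance value, and the choice to adopt the vision bearing for matched points are assumptions for illustration, not details taken from the paper:

```python
def fuse_by_angle(sonar_points, vision_edges, tol_deg=5.0):
    """Associate sonar reflecting points with vision edge directions.

    sonar_points: list of (angle_deg, range_m) from the omni-directional sonar
    vision_edges: list of angle_deg (directions of segment edges) from vision
    Returns a list of (angle_deg, range_m, matched) landmarks: matched points
    adopt the vision edge bearing and keep the sonar range; unmatched points
    pass through as sonar-only landmarks.
    """
    fused = []
    for s_ang, s_rng in sonar_points:
        # find the vision edge closest in bearing, wrapping at 360 degrees
        best = None
        best_diff = tol_deg
        for v_ang in vision_edges:
            diff = abs((s_ang - v_ang + 180.0) % 360.0 - 180.0)
            if diff <= best_diff:
                best, best_diff = v_ang, diff
        if best is not None:
            fused.append((best, s_rng, True))    # vision bearing + sonar range
        else:
            fused.append((s_ang, s_rng, False))  # sonar-only landmark
    return fused
```

A sonar point at 10° with a nearby vision edge at 12° would be fused (bearing 12°, sonar range kept), while a sonar point with no edge within the tolerance remains a sonar-only landmark.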

Original language: English
Title of host publication: Proceedings - IEEE International Conference on Robotics and Automation
Pages: 3925-3930
Number of pages: 6
Volume: 4
Publication status: Published - 2000
Externally published: Yes
Event: ICRA 2000: IEEE International Conference on Robotics and Automation - San Francisco, CA, USA
Duration: 2000 Apr 24 - 2000 Apr 28

Other

Other: ICRA 2000: IEEE International Conference on Robotics and Automation
City: San Francisco, CA, USA
Period: 00/4/24 - 00/4/28

Fingerprint

Sonar
Mobile robots
Fusion reactions
Sensors
Ultrasonic sensors

ASJC Scopus subject areas

  • Software
  • Control and Systems Engineering

Cite this

Yata, T., Ohya, A., & Yuta, S. (2000). Fusion of omni-directional sonar and omni-directional vision for environment recognition of mobile robots. In Proceedings - IEEE International Conference on Robotics and Automation (Vol. 4, pp. 3925-3930).

Fusion of omni-directional sonar and omni-directional vision for environment recognition of mobile robots. / Yata, Teruko; Ohya, Akihisa; Yuta, Shinichi.

Proceedings - IEEE International Conference on Robotics and Automation. Vol. 4 2000. p. 3925-3930.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Yata, T, Ohya, A & Yuta, S 2000, Fusion of omni-directional sonar and omni-directional vision for environment recognition of mobile robots. in Proceedings - IEEE International Conference on Robotics and Automation. vol. 4, pp. 3925-3930, ICRA 2000: IEEE International Conference on Robotics and Automation, San Francisco, CA, USA, 00/4/24.
Yata T, Ohya A, Yuta S. Fusion of omni-directional sonar and omni-directional vision for environment recognition of mobile robots. In Proceedings - IEEE International Conference on Robotics and Automation. Vol. 4. 2000. p. 3925-3930
Yata, Teruko ; Ohya, Akihisa ; Yuta, Shinichi. / Fusion of omni-directional sonar and omni-directional vision for environment recognition of mobile robots. Proceedings - IEEE International Conference on Robotics and Automation. Vol. 4 2000. pp. 3925-3930
@inproceedings{5db9789024d14628a5b5b5f977db5bca,
title = "Fusion of omni-directional sonar and omni-directional vision for environment recognition of mobile robots",
abstract = "This paper proposes a new method of sensor fusion of an omni-directional ultrasonic sensor and an omni-directional vision sensor. A new omni-directional sonar, which we developed, can measure the accurate distance and direction of reflecting points, and an omni-directional vision sensor can give the directions of segment edges. We propose a sensor fusion method, based on angles, that uses both the reflecting points measured by the sonar and the segment edges measured by the vision sensor. These data are different in character, so they compensate for each other in the proposed method, making it possible to obtain better information that is useful for environment recognition by mobile robots. We describe the proposed method and an experimental result that shows its potential.",
author = "Teruko Yata and Akihisa Ohya and Shinichi Yuta",
year = "2000",
language = "English",
volume = "4",
pages = "3925--3930",
booktitle = "Proceedings - IEEE International Conference on Robotics and Automation",

}

TY - GEN

T1 - Fusion of omni-directional sonar and omni-directional vision for environment recognition of mobile robots

AU - Yata, Teruko

AU - Ohya, Akihisa

AU - Yuta, Shinichi

PY - 2000

Y1 - 2000

N2 - This paper proposes a new method of sensor fusion of an omni-directional ultrasonic sensor and an omni-directional vision sensor. A new omni-directional sonar, which we developed, can measure the accurate distance and direction of reflecting points, and an omni-directional vision sensor can give the directions of segment edges. We propose a sensor fusion method, based on angles, that uses both the reflecting points measured by the sonar and the segment edges measured by the vision sensor. These data are different in character, so they compensate for each other in the proposed method, making it possible to obtain better information that is useful for environment recognition by mobile robots. We describe the proposed method and an experimental result that shows its potential.

AB - This paper proposes a new method of sensor fusion of an omni-directional ultrasonic sensor and an omni-directional vision sensor. A new omni-directional sonar, which we developed, can measure the accurate distance and direction of reflecting points, and an omni-directional vision sensor can give the directions of segment edges. We propose a sensor fusion method, based on angles, that uses both the reflecting points measured by the sonar and the segment edges measured by the vision sensor. These data are different in character, so they compensate for each other in the proposed method, making it possible to obtain better information that is useful for environment recognition by mobile robots. We describe the proposed method and an experimental result that shows its potential.

UR - http://www.scopus.com/inward/record.url?scp=0033718316&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0033718316&partnerID=8YFLogxK

M3 - Conference contribution

VL - 4

SP - 3925

EP - 3930

BT - Proceedings - IEEE International Conference on Robotics and Automation

ER -