A Color Mask and Trained Image Set for the Creation of New Technique for Indoor Robotic Navigation

K. M.H.K. Warnakulasooriya, B. H. Sudantha, C. Premachandra

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

Among the different methods used in the indoor navigation of robotic systems, image processing techniques are widely used. This research explored a new approach to image capturing and matching in order to enhance the indoor navigation of moving robots. Color extraction was carried out by performing a logical operation between a color mask and the original image, where the color mask was created through bitwise range analysis. The final comparison was obtained by analyzing the histograms of the color-extracted image and the trained source image.
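The abstract describes a mask-and-histogram pipeline: a color mask built from a bitwise range check, a logical AND with the original image, and a histogram comparison against a trained image. Below is a minimal sketch of that pipeline in Python with OpenCV; the HSV threshold values, histogram bin count, and file names are illustrative assumptions, not values taken from the paper.

```python
# Minimal sketch, assuming OpenCV (cv2) and NumPy are available.
import cv2
import numpy as np

def extract_color(image_bgr, lower_hsv, upper_hsv):
    """Build a color mask via bitwise range analysis, then AND it with the original image."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, lower_hsv, upper_hsv)             # bitwise range check -> binary mask
    return cv2.bitwise_and(image_bgr, image_bgr, mask=mask)   # logical operation with the original

def compare_to_trained(extracted_bgr, trained_bgr, bins=32):
    """Compare color histograms of the extracted image and a trained source image."""
    h1 = cv2.calcHist([extracted_bgr], [0, 1, 2], None, [bins] * 3, [0, 256] * 3)
    h2 = cv2.calcHist([trained_bgr], [0, 1, 2], None, [bins] * 3, [0, 256] * 3)
    cv2.normalize(h1, h1)
    cv2.normalize(h2, h2)
    return cv2.compareHist(h1, h2, cv2.HISTCMP_CORREL)        # 1.0 means identical distributions

# Example usage (file names and HSV range are hypothetical):
frame = cv2.imread("camera_frame.png")
trained = cv2.imread("trained_landmark.png")
extracted = extract_color(frame, np.array([100, 80, 80]), np.array([130, 255, 255]))
similarity = compare_to_trained(extracted, trained)
```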

Original language: English
Title of host publication: 2018 3rd International Conference on Information Technology Research, ICITR 2018
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781728114705
Publication status: Published - 2018 Dec
Event: 3rd International Conference on Information Technology Research, ICITR 2018 - Moratuwa, Sri Lanka
Duration: 2018 Dec 5 - 2018 Dec 7

Publication series

Name: 2018 3rd International Conference on Information Technology Research, ICITR 2018

Conference

Conference: 3rd International Conference on Information Technology Research, ICITR 2018
Country: Sri Lanka
City: Moratuwa
Period: 18/12/5 - 18/12/7

Keywords

  • color detection
  • color histogram
  • color masking
  • image comparison
  • image training
  • robotic indoor navigation

ASJC Scopus subject areas

  • Artificial Intelligence
  • Computer Networks and Communications
  • Computer Science Applications
  • Information Systems and Management
  • Media Technology

