Real-time floor recognition in indoor environments using TOF Camera

Masafumi Nakagawa, Tamaki Kobayashi

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Navigation in public spaces is an essential technique for pedestrian guidance and autonomous robot assistance. Navigation systems mainly consist of positioning systems, maps, route finding, and a user interface. Image acquisition or 3D sensing is also required to recognize the unknown environment around a pedestrian or robot. We therefore aim to develop a sense-and-avoid application for pedestrians and autonomous robots. We propose a real-time methodology for recognizing the traversable area in an indoor environment using an active depth imager. In our experiments, we used a handheld time-of-flight (TOF) camera. We selected corridors, stairs, large rooms, and our laboratory on our campus as study areas. These areas contain walls, glass walls, steps, and gaping holes. These objects were recognized with online processing, and obstacle-avoidance notifications were issued after object recognition. Our experiments confirmed that our approach can process 3D measurements, classify objects, and issue obstacle-avoidance notifications in real time.
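The abstract describes classifying TOF-camera measurements into walkable floor and hazards such as steps and gaping holes. The paper does not publish its algorithm here, so the following is only an illustrative sketch under simple assumptions: a pinhole depth camera with known intrinsics (`fx`, `fy`, `cx`, `cy`), the camera y axis pointing down, and a known fixed camera height above the floor. It back-projects a depth image to a point cloud and labels each point by its height relative to the floor plane; all function and constant names are hypothetical.

```python
import numpy as np

# Hypothetical labels for illustration only (not the authors' scheme).
FLOOR, OBSTACLE, HOLE = 0, 1, 2

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth image (metres) into camera-frame 3D points."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.stack([x, y, depth], axis=-1)

def classify_points(points, camera_height=1.0, tol=0.05):
    """Label each point as floor, obstacle, or hole by height above the floor.

    With y pointing down, a floor point sits at y ~ camera_height, so its
    height above the floor plane is ~0; negative height means the surface
    lies below floor level (a hole).
    """
    height = camera_height - points[..., 1]          # 0 on the floor plane
    labels = np.full(points.shape[:-1], OBSTACLE, dtype=np.int8)
    labels[np.abs(height) <= tol] = FLOOR            # walkable surface
    labels[height < -tol] = HOLE                     # surface below floor level
    return labels
```

A real sense-and-avoid pipeline would additionally need plane fitting (the camera is handheld, so its height and tilt vary frame to frame), but per-point height banding conveys the basic idea of separating floor from steps and holes.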

Original language: English
Title of host publication: 37th Asian Conference on Remote Sensing, ACRS 2016
Publisher: Asian Association on Remote Sensing
Pages: 399-402
Number of pages: 4
Volume: 1
ISBN (Electronic): 9781510834613
Publication status: Published - 2016
Event: 37th Asian Conference on Remote Sensing, ACRS 2016 - Colombo, Sri Lanka
Duration: 2016 Oct 17 - 2016 Oct 21



Keywords

  • Indoor navigation
  • Object recognition
  • Point cloud
  • Time-of-flight camera

ASJC Scopus subject areas

  • Computer Networks and Communications

Cite this

Nakagawa, M., & Kobayashi, T. (2016). Real-time floor recognition in indoor environments using TOF Camera. In 37th Asian Conference on Remote Sensing, ACRS 2016 (Vol. 1, pp. 399-402). Asian Association on Remote Sensing.
