Acquiring high-quality 3D-image data in real time from multiple 3D-image sensors is problematic because of three difficulties: 1) latency, 2) calibration, and 3) limited network bandwidth. First, the delay from sensing to obtaining detection results must be within a few hundred milliseconds. Second, time calibration and spatial calibration are essential when data are merged from multiple 3D-image sensors. Lastly, because of the large volume of data produced by 3D-image sensors, part of the data cannot be delivered under limited bandwidth when data are merged over the network. In this paper, to tackle these difficulties, we propose a scheme for real-time 3D-image sensing with multiple 3D-image sensors. The proposed scheme consists of a data selection method that reduces the data volume while suppressing computational complexity, and a data merging method that merges data from multiple 3D-image sensors while satisfying the required periodic deadline. We develop a prototype system that implements the proposed scheme. We measure the processing delay and confirm that the proposed scheme satisfies the delay requirement. We also measure the quality of merged data collected from multiple 3D-image sensors using the proposed scheme, through experiments in both laboratory and real-world field environments. From the measurement results, we verify that the proposed scheme resolves the difficulties: it merges data from multiple 3D-image sensors and maintains high data quality in real time by reducing the data volume under strict bandwidth limitations, in both laboratory and field settings.
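To illustrate the two components described above, the following is a minimal sketch, not the paper's actual algorithm: all function names, parameters, and the subsampling strategy are assumptions. It shows the general idea of selecting a bandwidth-bounded subset of each sensor's point data with low computational cost, then merging per-sensor frames while respecting a periodic deadline.

```python
import time

# Hypothetical sketch only; the paper's data selection and merging methods
# are not specified here, so names and strategies below are illustrative.

def select_points(points, max_points):
    """Data selection: cap the data volume by uniform stride subsampling.

    Runs in O(n) with no sorting or spatial indexing, keeping the
    computational complexity low, as the scheme requires.
    """
    if len(points) <= max_points:
        return points
    stride = len(points) / max_points
    return [points[int(i * stride)] for i in range(max_points)]

def merge_frames(frames_by_sensor, deadline_s, max_points_per_sensor,
                 now=time.monotonic):
    """Data merging: combine per-sensor frames within a periodic deadline.

    Sensors processed after the deadline has elapsed are skipped, so the
    merged result is always delivered on time, possibly at reduced quality.
    """
    start = now()
    merged = []
    for sensor_id, points in frames_by_sensor.items():
        if now() - start > deadline_s:  # deadline exceeded: stop merging
            break
        merged.extend(select_points(points, max_points_per_sensor))
    return merged
```

For example, merging two sensors where one frame has 100 points and the other 10, with a cap of 20 points per sensor, yields 30 merged points: the large frame is subsampled to 20 while the small one passes through unchanged.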