A point-based rendering application with a high-speed spatial interpolation

Masafumi Nakagawa, Satoshi Kuramochi, Anna Nakanishi, Yuta Sugaya

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Citations (Scopus)

Abstract

Accurate 3-D scanning techniques are used in various applications. However, viewpoint translation in point-cloud rendering degrades visualization quality because occlusions become exposed and the point distribution becomes noticeably uneven. In addition, recent increases in scanning speed and scanning data volume demand large video memory and a high-clock-speed CPU. We have therefore developed a point-based rendering application with high-speed spatial interpolation that uses a point cloud to generate virtual reality (VR) data, called LiDAR VR. We focused on the following features. First, panorama imagery can be generated automatically from arbitrary viewpoints using a single observation data set. Second, spatial interpolation applied to the rendering results can reduce the processing load. These features were verified through case studies.
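The abstract describes interpolating over the rendering result, rather than densifying the point cloud itself, to hide the gaps that appear when the viewpoint is translated. The paper does not publish code, so the following is only a minimal sketch of that idea under assumed choices (an equirectangular panorama projection, a depth-only buffer, and a simple neighborhood fill); it is not the authors' implementation.

# Minimal sketch (assumptions, not the authors' method): project a point cloud
# from an arbitrary viewpoint into an equirectangular panorama depth image,
# then fill empty pixels from valid neighbors so that viewpoint translation
# leaves fewer visible holes.
import numpy as np

def render_panorama(points, viewpoint, width=1024, height=512):
    """Project Nx3 world-coordinate points into a panorama depth buffer."""
    rel = points - viewpoint                      # shift to the chosen viewpoint
    r = np.linalg.norm(rel, axis=1)
    az = np.arctan2(rel[:, 1], rel[:, 0])         # azimuth in [-pi, pi]
    el = np.arcsin(np.clip(rel[:, 2] / np.maximum(r, 1e-9), -1.0, 1.0))
    u = ((az + np.pi) / (2 * np.pi) * (width - 1)).astype(int)
    v = ((np.pi / 2 - el) / np.pi * (height - 1)).astype(int)
    depth = np.full((height, width), np.inf)
    order = np.argsort(r)[::-1]                   # write far points first,
    depth[v[order], u[order]] = r[order]          # so the nearest point wins
    return depth

def fill_gaps(depth, window=1):
    """Interpolate empty pixels from the mean of valid neighbors (one pass)."""
    filled = depth.copy()
    h, w = depth.shape
    for y, x in np.argwhere(np.isinf(depth)):
        y0, y1 = max(0, y - window), min(h, y + window + 1)
        x0, x1 = max(0, x - window), min(w, x + window + 1)
        valid = depth[y0:y1, x0:x1][np.isfinite(depth[y0:y1, x0:x1])]
        if valid.size:
            filled[y, x] = valid.mean()
    return filled

A real renderer would work on the GPU and interpolate color as well as depth; the sketch only illustrates filling gaps in the rendered panorama after projection, which is the kind of post-rendering spatial interpolation the abstract refers to.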

Original language: English
Title of host publication: 32nd Asian Conference on Remote Sensing 2011, ACRS 2011
Pages: 1571-1574
Number of pages: 4
Volume: 3
Publication status: Published - 2011
Event: 32nd Asian Conference on Remote Sensing 2011, ACRS 2011 - Taipei
Duration: 2011 Oct 3 - 2011 Oct 7

Other

Other: 32nd Asian Conference on Remote Sensing 2011, ACRS 2011
City: Taipei
Period: 2011/10/3 - 2011/10/7


Keywords

  • LiDAR
  • Panorama imagery
  • Point cloud
  • Point-based rendering
  • Terrestrial 3D mapping
  • VR

ASJC Scopus subject areas

  • Computer Networks and Communications

Cite this

Nakagawa, M., Kuramochi, S., Nakanishi, A., & Sugaya, Y. (2011). A point-based rendering application with a high-speed spatial interpolation. In 32nd Asian Conference on Remote Sensing 2011, ACRS 2011 (Vol. 3, pp. 1571-1574).