3D shape reconstructing system from multiple view images using octree and silhouette

Daisuke Iso, Hideo Saito, Shinji Ozawa

Research output: Contribution to journal › Article › peer-review

Abstract

In this paper, we describe a 3D shape reconstruction system that operates on multiple-view images using an octree and silhouettes. Our system consists of four calibrated cameras. Each camera is connected to a PC that locally extracts the silhouette from the image captured by that camera. The four silhouette images and camera images are then sent to a host computer, which performs the 3D reconstruction. To make the reconstruction faster, the object's 3D space is represented by an octree structure. If an octant does not consist entirely of the same type of voxel, it is further subdivided until homogeneous cubes, possibly single voxels, are obtained. Allocating these cubes and projecting them into all silhouette images, we intersect each projected cube region with the silhouette region. We develop a new algorithm for fast octree construction. The algorithm reduces the time complexity of checking whether a node should project its 8 cube vertices onto the image plane, using a stack that keeps the parents' temporary cube types. With this algorithm, our system runs in semi real time (about 5 frames per second) when generating the 3D shape of a human in voxel representation.
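The carving loop the abstract describes (classify each octree cube against all silhouette images, subdivide non-homogeneous cubes) can be sketched as follows. This is a minimal illustration under simplifying assumptions, not the paper's implementation: each cube is tested by projecting only its 8 corners (the paper intersects the full projected cube region with the silhouette), the `project` callbacks and the view layout are hypothetical, and the paper's stack-based speedup is omitted.

```python
import numpy as np

# Temporary cube types used during octree construction.
EMPTY, FULL, PARTIAL = 0, 1, 2

def classify(center, half, views):
    """Classify a cube against every silhouette view.

    `views` is a list of (silhouette, project) pairs, where `silhouette`
    is a binary image and `project` maps an (N, 3) array of world points
    to (N, 2) pixel coordinates (both hypothetical interfaces).
    Simplification: the cube is tested by its 8 projected corners rather
    than by its full projected region, as the paper does.
    """
    offsets = np.array([[sx, sy, sz] for sx in (-1, 1)
                        for sy in (-1, 1) for sz in (-1, 1)], dtype=float)
    corners = np.asarray(center, dtype=float) + half * offsets
    result = FULL
    for sil, project in views:
        h, w = sil.shape
        uv = np.round(project(corners)).astype(int)
        inside = [0 <= v < h and 0 <= u < w and bool(sil[v, u])
                  for u, v in uv]
        if not any(inside):
            return EMPTY      # outside one silhouette: carve the cube away
        if not all(inside):
            result = PARTIAL  # straddles a silhouette boundary: subdivide
    return result

def carve(center, half, depth, views, out):
    """Recursively subdivide PARTIAL cubes; collect FULL (or leaf) cubes."""
    t = classify(center, half, views)
    if t == EMPTY:
        return
    if t == FULL or depth == 0:
        out.append((np.asarray(center, dtype=float), half))
        return
    q = half / 2.0
    for sx in (-1, 1):
        for sy in (-1, 1):
            for sz in (-1, 1):
                child = np.asarray(center, dtype=float) + q * np.array([sx, sy, sz])
                carve(child, q, depth - 1, views, out)
```

The abstract's fast variant presumably exploits the fact that a cube fully inside a silhouette has all its descendants inside it too: a stack holding the parents' temporary cube types lets a child skip re-projecting its 8 vertices in those views. The sketch above does not include that optimization.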

Original language: English
Pages (from-to): 115-124
Number of pages: 10
Journal: Proceedings of SPIE - The International Society for Optical Engineering
Volume: 4572
DOIs
Publication status: Published - 2001 Jan 1

Keywords

  • 3D Reconstruction
  • Camera Calibration
  • Multiple View Images
  • Octree
  • Shape from Silhouette

ASJC Scopus subject areas

  • Electronic, Optical and Magnetic Materials
  • Condensed Matter Physics
  • Computer Science Applications
  • Applied Mathematics
  • Electrical and Electronic Engineering
