Arbitrary Viewpoint Video Synthesis from Multiple Uncalibrated Cameras

Satoshi Yaguchi, Hideo Saito

Research output: Contribution to journal › Article

19 Citations (Scopus)

Abstract

We propose a method for arbitrary view synthesis from an uncalibrated multiple-camera system, targeting large spaces such as soccer stadiums. In Projective Grid Space (PGS), a three-dimensional space defined by the epipolar geometry between two basis cameras in the camera system, we reconstruct three-dimensional shape models from silhouette images. Using the three-dimensional shape models reconstructed in the PGS, we obtain a dense map of point correspondences between reference images. From the obtained correspondences, we can synthesize an image of an arbitrary view between the reference images. We also propose a method for merging the synthesized images with a virtual background scene in the PGS. We apply the proposed methods to image sequences taken by a multiple-camera system installed in a large concert hall. The synthesized image sequences of the virtual camera have sufficient quality to demonstrate the effectiveness of the proposed method.
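The two steps summarized in the abstract, locating a PGS point through the epipolar geometry of the two basis cameras and synthesizing an in-between view from a dense correspondence, can be illustrated with a short sketch. The code below is not the authors' implementation: it assumes NumPy, fundamental matrices F_ab (basis camera A to basis camera B), F_ai and F_bi (basis cameras to an arbitrary camera i), and a precomputed dense correspondence, and it follows a common PGS convention in which (p, q) is a pixel in basis camera A and r fixes the matching point along the epipolar line in basis camera B. All function names are illustrative.

import numpy as np


def pgs_point_in_basis_cameras(p, q, r, F_ab):
    """Images of PGS point (p, q, r) in the two basis cameras.

    (p, q) is the pixel in basis camera A; r is taken as the horizontal
    coordinate of the matching point on the epipolar line F_ab @ [p, q, 1]
    in basis camera B (one common PGS convention; an assumption here).
    """
    x_a = np.array([p, q, 1.0])
    a, b, c = F_ab @ x_a            # epipolar line a*u + b*v + c = 0 in camera B
    v = -(a * r + c) / b            # point on that line with u = r
    x_b = np.array([r, v, 1.0])
    return x_a, x_b


def project_to_camera(x_a, x_b, F_ai, F_bi):
    """Project a PGS point into camera i.

    The projection is the intersection of the two epipolar lines induced by
    the point's images in the basis cameras (F_ai: A -> i, F_bi: B -> i).
    """
    l_a = F_ai @ x_a
    l_b = F_bi @ x_b
    x_i = np.cross(l_a, l_b)        # homogeneous intersection of the two lines
    return x_i / x_i[2]


def interpolate_view(img1, img2, corr1, corr2, w):
    """Tiny forward-warping view interpolation between two reference images.

    corr1, corr2: (N, 2) arrays of corresponding pixel positions (u, v) in
    img1 and img2; w in [0, 1] blends position and color toward img2.
    """
    h, width = img1.shape[:2]
    out = np.zeros_like(img1, dtype=np.float32)
    pos = (1.0 - w) * corr1 + w * corr2
    c1 = img1[corr1[:, 1].astype(int), corr1[:, 0].astype(int)].astype(np.float32)
    c2 = img2[corr2[:, 1].astype(int), corr2[:, 0].astype(int)].astype(np.float32)
    color = (1.0 - w) * c1 + w * c2
    u = np.clip(np.rint(pos[:, 0]).astype(int), 0, width - 1)
    v = np.clip(np.rint(pos[:, 1]).astype(int), 0, h - 1)
    out[v, u] = color               # splat blended colors at interpolated positions
    return out

In the paper's setting the dense correspondence comes from projecting the silhouette-based shape model, reconstructed in PGS, into the two reference cameras; the sketch simply takes such a correspondence as given.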

Original language: English
Pages (from-to): 430-439
Number of pages: 10
Journal: IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
Volume: 34
Issue number: 1
DOIs
Publication status: Published - 2004 Feb 1

Keywords

  • Fundamental matrix
  • Projective geometry
  • Projective grid space
  • Shape from multiple cameras
  • View interpolation
  • Virtual view synthesis

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Software
  • Information Systems
  • Human-Computer Interaction
  • Computer Science Applications
  • Electrical and Electronic Engineering
