3DTV view generation using uncalibrated cameras

Songkran Jarusirisawad, Hideo Saito

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

3 Citations (Scopus)

Abstract

This paper proposes a method for synthesizing free viewpoint video captured by multiple uncalibrated cameras. Each camera may be zoomed and rotated freely during capture, and neither the intrinsic nor the extrinsic parameters of the cameras are known. Projective Grid Space (PGS), a 3D space defined by the epipolar geometry of two basis cameras, is employed to calibrate the dynamic multiple cameras, because the geometric relations among cameras in PGS are obtained from 2D-2D point correspondences between views. We use Keypoint Recognition to find corresponding points in the natural scene and register each camera to PGS. The moving object is segmented via graph-cut optimization, and free viewpoint video is then synthesized from the reconstructed visual hull. Experimental results show that free viewpoint video captured by uncalibrated cameras is successfully synthesized with the proposed method.
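
The two geometric steps the abstract relies on, registering an uncalibrated camera to PGS from 2D-2D correspondences and transferring a PGS point into a registered view through epipolar geometry, can be sketched roughly as below. This is a minimal illustration, not the authors' implementation: SIFT matching with a ratio test and RANSAC stands in for the paper's Keypoint Recognition, the PGS coordinate convention assumed in project_pgs_point (p, q taken in basis view 1, r along the corresponding epipolar line in basis view 2) follows the wider PGS literature rather than anything stated in this abstract, and all function names are illustrative.

```python
# Minimal sketch (assumptions noted above): register a freely zooming/rotating
# camera to Projective Grid Space (PGS) using only 2D-2D correspondences, and
# project a PGS point into that camera by intersecting transferred epipolar lines.
import cv2
import numpy as np


def register_camera_to_pgs(basis_img, cam_img, ratio=0.75):
    """Estimate the fundamental matrix F with x_cam^T F x_basis = 0 from natural
    keypoint matches, so no intrinsic or extrinsic calibration is needed.
    (SIFT + RANSAC here; the paper itself uses Keypoint Recognition.)"""
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(basis_img, None)
    kp2, des2 = sift.detectAndCompute(cam_img, None)

    matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des1, des2, k=2)
    good = [m for m, n in matches if m.distance < ratio * n.distance]  # Lowe ratio test

    pts_basis = np.float32([kp1[m.queryIdx].pt for m in good])
    pts_cam = np.float32([kp2[m.trainIdx].pt for m in good])
    F, _ = cv2.findFundamentalMat(pts_basis, pts_cam, cv2.FM_RANSAC, 1.0, 0.999)
    return F


def project_pgs_point(F1_cam, F2_cam, F12, p, q, r):
    """Project PGS point (p, q, r) into a registered camera.

    (p, q): image coordinates in basis camera 1; r: horizontal coordinate of the
    matching point on the epipolar line in basis camera 2 (assumed convention).
    F12 maps basis view 1 to epipolar lines in basis view 2; F1_cam and F2_cam
    map the basis views to epipolar lines in the target camera."""
    x1 = np.array([p, q, 1.0])
    a, b, c = F12 @ x1                          # line a*x + b*y + c = 0 in basis view 2
    x2 = np.array([r, -(a * r + c) / b, 1.0])   # point on that line at x = r (needs b != 0)

    l1 = F1_cam @ x1                            # epipolar line of x1 in the target view
    l2 = F2_cam @ x2                            # epipolar line of x2 in the target view
    x = np.cross(l1, l2)                        # homogeneous intersection of the two lines
    return x[:2] / x[2]                         # pixel coordinates in the target camera
```

Silhouettes from the graph-cut segmentation would then be combined with this point transfer to carve the visual hull in PGS and render the free viewpoint video; that part is not sketched here.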

Original language: English
Title of host publication: 2008 3DTV-Conference: The True Vision - Capture, Transmission and Display of 3D Video, 3DTV-CON 2008 Proceedings
Pages: 57-60
Number of pages: 4
DOIs: https://doi.org/10.1109/3DTV.2008.4547807
Publication status: Published - 2008
Event: 2008 3DTV-Conference: The True Vision - Capture, Transmission and Display of 3D Video, 3DTV-CON 2008 - Istanbul, Turkey
Duration: 2008 May 28 - 2008 May 30

Other

Other: 2008 3DTV-Conference: The True Vision - Capture, Transmission and Display of 3D Video, 3DTV-CON 2008
Country: Turkey
City: Istanbul
Period: 08/5/28 - 08/5/30

Keywords

  • Calibration
  • Image registration
  • Image synthesis

ASJC Scopus subject areas

  • Computer Vision and Pattern Recognition
  • Electrical and Electronic Engineering

Cite this

Jarusirisawad, S., & Saito, H. (2008). 3DTV view generation using uncalibrated cameras. In 2008 3DTV-Conference: The True Vision - Capture, Transmission and Display of 3D Video, 3DTV-CON 2008 Proceedings (pp. 57-60). [4547807] https://doi.org/10.1109/3DTV.2008.4547807

3DTV view generation using uncalibrated cameras. / Jarusirisawad, Songkran; Saito, Hideo.

2008 3DTV-Conference: The True Vision - Capture, Transmission and Display of 3D Video, 3DTV-CON 2008 Proceedings. 2008. p. 57-60 4547807.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Jarusirisawad, S & Saito, H 2008, 3DTV view generation using uncalibrated cameras. in 2008 3DTV-Conference: The True Vision - Capture, Transmission and Display of 3D Video, 3DTV-CON 2008 Proceedings., 4547807, pp. 57-60, 2008 3DTV-Conference: The True Vision - Capture, Transmission and Display of 3D Video, 3DTV-CON 2008, Istanbul, Turkey, 08/5/28. https://doi.org/10.1109/3DTV.2008.4547807
Jarusirisawad S, Saito H. 3DTV view generation using uncalibrated cameras. In 2008 3DTV-Conference: The True Vision - Capture, Transmission and Display of 3D Video, 3DTV-CON 2008 Proceedings. 2008. p. 57-60. 4547807 https://doi.org/10.1109/3DTV.2008.4547807
Jarusirisawad, Songkran ; Saito, Hideo. / 3DTV view generation using uncalibrated cameras. 2008 3DTV-Conference: The True Vision - Capture, Transmission and Display of 3D Video, 3DTV-CON 2008 Proceedings. 2008. pp. 57-60
@inproceedings{7fa68ab3829a492889e29bfa11384365,
title = "3DTV view generation using uncalibrated cameras",
abstract = "This paper proposes a method for synthesizing free viewpoint video captured by multiple uncalibrated cameras. Each camera may be zoomed and rotated freely during capture, and neither the intrinsic nor the extrinsic parameters of the cameras are known. Projective Grid Space (PGS), a 3D space defined by the epipolar geometry of two basis cameras, is employed to calibrate the dynamic multiple cameras, because the geometric relations among cameras in PGS are obtained from 2D-2D point correspondences between views. We use Keypoint Recognition to find corresponding points in the natural scene and register each camera to PGS. The moving object is segmented via graph-cut optimization, and free viewpoint video is then synthesized from the reconstructed visual hull. Experimental results show that free viewpoint video captured by uncalibrated cameras is successfully synthesized with the proposed method.",
keywords = "Calibration, Image registration, Image synthesis",
author = "Songkran Jarusirisawad and Hideo Saito",
year = "2008",
doi = "10.1109/3DTV.2008.4547807",
language = "English",
isbn = "9781424417551",
pages = "57--60",
booktitle = "2008 3DTV-Conference: The True Vision - Capture, Transmission and Display of 3D Video, 3DTV-CON 2008 Proceedings",

}

TY - GEN

T1 - 3DTV view generation using uncalibrated cameras

AU - Jarusirisawad, Songkran

AU - Saito, Hideo

PY - 2008

Y1 - 2008

N2 - This paper proposes a method for synthesizing free viewpoint video captured by multiple uncalibrated cameras. Each camera may be zoomed and rotated freely during capture, and neither the intrinsic nor the extrinsic parameters of the cameras are known. Projective Grid Space (PGS), a 3D space defined by the epipolar geometry of two basis cameras, is employed to calibrate the dynamic multiple cameras, because the geometric relations among cameras in PGS are obtained from 2D-2D point correspondences between views. We use Keypoint Recognition to find corresponding points in the natural scene and register each camera to PGS. The moving object is segmented via graph-cut optimization, and free viewpoint video is then synthesized from the reconstructed visual hull. Experimental results show that free viewpoint video captured by uncalibrated cameras is successfully synthesized with the proposed method.

AB - This paper proposes a method for synthesizing free viewpoint video captured by multiple uncalibrated cameras. Each camera may be zoomed and rotated freely during capture, and neither the intrinsic nor the extrinsic parameters of the cameras are known. Projective Grid Space (PGS), a 3D space defined by the epipolar geometry of two basis cameras, is employed to calibrate the dynamic multiple cameras, because the geometric relations among cameras in PGS are obtained from 2D-2D point correspondences between views. We use Keypoint Recognition to find corresponding points in the natural scene and register each camera to PGS. The moving object is segmented via graph-cut optimization, and free viewpoint video is then synthesized from the reconstructed visual hull. Experimental results show that free viewpoint video captured by uncalibrated cameras is successfully synthesized with the proposed method.

KW - Calibration

KW - Image registration

KW - Image synthesis

UR - http://www.scopus.com/inward/record.url?scp=50949119921&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=50949119921&partnerID=8YFLogxK

U2 - 10.1109/3DTV.2008.4547807

DO - 10.1109/3DTV.2008.4547807

M3 - Conference contribution

SN - 9781424417551

SP - 57

EP - 60

BT - 2008 3DTV-Conference: The True Vision - Capture, Transmission and Display of 3D Video, 3DTV-CON 2008 Proceedings

ER -