Virtual presentation of wide environment by constructing Omni-directional light field

Hiroshi Todoroki, Hideo Saito

Research output: Contribution to journal › Article

Abstract

We present a way of capturing the visual appearance of a real environment such as the interior of a room. We propose a method for generating arbitrary viewpoint images by constructing a light field with an omni-directional camera. In this method, the omni-directional camera positions of input image sequences are automatically estimated by extending Zhang's homography-based camera calibration method to omni-directional cameras. We also use a B-Tree data structure for the light field, to improve the efficiency of virtual view image synthesis. Thus our method allows the user to explore a virtual environment with a wide field of view that achieves a realistic representation. To demonstrate the proposed method, we captured our lab's interior with an omni-directional camera, and successfully generated arbitrary viewpoint images for a virtual tour of the lab environment.
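The abstract mentions using a B-Tree over the light field so that the captured rays nearest a virtual viewpoint can be found quickly during view synthesis. As a minimal sketch of that idea (not the authors' implementation), the fragment below indexes omni-directional frames by camera position along a 1-D capture path using a sorted array with binary search, giving the same O(log n) nearest-sample lookup a B-Tree provides; all names (`LightFieldIndex`, `omni_frame_at_*`) are hypothetical.

```python
import bisect

class LightFieldIndex:
    """Index light-field samples by camera position so the captured
    ray nearest a virtual viewpoint is found in O(log n) time."""

    def __init__(self):
        self._positions = []  # sorted camera positions (search keys)
        self._samples = []    # payloads, kept parallel to _positions

    def insert(self, position, sample):
        # Keep both lists sorted by position on every insertion.
        i = bisect.bisect_left(self._positions, position)
        self._positions.insert(i, position)
        self._samples.insert(i, sample)

    def nearest(self, position):
        # Binary-search the insertion point, then compare its two
        # neighbours to pick the closest captured camera position.
        i = bisect.bisect_left(self._positions, position)
        candidates = [j for j in (i - 1, i) if 0 <= j < len(self._positions)]
        best = min(candidates, key=lambda j: abs(self._positions[j] - position))
        return self._samples[best]

lf = LightFieldIndex()
for x in (0.0, 1.0, 2.5, 4.0):
    lf.insert(x, f"omni_frame_at_{x}")
print(lf.nearest(2.3))  # → omni_frame_at_2.5
```

In the paper's setting the keys would be the automatically estimated omni-directional camera positions, and the payloads the corresponding captured images; a B-Tree generalizes this lookup to disk-friendly, multi-way nodes.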

Original language: English
Pages (from-to): 1489-1496
Number of pages: 8
Journal: Kyokai Joho Imeji Zasshi/Journal of the Institute of Image Information and Television Engineers
Volume: 58
Issue number: 10
Publication status: Published - 2004 Oct


ASJC Scopus subject areas

  • Electronic, Optical and Magnetic Materials
  • Computer Vision and Pattern Recognition

Cite this

@article{9859f4bcaefd4a1cb625f58a7ce7937b,
title = "Virtual presentation of wide environment by constructing Omni-directional light field",
abstract = "We present a way of capturing the visual appearance of a real environment such as the interior of a room. We propose a method for generating arbitrary viewpoint images by constructing a light field with an omni-directional camera. In this method, the omni-directional camera positions of input image sequences are automatically estimated by extending Zhang's homography-based camera calibration method to omni-directional cameras. We also use a B-Tree data structure for the light field, to improve the efficiency of virtual view image synthesis. Thus our method allows the user to explore a virtual environment with a wide field of view that achieves a realistic representation. To demonstrate the proposed method, we captured our lab's interior with an omni-directional camera, and successfully generated arbitrary viewpoint images for a virtual tour of the lab environment.",
author = "Hiroshi Todoroki and Hideo Saito",
year = "2004",
month = "10",
language = "English",
volume = "58",
pages = "1489--1496",
journal = "Kyokai Joho Imeji Zasshi/Journal of the Institute of Image Information and Television Engineers",
issn = "1342-6907",
publisher = "Institute of Image Information and Television Engineers",
number = "10",

}

TY - JOUR

T1 - Virtual presentation of wide environment by constructing Omni-directional light field

AU - Todoroki, Hiroshi

AU - Saito, Hideo

PY - 2004/10

Y1 - 2004/10

N2 - We present a way of capturing the visual appearance of a real environment such as the interior of a room. We propose a method for generating arbitrary viewpoint images by constructing a light field with an omni-directional camera. In this method, the omni-directional camera positions of input image sequences are automatically estimated by extending Zhang's homography-based camera calibration method to omni-directional cameras. We also use a B-Tree data structure for the light field, to improve the efficiency of virtual view image synthesis. Thus our method allows the user to explore a virtual environment with a wide field of view that achieves a realistic representation. To demonstrate the proposed method, we captured our lab's interior with an omni-directional camera, and successfully generated arbitrary viewpoint images for a virtual tour of the lab environment.

AB - We present a way of capturing the visual appearance of a real environment such as the interior of a room. We propose a method for generating arbitrary viewpoint images by constructing a light field with an omni-directional camera. In this method, the omni-directional camera positions of input image sequences are automatically estimated by extending Zhang's homography-based camera calibration method to omni-directional cameras. We also use a B-Tree data structure for the light field, to improve the efficiency of virtual view image synthesis. Thus our method allows the user to explore a virtual environment with a wide field of view that achieves a realistic representation. To demonstrate the proposed method, we captured our lab's interior with an omni-directional camera, and successfully generated arbitrary viewpoint images for a virtual tour of the lab environment.

UR - http://www.scopus.com/inward/record.url?scp=9944251810&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=9944251810&partnerID=8YFLogxK

M3 - Article

AN - SCOPUS:9944251810

VL - 58

SP - 1489

EP - 1496

JO - Kyokai Joho Imeji Zasshi/Journal of the Institute of Image Information and Television Engineers

JF - Kyokai Joho Imeji Zasshi/Journal of the Institute of Image Information and Television Engineers

SN - 1342-6907

IS - 10

ER -