Abstract
We present a way of capturing the visual appearance of a real environment such as the interior of a room. We propose a method for generating arbitrary viewpoint images by constructing a light field with an omni-directional camera. In this method, the omni-directional camera positions of the input image sequences are automatically estimated by extending Zhang's homography-based camera calibration method to omni-directional cameras. We also store the light field in a B-Tree data structure to improve the efficiency of virtual view image synthesis. Our method thus allows the user to explore a realistically rendered virtual environment with a wide field of view. To demonstrate the proposed method, we captured our lab's interior with an omni-directional camera and successfully generated arbitrary viewpoint images for a virtual tour of the lab environment.
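The camera-position estimation builds on Zhang's planar calibration, in which each view of a calibration plane yields a plane-to-image homography from which the shared intrinsics and per-view poses are recovered. The paper's actual contribution is an extension of this idea to omni-directional cameras, which is not reproduced here; the sketch below only illustrates the standard perspective-camera case using OpenCV's implementation of Zhang's method. The checkerboard geometry, square size, and image paths are assumptions for demonstration.

```python
# Illustrative sketch: standard Zhang-style planar calibration with OpenCV.
# The paper extends this approach to omni-directional cameras; the board
# layout, square size, and file paths below are assumed for this example.
import glob
import numpy as np
import cv2

board_size = (9, 6)      # inner corners of an assumed checkerboard pattern
square_size = 0.025      # assumed square edge length in metres

# 3D coordinates of the board corners on the calibration plane (Z = 0).
obj_grid = np.zeros((board_size[0] * board_size[1], 3), np.float32)
obj_grid[:, :2] = np.mgrid[0:board_size[0], 0:board_size[1]].T.reshape(-1, 2)
obj_grid *= square_size

object_points, image_points = [], []
for path in glob.glob("calib_images/*.png"):   # assumed input image location
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, board_size)
    if found:
        object_points.append(obj_grid)
        image_points.append(corners)

# Each detected view gives a plane-to-image homography; calibrateCamera
# solves for the shared intrinsic matrix K, the distortion coefficients,
# and the per-view pose (rvec, tvec), following Zhang's formulation.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    object_points, image_points, gray.shape[::-1], None, None)
print("reprojection RMS:", rms)
print("intrinsic matrix K:\n", K)
```

The recovered per-view poses play the role of the camera positions that the paper estimates for its input image sequences; in the omni-directional setting the projection model and homography parameterization differ from the pinhole case shown above.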
| Original language | English |
|---|---|
| Pages (from-to) | 1489-1496 |
| Number of pages | 8 |
| Journal | Kyokai Joho Imeji Zasshi/Journal of the Institute of Image Information and Television Engineers |
| Volume | 58 |
| Issue number | 10 |
| DOIs | |
| Publication status | Published - 2004 Oct |
ASJC Scopus subject areas
- Media Technology
- Computer Science Applications
- Electrical and Electronic Engineering