We present a way of capturing the visual appearance of a real environment such as the interior of a room. We propose a method for generating arbitrary-viewpoint images by constructing a light field with an omni-directional camera. In this method, the omni-directional camera positions of the input image sequences are estimated automatically by extending Zhang's homography-based camera calibration method to omni-directional cameras. We also store the light field in a B-tree data structure to improve the efficiency of virtual-view image synthesis. Our method thus allows the user to explore a virtual environment with a wide field of view and a realistic appearance. To demonstrate the proposed method, we captured our lab's interior with an omni-directional camera and successfully generated arbitrary-viewpoint images for a virtual tour of the lab environment.
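The pose-estimation step builds on Zhang's calibration method, whose core operation is estimating the homography between a planar calibration pattern and its image from point correspondences. As a minimal illustration (not the authors' implementation, which further extends this to an omni-directional camera model), the standard homography can be recovered with the direct linear transform (DLT):

```python
import numpy as np

def estimate_homography(src, dst):
    """Estimate a 3x3 homography H with dst ~ H @ src via the DLT.

    src, dst: (N, 2) arrays of corresponding points, N >= 4,
    not all collinear. H is the null vector of the stacked
    linear constraints, found with an SVD.
    """
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        # Each correspondence contributes two linear equations in H's entries.
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    H = vt[-1].reshape(3, 3)          # right singular vector of the
    return H / H[2, 2]                # smallest singular value, normalized

def apply_homography(H, pts):
    """Map (N, 2) points through H using homogeneous coordinates."""
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])
    mapped = pts_h @ H.T
    return mapped[:, :2] / mapped[:, 2:3]
```

In Zhang's method, one such homography per view of the planar pattern yields constraints on the camera intrinsics; the paper's contribution is adapting this pipeline to the projection geometry of an omni-directional camera.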
|Journal||Kyokai Joho Imeji Zasshi/Journal of the Institute of Image Information and Television Engineers|
|Publication status||Published - October 2004|
ASJC Scopus subject areas
- Computer Science Applications