This paper proposes a novel algorithm that calibrates multiple cameras scattered across a broad area. The key idea of the proposed method is “using the position of an omnidirectional camera as a reference point.” The common approach to calibrating multiple cameras assumes that the cameras capture at least some common points, so calibration becomes quite difficult when the cameras’ fields of view (FOVs) share no points. The proposed method instead uses the position of an omnidirectional camera to establish point correspondences. The position of the omnidirectional camera relative to each calibrated camera is estimated via epipolar geometry, even when the omnidirectional camera lies outside that camera’s FOV. This property makes our method applicable to multiple cameras scattered across a broad area. Qualitative and quantitative evaluations on synthesized and real data, e.g., a sports field, demonstrate the advantages of the proposed method.
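The relative-pose estimation the abstract mentions rests on the epipolar constraint: for a relative pose (R, t) between two calibrated views, the essential matrix E = [t]×R satisfies x2ᵀ E x1 = 0 for corresponding normalized image points x1, x2. The following is a minimal NumPy sketch of that constraint on synthetic data; the rotation, translation, and 3D points are all assumed for illustration and are not taken from the paper.

```python
import numpy as np

def skew(v):
    """3x3 skew-symmetric matrix so that skew(v) @ u == np.cross(v, u)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

# Assumed relative pose mapping camera-1 coordinates into camera-2
# coordinates: X2 = R @ X1 + t (hypothetical values for this sketch).
theta = np.deg2rad(10.0)
R = np.array([[np.cos(theta), 0.0, np.sin(theta)],
              [0.0, 1.0, 0.0],
              [-np.sin(theta), 0.0, np.cos(theta)]])
t = np.array([1.0, 0.2, 0.0])

# Essential matrix E = [t]_x R.
E = skew(t) @ R

# Synthetic 3D points placed in front of camera 1.
rng = np.random.default_rng(0)
pts = rng.uniform(-1.0, 1.0, (20, 3)) + np.array([0.0, 0.0, 5.0])

max_residual = 0.0
for X1 in pts:
    x1 = X1 / X1[2]          # normalized homogeneous point in camera 1
    X2 = R @ X1 + t
    x2 = X2 / X2[2]          # normalized homogeneous point in camera 2
    max_residual = max(max_residual, abs(x2 @ E @ x1))

print(max_residual)  # essentially zero (floating-point noise only)
```

In practice one goes the other way: E is estimated from point correspondences (e.g., the eight-point algorithm) and decomposed into R and t, which is the machinery the proposed method leverages to locate the omnidirectional camera relative to each calibrated camera.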
|Journal||IEEE Transactions on Circuits and Systems for Video Technology|
|Publication status||Accepted/In press - 25 Jul. 2017|