Extrinsic Camera Calibration Without Visible Corresponding Points Using Omnidirectional Cameras

Shogo Miyata, Hideo Saito, Kosuke Takahashi, Dan Mikami, Mariko Isogawa, Akira Kojima

Research output: Contribution to journal › Article

5 Citations (Scopus)


This paper proposes a novel algorithm that calibrates multiple cameras scattered across a broad area. The key idea of the proposed method is “using the position of an omnidirectional camera as a reference point.” The common approach to calibrating multiple cameras assumes that the cameras observe at least some common points, so calibration becomes quite difficult when there are no shared points in the cameras’ fields of view (FOV). The proposed method instead uses the position of an omnidirectional camera to establish point correspondence. The position of the omnidirectional camera relative to a calibrated camera is estimated via epipolar geometry, even when the omnidirectional camera lies outside that camera’s FOV. This property makes the method applicable to multiple cameras scattered across a broad area. Qualitative and quantitative evaluations on synthesized and real data, e.g., a sports field, demonstrate the advantages of the proposed method.
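The abstract rests on the standard epipolar constraint: once a relative pose between two views is known (or estimated), corresponding normalized image points satisfy x₂ᵀ E x₁ = 0 with the essential matrix E = [t]ₓR. The sketch below is not the paper's algorithm, only a minimal illustration of that constraint on a synthetic point, with a hypothetical rotation and translation chosen for the example:

```python
import numpy as np

def skew(t):
    # Skew-symmetric matrix so that skew(t) @ v equals np.cross(t, v)
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

# Hypothetical relative pose between two cameras (for illustration only):
# a 10-degree yaw rotation R and a translation t.
theta = np.deg2rad(10.0)
R = np.array([[np.cos(theta), 0.0, np.sin(theta)],
              [0.0, 1.0, 0.0],
              [-np.sin(theta), 0.0, np.cos(theta)]])
t = np.array([1.0, 0.2, 0.0])

# Essential matrix encoding the epipolar constraint x2^T E x1 = 0
E = skew(t) @ R

# A synthetic 3D point seen by both cameras, in normalized coordinates
X1 = np.array([0.5, -0.3, 4.0])    # point in camera-1 frame
x1 = X1 / X1[2]                    # normalized projection in camera 1
X2 = R @ X1 + t                    # same point in camera-2 frame
x2 = X2 / X2[2]                    # normalized projection in camera 2

residual = x2 @ E @ x1             # should vanish up to floating-point error
print(abs(residual) < 1e-9)
```

In a real pipeline, E would be estimated from point correspondences (e.g., the eight-point algorithm) and decomposed into R and t up to scale; the paper's contribution is supplying such correspondences via the omnidirectional camera's position when the cameras' FOVs do not overlap.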

Original language: English
Journal: IEEE Transactions on Circuits and Systems for Video Technology
Publication status: Accepted/In press - 2017 Jul 25



Keywords

  • Calibration
  • Camera Calibration
  • Cameras
  • Fixtures
  • Mirrors
  • Non-overlapping Cameras
  • Omnidirectional Camera
  • Robot vision systems
  • Solid modeling
  • Three-dimensional displays

ASJC Scopus subject areas

  • Media Technology
  • Electrical and Electronic Engineering
