Motion estimation for non-overlapping cameras by improvement of feature points matching based on urban 3D structure

Atsushi Kawasaki, Hideo Saito, Kosuke Hara

Research output: Contribution to journal › Article


We propose an ego-motion estimation method for a self-driving car equipped with cameras with non-overlapping views. By finding corresponding points between the multi-camera images, we aim to improve the accuracy of ego-motion estimation. However, because the viewing directions differ greatly from one camera to another, a conventional algorithm such as SURF cannot detect a sufficient number of correspondences. Additionally, when the cameras have a low frame rate and the vehicle moves at high speed, the scene may change too much to find correspondences between images from the same camera. We propose a novel matching algorithm that warps feature patches detected in different cameras based on urban 3D structure. We assume that detected features lie on the surfaces of buildings or roads and that the patch around each feature is planar. Under this assumption, we can warp the patches so that the feature descriptors of corresponding feature points become similar. We then apply bundle adjustment to the found correspondences to optimize the odometry. The results show higher estimation accuracy than other matching methods.
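The patch warping described in the abstract relies on the standard plane-induced homography between two camera views: for a feature assumed to lie on a planar surface (a building facade or road), pixels in one camera map to the other via H = K2 (R - t n^T / d) K1^{-1}. The sketch below illustrates that relation with NumPy; all calibration values (intrinsics, baseline, plane depth) are invented for demonstration and are not from the paper.

```python
import numpy as np

def plane_homography(K1, K2, R, t, n, d):
    """Homography mapping pixels of camera 1 to camera 2 for the world
    plane n . X = d (n: unit normal in camera-1 coordinates, d: distance).
    Standard planar-homography formula: H = K2 (R - t n^T / d) K1^{-1}."""
    return K2 @ (R - np.outer(t, n) / d) @ np.linalg.inv(K1)

# Illustrative calibration (assumed values, for demonstration only).
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0,   0.0,   1.0]])
R = np.eye(3)                      # same orientation for both cameras
t = np.array([0.5, 0.0, 0.0])      # 0.5 m baseline along x
n = np.array([0.0, 0.0, 1.0])      # fronto-parallel facade normal
d = 10.0                           # plane 10 m in front of camera 1

H = plane_homography(K, K, R, t, n, d)

# Warp one feature point; in the matching algorithm the corners of the
# patch around the feature would be warped the same way before the
# descriptor is recomputed on the rectified patch.
p = np.array([320.0, 240.0, 1.0])  # homogeneous pixel coordinates
q = H @ p
q /= q[2]
print(q[:2])
```

For this fronto-parallel configuration the warp reduces to a horizontal shift of f·tx/d = 800·0.5/10 = 40 pixels, which is a quick sanity check on the formula.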

Original language: English
Pages (from-to): 1146-1153
Number of pages: 8
Journal: Seimitsu Kogaku Kaishi/Journal of the Japan Society for Precision Engineering
Issue number: 12
Publication status: Published - 2015
Externally published: Yes



Keywords
  • Bundle adjustment
  • Feature point matching
  • Motion estimation
  • Multi cameras
  • SLAM
  • Warping

ASJC Scopus subject areas

  • Mechanical Engineering
