Accuracy and Speed Improvement of Event Camera Motion Estimation Using a Bird’s-Eye View Transformation

Takehiro Ozawa, Yusuke Sekikawa, Hideo Saito

Research output: Article › peer-review

6 Citations (Scopus)


Event cameras are bio-inspired sensors with a high dynamic range and high temporal resolution. These properties enable motion estimation from textures with repeating patterns, which is difficult to achieve with RGB cameras, so event-camera motion estimation is a promising approach to vehicle position estimation. Contrast maximization is an existing method that can estimate the motion of an event camera observing a road surface. However, when estimating three-dimensional motion, contrast maximization tends to converge to a local optimum, which makes correct estimation difficult. To address this problem, we propose estimating motion by optimizing contrast in a bird's-eye-view space. Instead of performing three-dimensional motion estimation, we reduce the problem to two-dimensional motion estimation by transforming the event data to a bird's-eye view using a homography computed from the event camera pose. This transformation mitigates the non-convexity of the loss function that arises in conventional methods. In a quantitative experiment, we generated event data with a car simulator and evaluated our motion estimation method, showing improvements in both accuracy and speed. In addition, we ran the estimation on real event data and evaluated the results qualitatively, again showing improved accuracy.
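The pipeline described in the abstract, projecting events onto the bird's-eye-view plane with a homography and then searching for the planar motion that maximizes the contrast (variance) of the motion-compensated event image, can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the function names, the coarse grid search (the paper optimizes contrast rather than enumerating a grid), and the histogram parameters are all choices made for the example.

```python
import numpy as np

def events_to_bev(xy, H):
    """Map event pixel coordinates (N, 2) to the bird's-eye-view plane
    via a 3x3 homography H computed from the camera pose (assumption:
    H is given; in practice it comes from the camera's extrinsics)."""
    pts = np.c_[xy, np.ones(len(xy))] @ H.T
    return pts[:, :2] / pts[:, 2:3]

def contrast(xy, t, v, bins=32, extent=4.0):
    """Contrast (variance) of the image of warped events: each event at
    BEV position xy[i] and time t[i] is shifted back by t[i] * v, then
    the warped events are accumulated into a 2D histogram image."""
    warped = xy - t[:, None] * v  # motion-compensate each event
    img, _, _ = np.histogram2d(warped[:, 0], warped[:, 1],
                               bins=bins, range=[[-extent, extent]] * 2)
    return img.var()

def estimate_velocity(xy, t, grid):
    """Pick the 2D velocity from a candidate grid that maximizes
    contrast (a stand-in for the paper's contrast optimization)."""
    return max(grid, key=lambda v: contrast(xy, t, np.asarray(v, float)))
```

With the correct velocity, the events collapse onto the underlying texture and the event image becomes sharply peaked (high variance); wrong velocities smear the events and lower the contrast. Reducing the search to a 2D planar velocity is what makes this objective better behaved than the 3D case discussed in the abstract.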

Publication status: Published - 1 Feb 2022

ASJC Scopus subject areas

  • Analytical Chemistry
  • Information Systems
  • Instrumentation
  • Atomic and Molecular Physics, and Optics
  • Electrical and Electronic Engineering
  • Biochemistry
