Optical flow estimation by matching time surface with event-based cameras

Jun Nagata, Yusuke Sekikawa, Yoshimitsu Aoki

Research output: Contribution to journal › Article › peer-review

Abstract

In this work, we propose a novel method for estimating optical flow from event-based cameras by matching the time surface of events. The proposed loss function measures the timestamp consistency between the time surface formed by the latest timestamp of each pixel and one that is slightly shifted in time. This makes it possible to estimate dense optical flow with high accuracy without restoring luminance or using additional sensor information. In the experiments, we show that the gradient is more accurate and the loss landscape more stable than with the variance loss used in the motion-compensation approach. In addition, we show that the optical flow can be estimated with high accuracy by optimization with L1 smoothness regularization on publicly available datasets.
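The idea of a time surface and a timestamp-consistency loss can be sketched roughly as follows. This is a minimal illustrative NumPy sketch, not the paper's implementation: the function names, the nearest-neighbor warp, and the absolute-difference penalty are assumptions made here for clarity.

```python
import numpy as np

def time_surface(events, shape):
    """Build a time surface: the latest event timestamp per pixel
    (0 where no event occurred). `events` is an iterable of (x, y, t)."""
    ts = np.zeros(shape)
    for x, y, t in events:
        ts[int(y), int(x)] = max(ts[int(y), int(x)], t)
    return ts

def timestamp_consistency_loss(ts_ref, ts_shifted, flow, dt):
    """Illustrative loss: warp ts_ref by the per-pixel flow over dt
    (nearest neighbor) and compare against the time surface taken dt later.
    Under the correct flow, the same edge should reappear at the warped
    pixel with a timestamp exactly dt larger."""
    h, w = ts_ref.shape
    ys, xs = np.mgrid[0:h, 0:w]
    xw = np.clip(np.round(xs + flow[..., 0] * dt).astype(int), 0, w - 1)
    yw = np.clip(np.round(ys + flow[..., 1] * dt).astype(int), 0, h - 1)
    # Compare only pixels where both surfaces carry an event.
    valid = (ts_ref > 0) & (ts_shifted[yw, xw] > 0)
    return np.abs(ts_shifted[yw, xw] - (ts_ref + dt))[valid].mean()
```

For example, a point at pixel (2, 2) at t = 1.0 that moves 2 px per unit time appears at (3, 2) at t = 1.5; the loss is zero for the correct flow (2, 0) and grows as the hypothesized flow deviates from it.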

Original language: English
Article number: 1150
Pages (from-to): 1-14
Number of pages: 14
Journal: Sensors (Switzerland)
Volume: 21
Issue number: 4
DOIs
Publication status: Published - 2021 Feb 2

Keywords

  • Event-based camera
  • Optical flow

ASJC Scopus subject areas

  • Analytical Chemistry
  • Biochemistry
  • Atomic and Molecular Physics, and Optics
  • Instrumentation
  • Electrical and Electronic Engineering

