Vehicle pose estimation from drive recorder images by monocular SLAM and matching with rendered 3D point cloud of surrounding environment

Akiyoshi Kurobe, Hisashi Kinoshita, Hideo Saito

Research output: Contribution to journal › Conference article

Abstract

Vehicle pose estimation is a vital technology for reconstructing the circumstances of traffic accidents. We propose a novel method for reconstructing the trajectory of vehicles from drive recorder images and a point cloud around the road. First, we apply ORB-SLAM to the image sequence of the drive recorder to obtain the vehicle pose trajectory; however, this trajectory is expressed in relative coordinates and at a relative scale. To estimate the absolute coordinates and scale of the trajectory, which cannot be obtained from a monocular SLAM such as ORB-SLAM, we match the feature points detected in the image sequence with the three-dimensional (3D) point cloud of the surrounding environment. To find 3D points matching the feature points, we generate candidate images by rendering the 3D point cloud of the surrounding environment from the position initially estimated by the Global Positioning System (GPS). Next, we match the rendered two-dimensional (2D) images with the drive recorder images to obtain 3D-2D point correspondences between the 3D point cloud and the drive recorder images; thus, we can convert the relative camera poses estimated by ORB-SLAM into the coordinate system of the 3D point cloud of the surrounding environment. In evaluation experiments, we confirmed the effectiveness of our method by comparing the vehicle poses estimated by our method with those of RTK-GPS, which exhibits high measurement precision.
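The final step described in the abstract, converting the relative-scale ORB-SLAM trajectory into the absolute coordinate system of the point cloud, amounts to estimating a similarity transform (scale, rotation, translation) from point correspondences. The paper's own implementation is not reproduced here; the sketch below uses the classic closed-form Umeyama alignment as one plausible way to perform that conversion, assuming corresponding 3D positions `src` (SLAM frame) and `dst` (point cloud frame) have already been established, e.g. via the 3D-2D matching the abstract describes. The function name and variable names are illustrative, not from the paper.

```python
import numpy as np

def umeyama_alignment(src, dst):
    """Estimate scale s, rotation R, translation t minimizing
    sum ||dst_i - (s * R @ src_i + t)||^2 (Umeyama, closed form).

    src, dst: (N, 3) arrays of corresponding 3D points.
    Returns (s, R, t) with R a proper rotation (det(R) = +1).
    """
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - mu_s, dst - mu_d

    # Cross-covariance between the centered point sets.
    cov = dst_c.T @ src_c / len(src)
    U, D, Vt = np.linalg.svd(cov)

    # Reflection correction keeps R a proper rotation.
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[2, 2] = -1.0
    R = U @ S @ Vt

    # Scale from the variance of the source points.
    var_src = (src_c ** 2).sum() / len(src)
    s = np.trace(np.diag(D) @ S) / var_src
    t = mu_d - s * R @ mu_s
    return s, R, t
```

Applying `s * R @ p + t` to every ORB-SLAM camera position then yields a trajectory in the absolute coordinates of the surrounding point cloud; in practice a robust variant (e.g. RANSAC over correspondences) would be used, since the 3D-2D matches contain outliers.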

Original language: English
Pages (from-to): 2371-2376
Number of pages: 6
Journal: IS and T International Symposium on Electronic Imaging Science and Technology
Volume: Part F138660
DOIs: 10.2352/ISSN.2470-1173.2018.09.AVM-283
Publication status: Published - 2018 Jan 1
Event: Intelligent Robotics and Industrial Applications using Computer Vision 2018, IRIACV 2018 - Burlingame, United States
Duration: 2018 Jan 28 - 2018 Feb 1

ASJC Scopus subject areas

  • Computer Graphics and Computer-Aided Design
  • Computer Science Applications
  • Human-Computer Interaction
  • Software
  • Electrical and Electronic Engineering
  • Atomic and Molecular Physics, and Optics

Cite this

Vehicle pose estimation from drive recorder images by monocular SLAM and matching with rendered 3D point cloud of surrounding environment. / Kurobe, Akiyoshi; Kinoshita, Hisashi; Saito, Hideo.

In: IS and T International Symposium on Electronic Imaging Science and Technology, Vol. Part F138660, 01.01.2018, p. 2371-2376.

Research output: Contribution to journal › Conference article

@article{5bdf0f3a32c84f3e8b85745c4fb65a4e,
title = "Vehicle pose estimation from drive recorder images by monocular SLAM and matching with rendered 3D point cloud of surrounding environment",
abstract = "Vehicle pose estimation is a vital technology for reconstructing the circumstances of traffic accidents. We propose a novel method for reconstructing the trajectory of vehicles from drive recorder images and a point cloud around the road. First, we apply ORB-SLAM to the image sequence of the drive recorder to obtain the vehicle pose trajectory; however, this trajectory is expressed in relative coordinates and at a relative scale. To estimate the absolute coordinates and scale of the trajectory, which cannot be obtained from a monocular SLAM such as ORB-SLAM, we match the feature points detected in the image sequence with the three-dimensional (3D) point cloud of the surrounding environment. To find 3D points matching the feature points, we generate candidate images by rendering the 3D point cloud of the surrounding environment from the position initially estimated by the Global Positioning System (GPS). Next, we match the rendered two-dimensional (2D) images with the drive recorder images to obtain 3D-2D point correspondences between the 3D point cloud and the drive recorder images; thus, we can convert the relative camera poses estimated by ORB-SLAM into the coordinate system of the 3D point cloud of the surrounding environment. In evaluation experiments, we confirmed the effectiveness of our method by comparing the vehicle poses estimated by our method with those of RTK-GPS, which exhibits high measurement precision.",
author = "Akiyoshi Kurobe and Hisashi Kinoshita and Hideo Saito",
year = "2018",
month = "1",
day = "1",
doi = "10.2352/ISSN.2470-1173.2018.09.AVM-283",
language = "English",
volume = "Part F138660",
pages = "2371--2376",
journal = "IS and T International Symposium on Electronic Imaging Science and Technology",
issn = "2470-1173",

}

TY - JOUR

T1 - Vehicle pose estimation from drive recorder images by monocular SLAM and matching with rendered 3D point cloud of surrounding environment

AU - Kurobe, Akiyoshi

AU - Kinoshita, Hisashi

AU - Saito, Hideo

PY - 2018/1/1

Y1 - 2018/1/1

N2 - Vehicle pose estimation is a vital technology for reconstructing the circumstances of traffic accidents. We propose a novel method for reconstructing the trajectory of vehicles from drive recorder images and a point cloud around the road. First, we apply ORB-SLAM to the image sequence of the drive recorder to obtain the vehicle pose trajectory; however, this trajectory is expressed in relative coordinates and at a relative scale. To estimate the absolute coordinates and scale of the trajectory, which cannot be obtained from a monocular SLAM such as ORB-SLAM, we match the feature points detected in the image sequence with the three-dimensional (3D) point cloud of the surrounding environment. To find 3D points matching the feature points, we generate candidate images by rendering the 3D point cloud of the surrounding environment from the position initially estimated by the Global Positioning System (GPS). Next, we match the rendered two-dimensional (2D) images with the drive recorder images to obtain 3D-2D point correspondences between the 3D point cloud and the drive recorder images; thus, we can convert the relative camera poses estimated by ORB-SLAM into the coordinate system of the 3D point cloud of the surrounding environment. In evaluation experiments, we confirmed the effectiveness of our method by comparing the vehicle poses estimated by our method with those of RTK-GPS, which exhibits high measurement precision.

AB - Vehicle pose estimation is a vital technology for reconstructing the circumstances of traffic accidents. We propose a novel method for reconstructing the trajectory of vehicles from drive recorder images and a point cloud around the road. First, we apply ORB-SLAM to the image sequence of the drive recorder to obtain the vehicle pose trajectory; however, this trajectory is expressed in relative coordinates and at a relative scale. To estimate the absolute coordinates and scale of the trajectory, which cannot be obtained from a monocular SLAM such as ORB-SLAM, we match the feature points detected in the image sequence with the three-dimensional (3D) point cloud of the surrounding environment. To find 3D points matching the feature points, we generate candidate images by rendering the 3D point cloud of the surrounding environment from the position initially estimated by the Global Positioning System (GPS). Next, we match the rendered two-dimensional (2D) images with the drive recorder images to obtain 3D-2D point correspondences between the 3D point cloud and the drive recorder images; thus, we can convert the relative camera poses estimated by ORB-SLAM into the coordinate system of the 3D point cloud of the surrounding environment. In evaluation experiments, we confirmed the effectiveness of our method by comparing the vehicle poses estimated by our method with those of RTK-GPS, which exhibits high measurement precision.

UR - http://www.scopus.com/inward/record.url?scp=85052856898&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85052856898&partnerID=8YFLogxK

U2 - 10.2352/ISSN.2470-1173.2018.09.AVM-283

DO - 10.2352/ISSN.2470-1173.2018.09.AVM-283

M3 - Conference article

AN - SCOPUS:85052856898

VL - Part F138660

SP - 2371

EP - 2376

JO - IS and T International Symposium on Electronic Imaging Science and Technology

JF - IS and T International Symposium on Electronic Imaging Science and Technology

SN - 2470-1173

ER -