Empirical study of future image prediction for image-based mobile robot navigation

Yu Ishihara, Masaki Takahashi

Research output: Contribution to journal › Article › peer-review

Abstract

Recent image-based robotic systems use predicted future state images to control robots; hence, the prediction accuracy of the future state image directly affects robot performance. To predict images, most previous studies assume that the camera captures the entire scene and that the environment is static. However, these assumptions do not always hold in real robot applications. For example, if a camera is attached to a mobile robot, its view changes as the robot moves. In this study, we analyzed the relationship between the performance of the image prediction model and the behavior of a robot controlled by an image-based navigation algorithm. Through mobile robot navigation experiments using front-facing and omni-directional cameras, we discussed the capabilities of the image prediction models and demonstrated their performance when applied to the image-based navigation algorithm. Moreover, to adapt to dynamic changes in the environment, we studied the effectiveness of directing the camera toward the ceiling. We showed that robust navigation can be achieved without using images from cameras directed toward the front or the floor, because these views can be disturbed by moving objects in a dynamic environment.

Original language: English
Article number: 104018
Journal: Robotics and Autonomous Systems
Volume: 150
DOIs
Publication status: Published - 2022 Apr

Keywords

  • Action-conditioned image prediction
  • Mobile robot
  • Omni-directional camera

ASJC Scopus subject areas

  • Software
  • Control and Systems Engineering
  • Mathematics (all)
  • Computer Science Applications
