Joint inpainting of RGB and depth images by generative adversarial network with a late fusion approach

Ryo Fujii, Ryo Hachiuma, Hideo Saito

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Citations (Scopus)

Abstract

Image inpainting aims to restore the texture of missing regions in a scene from an RGB image. In this paper, we aim to restore not only the texture but also the geometry of the missing regions in a scene from a pair of RGB and depth images. Inspired by the recent development of generative adversarial networks, we employ an encoder-decoder-based generative adversarial network that takes both the RGB and depth images as input. The experimental results show that our method restores the missing regions of both the RGB and depth images.
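The record gives no implementation details, but as a rough illustration of a late-fusion encoder-decoder generator for joint RGB-D inpainting, the sketch below uses separate RGB and depth encoders whose bottleneck features are concatenated before a shared decoder that predicts the completed RGB image and depth map. The layer sizes, mask-channel inputs, and exact fusion point are assumptions for illustration only, not the authors' network, and the adversarial discriminator and losses are omitted.

# Minimal PyTorch sketch of a late-fusion generator for joint RGB-D
# inpainting. Architecture details are assumed; see lead-in above.
import torch
import torch.nn as nn


def conv_block(in_ch, out_ch):
    # Downsampling block: stride-2 convolution + batch norm + LeakyReLU.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=4, stride=2, padding=1),
        nn.BatchNorm2d(out_ch),
        nn.LeakyReLU(0.2, inplace=True),
    )


def deconv_block(in_ch, out_ch):
    # Upsampling block: stride-2 transposed convolution + batch norm + ReLU.
    return nn.Sequential(
        nn.ConvTranspose2d(in_ch, out_ch, kernel_size=4, stride=2, padding=1),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )


class LateFusionGenerator(nn.Module):
    """Separate RGB and depth encoders; their bottleneck features are
    concatenated (late fusion) and passed to a shared decoder with two
    output heads for the completed RGB image and depth map."""

    def __init__(self):
        super().__init__()
        # RGB branch: 3 colour channels + 1 binary mask channel.
        self.rgb_encoder = nn.Sequential(
            conv_block(4, 64), conv_block(64, 128), conv_block(128, 256)
        )
        # Depth branch: 1 depth channel + 1 binary mask channel.
        self.depth_encoder = nn.Sequential(
            conv_block(2, 64), conv_block(64, 128), conv_block(128, 256)
        )
        # Shared decoder operating on the fused 512-channel bottleneck.
        self.decoder = nn.Sequential(
            deconv_block(512, 256), deconv_block(256, 128), deconv_block(128, 64)
        )
        self.rgb_head = nn.Conv2d(64, 3, kernel_size=3, padding=1)
        self.depth_head = nn.Conv2d(64, 1, kernel_size=3, padding=1)

    def forward(self, rgb, depth, mask):
        # mask: 1 where pixels are missing, 0 elsewhere.
        f_rgb = self.rgb_encoder(torch.cat([rgb, mask], dim=1))
        f_depth = self.depth_encoder(torch.cat([depth, mask], dim=1))
        fused = torch.cat([f_rgb, f_depth], dim=1)  # late fusion of branches
        shared = self.decoder(fused)
        return torch.tanh(self.rgb_head(shared)), self.depth_head(shared)


if __name__ == "__main__":
    g = LateFusionGenerator()
    rgb = torch.randn(1, 3, 128, 128)
    depth = torch.randn(1, 1, 128, 128)
    mask = torch.zeros(1, 1, 128, 128)
    out_rgb, out_depth = g(rgb, depth, mask)
    print(out_rgb.shape, out_depth.shape)  # (1, 3, 128, 128) and (1, 1, 128, 128)

In a full GAN setup, this generator would be trained against a discriminator with adversarial plus reconstruction losses on the masked regions; those components are not shown here.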

Original language: English
Title of host publication: Adjunct Proceedings of the 2019 IEEE International Symposium on Mixed and Augmented Reality, ISMAR-Adjunct 2019
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 203-204
Number of pages: 2
ISBN (Electronic): 9781728147659
DOIs
Publication status: Published - 2019 Oct
Event: 18th IEEE International Symposium on Mixed and Augmented Reality, ISMAR-Adjunct 2019 - Beijing, China
Duration: 2019 Oct 14 - 2019 Oct 18

Publication series

Name: Adjunct Proceedings of the 2019 IEEE International Symposium on Mixed and Augmented Reality, ISMAR-Adjunct 2019

Conference

Conference: 18th IEEE International Symposium on Mixed and Augmented Reality, ISMAR-Adjunct 2019
Country/Territory: China
City: Beijing
Period: 19/10/14 - 19/10/18

Keywords

  • Artificial intelligence
  • Computer graphics
  • Computer vision
  • Computer vision tasks
  • Graphics systems and interfaces
  • Mixed / Augmented reality
  • Scene understanding

ASJC Scopus subject areas

  • Computer Science Applications
  • Human-Computer Interaction
  • Media Technology
