Visual Explanation Generation Based on Lambda Attention Branch Networks

Tsumugi Iida, Takumi Komatsu, Kanta Kaneda, Tsubasa Hirakawa, Takayoshi Yamashita, Hironobu Fujiyoshi, Komei Sugiura

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Explanation generation for transformers enhances the accountability of their predictions. However, there have been few studies on generating visual explanations for transformers that use multidimensional context, such as LambdaNetworks. In this paper, we propose Lambda Attention Branch Networks, which attend to important regions in detail and generate easily interpretable visual explanations. We also propose the Patch Insertion-Deletion score, an extension of the Insertion-Deletion score, as an effective evaluation metric for images with sparse important regions. Experimental results on two public datasets indicate that the proposed method successfully generates visual explanations.
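
For context, the sketch below illustrates the standard (pixel-wise) Insertion-Deletion score that the proposed Patch Insertion-Deletion score extends. It is a minimal sketch, not the authors' implementation: `model` (a callable mapping a batch of images to class probabilities), the CHW image layout, and the step count are all assumptions introduced here for illustration.

```python
import numpy as np

def deletion_score(model, image, saliency, target_class, steps=100):
    """Deletion variant of the Insertion-Deletion score.

    Progressively zero out the most salient pixels and track the
    target-class probability; return the area under that curve.
    A sharper drop (lower AUC) means the saliency map better
    identifies the pixels the model relies on.

    Assumptions (hypothetical, for illustration only):
      model        -- callable, (N, C, H, W) array -> (N, num_classes) probs
      image        -- float array of shape (C, H, W)
      saliency     -- float array of shape (H, W), higher = more important
      target_class -- index of the class whose probability is tracked
    """
    h, w = saliency.shape
    order = np.argsort(saliency.ravel())[::-1]  # most salient pixels first
    perturbed = image.copy()
    per_step = max(1, len(order) // steps)

    probs = [model(perturbed[None])[0, target_class]]
    for i in range(0, len(order), per_step):
        idx = order[i:i + per_step]
        ys, xs = np.unravel_index(idx, (h, w))
        perturbed[..., ys, xs] = 0.0            # delete across all channels
        probs.append(model(perturbed[None])[0, target_class])

    # Normalize the x-axis to [0, 1] so scores are comparable across images.
    return float(np.trapz(probs, dx=1.0 / (len(probs) - 1)))
```

The insertion score is the mirror image: start from a baseline (e.g., a blurred or blank image), insert the most salient pixels first, and a higher AUC is better. Per the abstract, the proposed Patch Insertion-Deletion score adapts this procedure to operate on patches rather than individual pixels, which is better suited to images whose important regions are sparse; its exact definition is given in the paper, not here.
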

Original language: English
Title of host publication: Computer Vision – ACCV 2022 - 16th Asian Conference on Computer Vision, 2022, Proceedings
Editors: Lei Wang, Juergen Gall, Tat-Jun Chin, Imari Sato, Rama Chellappa
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 475-490
Number of pages: 16
ISBN (Print): 9783031262838
DOIs
Publication status: Published - 2023
Event: 16th Asian Conference on Computer Vision, ACCV 2022 - Macao, China
Duration: 2022 Dec 4 – 2022 Dec 8

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 13842 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 16th Asian Conference on Computer Vision, ACCV 2022
Country/Territory: China
City: Macao
Period: 22/12/4 – 22/12/8

Keywords

  • Attention
  • Lambda networks
  • Transformer

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computer Science (all)
