Transfer Learning with Sparse Associative Memories

Quentin Jodelet, Vincent Gripon, Masafumi Hagiwara

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

In this paper, we introduce a novel layer designed to be used as the output of pre-trained neural networks in the context of classification. Based on Associative Memories, this layer makes it possible to design deep neural networks that support incremental learning and that can be (partially) trained in real time on embedded devices. Experiments on the ImageNet dataset and several other domain-specific datasets show that it is possible to design more flexible and faster-to-train neural networks at the cost of a slight decrease in accuracy.
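The record does not reproduce the paper's exact construction. As a rough illustration of the general idea only (a frozen pre-trained backbone used as a feature extractor, combined with an associative-memory-style readout that can absorb new classes incrementally), the sketch below is a minimal, assumption-based example and not the authors' method: the class name AssociativeReadout, the top-k binarization, and the Hebbian-style accumulation are illustrative choices.

```python
import numpy as np

class AssociativeReadout:
    """Associative-memory-style classification head over frozen backbone features (illustrative)."""

    def __init__(self, feature_dim, k_active=32):
        self.feature_dim = feature_dim
        self.k_active = k_active      # number of active units in the sparse binary code
        self.class_memories = {}      # label -> accumulated association vector

    def _sparse_code(self, features):
        # Keep only the k largest activations (binary sparse code), zero the rest.
        code = np.zeros(self.feature_dim, dtype=np.float32)
        code[np.argsort(features)[-self.k_active:]] = 1.0
        return code

    def learn(self, features, label):
        # Hebbian-style update: accumulate the sample's sparse code into its class memory.
        # Adding a brand-new label requires no retraining of the backbone or of other classes.
        code = self._sparse_code(features)
        if label not in self.class_memories:
            self.class_memories[label] = np.zeros(self.feature_dim, dtype=np.float32)
        self.class_memories[label] += code

    def predict(self, features):
        # Score each stored class by the overlap between its memory and the query code.
        code = self._sparse_code(features)
        scores = {label: float(mem @ code) for label, mem in self.class_memories.items()}
        return max(scores, key=scores.get)

# Toy usage: random vectors stand in for features produced by a frozen pre-trained backbone.
rng = np.random.default_rng(0)
readout = AssociativeReadout(feature_dim=512)
prototypes = {label: rng.normal(size=512) for label in ("cat", "dog")}
for label, proto in prototypes.items():
    for _ in range(20):               # a few noisy samples per class
        readout.learn(proto + 0.1 * rng.normal(size=512), label)
readout.learn(rng.normal(size=512), "bird")  # a new class added on the fly
print(readout.predict(prototypes["cat"] + 0.1 * rng.normal(size=512)))  # typically "cat"
```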

Original language: English
Title of host publication: Artificial Neural Networks and Machine Learning – ICANN 2019
Subtitle of host publication: Theoretical Neural Computation - 28th International Conference on Artificial Neural Networks, 2019, Proceedings
Editors: Igor V. Tetko, Pavel Karpov, Fabian Theis, Vera Kurková
Publisher: Springer Verlag
Pages: 497-512
Number of pages: 16
ISBN (Print): 9783030304867
DOIs
Publication status: Published - 2019
Event: 28th International Conference on Artificial Neural Networks, ICANN 2019 - Munich, Germany
Duration: 2019 Sept 17 - 2019 Sept 19

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 11727 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 28th International Conference on Artificial Neural Networks, ICANN 2019
Country/Territory: Germany
City: Munich
Period: 19/9/17 - 19/9/19

Keywords

  • Associative Memories
  • Computer vision
  • Deep learning
  • Incremental learning
  • Neural Networks
  • Self-organizing Maps
  • Transfer learning

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computer Science (all)
