Overfitting in quantum machine learning and entangling dropout

Masahiro Kobayashi, Kouhei Nakaji, Naoki Yamamoto

Research output: Contribution to journal › Article › peer-review

Abstract

The ultimate goal in machine learning is to construct a model function that generalizes to unseen data, based on a given training dataset. If the model function has too much expressibility, it may overfit to the training data and, as a result, lose its generalization capability. To avoid this overfitting issue, several techniques have been developed in the classical machine learning regime, and dropout is one such effective method. This paper proposes a straightforward analogue of this technique in the quantum machine learning regime, the entangling dropout, meaning that some entangling gates in a given parametrized quantum circuit are randomly removed during the training process to reduce the expressibility of the circuit. Simple case studies are given to show that this technique indeed suppresses overfitting.
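The abstract only describes the idea at a high level; the sketch below is one possible reading of entangling dropout, not the authors' exact circuit or training setup. It assumes PennyLane, a layered hardware-efficient ansatz with RY rotations and a CNOT chain, and a toy regression task; names such as entangling_dropout_circuit, sample_mask, and drop_prob are illustrative.

```python
# Minimal sketch of entangling dropout: at every training step each CNOT in
# the ansatz is randomly removed with probability drop_prob, temporarily
# reducing the circuit's expressibility. Assumes PennyLane ("default.qubit").
import numpy as onp
import pennylane as qml
from pennylane import numpy as np

n_qubits, n_layers = 4, 3
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def entangling_dropout_circuit(params, x, mask):
    # Encode a scalar input x on every qubit.
    for w in range(n_qubits):
        qml.RY(x, wires=w)
    # Layered ansatz: single-qubit rotations followed by a CNOT chain,
    # where each CNOT is applied only if its mask entry is True.
    for l in range(n_layers):
        for w in range(n_qubits):
            qml.RY(params[l, w], wires=w)
        for w in range(n_qubits - 1):
            if mask[l, w]:
                qml.CNOT(wires=[w, w + 1])
    return qml.expval(qml.PauliZ(0))

def sample_mask(drop_prob, rng):
    # Keep each entangling gate with probability 1 - drop_prob.
    return rng.random((n_layers, n_qubits - 1)) >= drop_prob

def cost(params, X, Y, mask):
    preds = np.stack([entangling_dropout_circuit(params, x, mask) for x in X])
    return np.mean((preds - Y) ** 2)

rng = onp.random.default_rng(0)
params = np.array(rng.normal(size=(n_layers, n_qubits)), requires_grad=True)
X = onp.linspace(-1.0, 1.0, 10)
Y = onp.sin(onp.pi * X)  # toy regression target

opt = qml.GradientDescentOptimizer(stepsize=0.2)
for step in range(50):
    mask = sample_mask(drop_prob=0.3, rng=rng)  # fresh dropout mask per step
    params = opt.step(lambda p: cost(p, X, Y, mask), params)

# At evaluation time all entangling gates are kept (no dropout).
full_mask = onp.ones((n_layers, n_qubits - 1), dtype=bool)
print(cost(params, X, Y, full_mask))
```

The key design choice in this reading is that the dropout mask is resampled at every optimization step and disabled at evaluation time, mirroring how classical dropout is used during training only.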

Original language: English
Article number: 30
Journal: Quantum Machine Intelligence
Volume: 4
Issue number: 2
DOIs
Publication status: Published - 2022 Dec

Keywords

  • Dropout regularization
  • Overfitting
  • Parametrized quantum circuit
  • Quantum machine learning

ASJC Scopus subject areas

  • Software
  • Theoretical Computer Science
  • Computational Theory and Mathematics
  • Artificial Intelligence
  • Applied Mathematics
