TY - JOUR
T1 - Overfitting in quantum machine learning and entangling dropout
AU - Kobayashi, Masahiro
AU - Nakaji, Kouhei
AU - Yamamoto, Naoki
N1 - Funding Information:
This work was supported by MEXT Quantum Leap Flagship Program Grants No. JPMXS0118067285 and No. JPMXS0120319794, and also Grant-in-Aid for JSPS Research Fellow Grant No. 22J01501.
Publisher Copyright:
© 2022, The Author(s), under exclusive licence to Springer Nature Switzerland AG.
PY - 2022/12
Y1 - 2022/12
N2 - The ultimate goal in machine learning is to construct a model function that generalizes to unseen data on the basis of a given training dataset. If the model function has too much expressibility, it may overfit to the training data and, as a result, lose its generalization capability. Several techniques have been developed in classical machine learning to avoid such overfitting, and dropout is one effective method. This paper proposes a straightforward analogue of this technique in the quantum machine learning regime, entangling dropout, in which some entangling gates in a given parametrized quantum circuit are randomly removed during training to reduce the expressibility of the circuit. Simple case studies are given to show that this technique actually suppresses overfitting.
AB - The ultimate goal in machine learning is to construct a model function that generalizes to unseen data on the basis of a given training dataset. If the model function has too much expressibility, it may overfit to the training data and, as a result, lose its generalization capability. Several techniques have been developed in classical machine learning to avoid such overfitting, and dropout is one effective method. This paper proposes a straightforward analogue of this technique in the quantum machine learning regime, entangling dropout, in which some entangling gates in a given parametrized quantum circuit are randomly removed during training to reduce the expressibility of the circuit. Simple case studies are given to show that this technique actually suppresses overfitting.
KW - Dropout regularization
KW - Overfitting
KW - Parametrized quantum circuit
KW - Quantum machine learning
UR - http://www.scopus.com/inward/record.url?scp=85142937035&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85142937035&partnerID=8YFLogxK
U2 - 10.1007/s42484-022-00087-9
DO - 10.1007/s42484-022-00087-9
M3 - Article
AN - SCOPUS:85142937035
SN - 2524-4906
VL - 4
JO - Quantum Machine Intelligence
JF - Quantum Machine Intelligence
IS - 2
M1 - 30
ER -