Alleviating overfitting for polysemous words for word representation estimation using lexicons

Yuanzhi Ke, Masafumi Hagiwara

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Citations (Scopus)

Abstract

Though there are some works on improving distributed word representations using lexicons, improper overfitting of words that have multiple meanings remains an issue that deteriorates learning when lexicons are used. An alternative is to allocate one vector per sense instead of one vector per word; however, representations estimated per sense are not as easy to use as those estimated per word. Our previous work uses a probabilistic method to alleviate the overfitting, but it is not robust with a small corpus. In this paper, we propose a new neural network to estimate distributed word representations using a lexicon and a corpus. We add a lexicon layer to the continuous bag-of-words model and a threshold node after the output of the lexicon layer. The threshold rejects unreliable outputs of the lexicon layer that are unlikely to be the same as their inputs. In this way, it alleviates the overfitting of polysemous words. The proposed neural network can be trained using negative sampling, which maximizes the log probabilities of target words given the context words by distinguishing the target words from random noise. We compare the proposed neural network with the continuous bag-of-words model, other works improving it, and previous works that estimate distributed word representations using both a lexicon and a corpus. The experimental results show that the proposed neural network is more efficient and balanced for both semantic and syntactic tasks than the previous works, and robust to the size of the corpus.
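To make the abstract's training setup concrete, below is a minimal sketch of CBOW with negative sampling plus a lexicon layer whose output is gated by a similarity threshold. All names (`LexiconCBOW`, the cosine-based gate, the 0.5 blending weight) are illustrative assumptions for this sketch, not the paper's exact architecture or hyperparameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class LexiconCBOW:
    """Sketch of CBOW + negative sampling with a hypothetical
    threshold-gated lexicon layer (illustrative, not the paper's model)."""

    def __init__(self, vocab_size, dim, lexicon, threshold=0.0, lr=0.05):
        self.W_in = rng.normal(scale=0.1, size=(vocab_size, dim))   # input (context) vectors
        self.W_out = rng.normal(scale=0.1, size=(vocab_size, dim))  # output (target) vectors
        self.lexicon = lexicon      # word id -> list of lexicon-related word ids
        self.threshold = threshold  # similarity cutoff for the gate
        self.lr = lr

    def _lexicon_vector(self, word):
        """Average the embeddings of the word's lexicon entries; reject the
        output when it disagrees with the word's own vector (threshold gate).
        Rejection is how overfitting of polysemous words is alleviated."""
        related = self.lexicon.get(word, [])
        if not related:
            return None
        lex = self.W_in[related].mean(axis=0)
        own = self.W_in[word]
        cos = lex @ own / (np.linalg.norm(lex) * np.linalg.norm(own) + 1e-9)
        return lex if cos >= self.threshold else None  # gate: drop unreliable output

    def train_pair(self, context, target, negatives):
        """One negative-sampling step: push the target's score up and the
        random negatives' scores down, i.e. maximize log p(target | context)."""
        h = self.W_in[context].mean(axis=0)
        lex = self._lexicon_vector(target)
        if lex is not None:
            h = 0.5 * (h + lex)  # blend the lexicon signal into the hidden layer
        grad_h = np.zeros_like(h)
        loss = 0.0
        for w, label in [(target, 1.0)] + [(n, 0.0) for n in negatives]:
            p = sigmoid(h @ self.W_out[w])
            g = p - label  # d(loss)/d(score) for logistic loss
            loss -= np.log(p if label else 1.0 - p)
            grad_h += g * self.W_out[w]
            self.W_out[w] -= self.lr * g * h
        # Gradient through the lexicon blend is treated approximately here.
        self.W_in[context] -= self.lr * grad_h / len(context)
        return loss
```

For example, repeatedly calling `train_pair([1, 2], 3, [7, 8])` on a toy vocabulary drives the loss down; with `threshold=1.0` the gate rejects almost every lexicon output and the model degenerates to plain CBOW, which is the trade-off the threshold node controls.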

Original language: English
Title of host publication: 2017 International Joint Conference on Neural Networks, IJCNN 2017 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 2164-2170
Number of pages: 7
Volume: 2017-May
ISBN (Electronic): 9781509061815
DOI: 10.1109/IJCNN.2017.7966117
Publication status: Published - 2017 Jun 30
Event: 2017 International Joint Conference on Neural Networks, IJCNN 2017 - Anchorage, United States
Duration: 2017 May 14 - 2017 May 19



ASJC Scopus subject areas

  • Software
  • Artificial Intelligence

Cite this

Ke, Y., & Hagiwara, M. (2017). Alleviating overfitting for polysemous words for word representation estimation using lexicons. In 2017 International Joint Conference on Neural Networks, IJCNN 2017 - Proceedings (Vol. 2017-May, pp. 2164-2170). [7966117] Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/IJCNN.2017.7966117

