Title
An Analysis on the Learning Rules of the Skip-Gram Model
Authors
Abstract
To improve the generalization of the representations for natural language processing tasks, words are commonly represented using vectors, where distances among the vectors are related to the similarity of the words. While word2vec, the state-of-the-art implementation of the skip-gram model, is widely used and improves the performance of many natural language processing tasks, its mechanism is not yet well understood. In this work, we derive the learning rules for the skip-gram model and establish their close relationship to competitive learning. In addition, we provide the global optimal solution constraints for the skip-gram model and validate them by experimental results.
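The abstract's premise — word vectors trained so that geometric closeness tracks word similarity — can be illustrated with a minimal sketch of one skip-gram training step with negative sampling. The toy vocabulary, dimensions, learning rate, and update code below are illustrative assumptions, not the paper's derivation.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab_size, dim = 10, 4
W_in = rng.normal(scale=0.1, size=(vocab_size, dim))   # "input" word vectors
W_out = rng.normal(scale=0.1, size=(vocab_size, dim))  # "output" context vectors

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sgns_step(center, context, negatives, lr=0.1):
    """One SGD update on the skip-gram negative-sampling objective (a sketch)."""
    v = W_in[center].copy()
    grad_v = np.zeros_like(v)
    # positive pair has label 1 (push score up); sampled negatives have label 0
    for idx, label in [(context, 1.0)] + [(n, 0.0) for n in negatives]:
        u = W_out[idx]
        g = sigmoid(v @ u) - label        # gradient of the logistic loss on the score
        grad_v += g * u
        W_out[idx] = u - lr * g * v
    W_in[center] -= lr * grad_v

# a few updates on a toy (center, context) pair with two negative samples
for _ in range(100):
    sgns_step(center=0, context=1, negatives=[2, 3])

# after training, the observed pair scores higher than a negative pair
score_pos = W_in[0] @ W_out[1]
score_neg = W_in[0] @ W_out[2]
```

After repeated updates, `score_pos` exceeds `score_neg`: co-occurring words end up with similar vectors, which is the distance/similarity relationship the abstract describes.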