Paper Title

Towards Reasonably-Sized Character-Level Transformer NMT by Finetuning Subword Systems

Paper Authors

Jindřich Libovický, Alexander Fraser

Paper Abstract

Applying the Transformer architecture at the character level usually requires very deep architectures that are difficult and slow to train. These problems can be partially overcome by incorporating token segmentation into the model. We show that by initially training a subword model and then finetuning it on characters, we can obtain a neural machine translation model that works at the character level without requiring token segmentation. We use only the vanilla 6-layer Transformer Base architecture. Our character-level models capture morphological phenomena better and are more robust to noise, at the expense of somewhat worse overall translation quality. Our study is a significant step towards high-performance, easy-to-train character-based models that are not extremely large.
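The recipe described in the abstract is a two-phase training schedule: pretrain an NMT model on subword-segmented text, then continue training the same parameters on character-segmented input. The Python sketch below illustrates only that schedule, not the authors' implementation; `subword_segment`, `build_vocab`, and `train` are hypothetical stand-ins for a real BPE segmenter and a Transformer training loop.

```python
from collections import Counter

def subword_segment(sentence):
    # Placeholder for a real subword segmenter such as BPE; whitespace
    # splitting stands in so the sketch stays self-contained.
    return sentence.split()

def char_segment(sentence):
    # Character-level segmentation: every character, including spaces,
    # becomes its own token.
    return list(sentence)

def build_vocab(corpus, segment):
    counts = Counter(tok for sent in corpus for tok in segment(sent))
    return {tok: idx for idx, (tok, _) in enumerate(counts.most_common())}

def train(params, corpus, segment, vocab, steps):
    # Stand-in for NMT training: a real system would run forward/backward
    # passes of a Transformer over the token ids instead of counting them.
    for _ in range(steps):
        for sent in corpus:
            ids = [vocab[tok] for tok in segment(sent)]
            params["updates"] = params.get("updates", 0) + len(ids)
    return params

corpus = ["ein kleines beispiel", "a small example"]

# Phase 1: train from scratch on subword-segmented data.
params = train({}, corpus, subword_segment,
               build_vocab(corpus, subword_segment), steps=10)

# Phase 2: keep the same parameters and finetune on character-segmented data.
params = train(params, corpus, char_segment,
               build_vocab(corpus, char_segment), steps=2)
```

One design point worth noting: BPE-style subword vocabularies already contain all single characters seen in training as fallback units, so switching to character input does not require discarding the pretrained embedding table.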
