Paper Title
Are Neighbors Enough? Multi-Head Neural n-gram can be Alternative to Self-attention
Paper Authors
Paper Abstract
The impressive performance of Transformer has been attributed to self-attention, where dependencies across the entire input sequence are considered at every position. In this work, we reform the neural $n$-gram model, which focuses on only a few surrounding representations of each position, with the multi-head mechanism as in Vaswani et al. (2017). Through experiments on sequence-to-sequence tasks, we show that replacing self-attention in Transformer with multi-head neural $n$-gram achieves comparable or better performance than Transformer. From various analyses of our proposed method, we find that multi-head neural $n$-gram is complementary to self-attention, and that combining the two can further improve the performance of the vanilla Transformer.
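The abstract does not give the exact formulation of the multi-head neural $n$-gram layer, but its core idea, combining only a small causal window of neighboring representations per head instead of attending over the whole sequence, can be sketched as follows. This is a minimal, hypothetical PyTorch sketch under our own assumptions (class name, window handling, and projection shapes are illustrative, not the authors' implementation).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiHeadNeuralNgram(nn.Module):
    """Illustrative multi-head neural n-gram layer (not the authors' exact code).

    Each position combines only the `ngram` preceding representations per head,
    instead of attending over the whole sequence as self-attention does.
    """

    def __init__(self, d_model: int, n_heads: int, ngram: int):
        super().__init__()
        assert d_model % n_heads == 0
        self.n_heads, self.ngram = n_heads, ngram
        self.d_head = d_model // n_heads
        # Mixes the ngram concatenated context vectors of each head.
        self.context_proj = nn.Linear(ngram * self.d_head, self.d_head)
        self.out_proj = nn.Linear(d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        batch, seq_len, d_model = x.shape
        h = x.view(batch, seq_len, self.n_heads, self.d_head)
        # Left-pad the time axis so position t sees tokens t-ngram+1 .. t.
        h = F.pad(h, (0, 0, 0, 0, self.ngram - 1, 0))
        # Collect the causal n-gram window at every position:
        # (batch, seq_len, ngram, n_heads, d_head)
        windows = torch.stack([h[:, i : i + seq_len] for i in range(self.ngram)], dim=2)
        # Concatenate the window per head, then mix it with a small projection.
        windows = windows.permute(0, 1, 3, 2, 4).reshape(
            batch, seq_len, self.n_heads, self.ngram * self.d_head
        )
        mixed = torch.tanh(self.context_proj(windows))
        # Merge heads and project, mirroring the multi-head attention output step.
        return self.out_proj(mixed.reshape(batch, seq_len, d_model))


# Drop-in replacement for the self-attention sublayer (shapes only; illustrative).
layer = MultiHeadNeuralNgram(d_model=512, n_heads=8, ngram=4)
y = layer(torch.randn(2, 10, 512))  # -> (2, 10, 512)
```

In this sketch the layer keeps the multi-head split and output projection of standard multi-head attention, but its per-position cost depends on the fixed window size `ngram` rather than the sequence length, which is the property the paper contrasts with self-attention.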