Paper Title
A Survey on Transfer Learning in Natural Language Processing
Paper Authors
Abstract
Deep learning models usually require huge amounts of data. However, such large datasets are not always attainable, which is common in many challenging NLP tasks. Consider Neural Machine Translation, for instance, where curating such large datasets may not be possible, especially for low-resource languages. Another limitation of deep learning models is their demand for huge computing resources. These obstacles motivate research into the possibility of knowledge transfer using large trained models. The demand for transfer learning is increasing as many large models emerge. In this survey, we highlight recent advances in transfer learning in the field of NLP. We also provide a taxonomy for categorizing the different transfer learning approaches found in the literature.