Paper Title
Multi-task Learning with Multi-head Attention for Multi-choice Reading Comprehension
Paper Authors
Paper Abstract
Multiple-choice Machine Reading Comprehension (MRC) is an important and challenging Natural Language Understanding (NLU) task, in which a machine must choose the answer to a question from a set of choices, with the question placed in the context of text passages or dialog. In the last couple of years the NLU field has been revolutionized by the advent of models based on the Transformer architecture, which are pretrained on massive amounts of unsupervised data and then fine-tuned for various supervised NLU tasks. Transformer models have come to dominate a wide variety of leaderboards in the NLU field; in the area of MRC, the current state-of-the-art model on the DREAM dataset (see [Sun et al., 2019]) fine-tunes ALBERT, a large pretrained Transformer-based model, and additionally combines it with an extra layer of multi-head attention between context and question-answer [Zhu et al., 2020]. The purpose of this note is to document a new state-of-the-art result on the DREAM task, which is accomplished by additionally performing multi-task learning on two multiple-choice MRC tasks (RACE and DREAM).
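To illustrate the kind of extra attention layer the abstract refers to, below is a minimal NumPy sketch of multi-head attention in which the question-answer encoding attends over the context encoding. All dimensions, weight initializations, and function names here are illustrative assumptions for exposition, not the actual architecture of [Zhu et al., 2020].

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(query, context, num_heads, rng):
    # query:   (len_q, d_model) -- encodings of the question-answer pair
    # context: (len_c, d_model) -- encodings of the passage or dialogue
    len_q, d_model = query.shape
    d_head = d_model // num_heads
    # Randomly initialized projections stand in for learned weights (assumption).
    Wq, Wk, Wv, Wo = (rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)
                      for _ in range(4))
    # Project and split into heads: (num_heads, seq_len, d_head).
    Q = (query   @ Wq).reshape(len_q, num_heads, d_head).transpose(1, 0, 2)
    K = (context @ Wk).reshape(-1,    num_heads, d_head).transpose(1, 0, 2)
    V = (context @ Wv).reshape(-1,    num_heads, d_head).transpose(1, 0, 2)
    # Scaled dot-product attention per head: (num_heads, len_q, len_c).
    scores = Q @ K.transpose(0, 2, 1) / np.sqrt(d_head)
    attended = softmax(scores) @ V                # (num_heads, len_q, d_head)
    # Concatenate heads and apply the output projection.
    out = attended.transpose(1, 0, 2).reshape(len_q, d_model) @ Wo
    return out                                    # (len_q, d_model)

rng = np.random.default_rng(0)
ctx = rng.standard_normal((40, 64))   # 40 context tokens, hidden size 64
qa  = rng.standard_normal((12, 64))   # 12 question+answer tokens
out = multi_head_attention(qa, ctx, num_heads=8, rng=rng)
print(out.shape)  # (12, 64)
```

In a full model, the resulting context-aware question-answer representation would typically be pooled and scored per answer choice, with the highest-scoring choice selected.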