Paper Title

Retrospective Reader for Machine Reading Comprehension

Authors

Zhuosheng Zhang, Junjie Yang, Hai Zhao

Abstract

Machine reading comprehension (MRC) is an AI challenge that requires a machine to determine the correct answers to questions based on a given passage. MRC systems must not only answer questions when necessary but also distinguish when no answer is available according to the given passage and then tactfully abstain from answering. When unanswerable questions are involved in the MRC task, an essential verification module called a verifier is especially required in addition to the encoder, though the latest practice in MRC modeling still mostly benefits from adopting well pre-trained language models as the encoder block and focusing only on the "reading". This paper devotes itself to exploring better verifier design for the MRC task with unanswerable questions. Inspired by how humans solve reading comprehension questions, we propose a retrospective reader (Retro-Reader) that integrates two stages of reading and verification strategies: 1) sketchy reading that briefly investigates the overall interactions of passage and question, and yields an initial judgment; 2) intensive reading that verifies the answer and gives the final prediction. The proposed reader is evaluated on two benchmark MRC challenge datasets, SQuAD 2.0 and NewsQA, achieving new state-of-the-art results. Significance tests show that our model is significantly better than the strong ELECTRA and ALBERT baselines. A series of analyses is also conducted to interpret the effectiveness of the proposed reader.
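To make the two-stage read-then-verify flow concrete, below is a minimal Python sketch of how the sketchy reader's answerability score and the intensive reader's span scores could be combined into a final answer-or-abstain decision. The function name, the mixing weights beta1/beta2, the threshold delta, and the use of position 0 as the null span are illustrative assumptions for this sketch, not the authors' released implementation.

```python
# Minimal sketch (not the authors' code) of Retro-Reader's
# read-then-verify decision flow. Parameter values here are
# illustrative; in practice such weights and the threshold
# would be tuned on development data.
import torch

def retro_reader_decide(
    sketchy_logits: torch.Tensor,  # [2]: (answerable, unanswerable) from sketchy reading
    start_logits: torch.Tensor,    # [seq_len]: span-start scores from intensive reading
    end_logits: torch.Tensor,      # [seq_len]: span-end scores from intensive reading
    beta1: float = 0.5,            # weight on the external (sketchy) verification score
    beta2: float = 0.5,            # weight on the span-based verification score
    delta: float = 0.0,            # answerability threshold
):
    # External front verification: how strongly the sketchy reader
    # believes the question is unanswerable.
    score_ext = (sketchy_logits[1] - sketchy_logits[0]).item()

    # Span scores from intensive reading: best non-null span vs. the
    # null span (conventionally position 0, the [CLS] token).
    start_idx = int(start_logits[1:].argmax()) + 1
    end_idx = int(end_logits[start_idx:].argmax()) + start_idx
    score_has = (start_logits[start_idx] + end_logits[end_idx]).item()
    score_null = (start_logits[0] + end_logits[0]).item()
    score_diff = score_null - score_has

    # Rear verification: aggregate both signals; abstain when the
    # combined unanswerability score exceeds the threshold.
    score_final = beta1 * score_ext + beta2 * score_diff
    if score_final > delta:
        return None                # abstain: no answer in the passage
    return (start_idx, end_idx)    # predicted answer span
```

This sketch only illustrates the aggregation step; the two reading modules themselves would each be a full pre-trained encoder (e.g., ELECTRA or ALBERT) with a classification or span-prediction head.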
