Paper Title
Context-based Transformer Models for Answer Sentence Selection
Paper Authors
Paper Abstract
An important task in the design of Question Answering systems is selecting the sentence that contains (or constitutes) the answer from documents relevant to the asked question. Most previous work has used only the target sentence to compute its score with the question, as the models were not powerful enough to also effectively encode additional contextual information. In this paper, we analyze the role of contextual information in the sentence selection task, proposing a Transformer-based architecture that leverages two types of context: local and global. The former describes the paragraph containing the candidate sentence, aiming at resolving implicit references, whereas the latter describes the entire document containing the candidate sentence, providing content-based information. The results on three different benchmarks show that the combination of local and global contexts in a Transformer model significantly improves the accuracy of Answer Sentence Selection.
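Below is a minimal, hypothetical sketch of how such a context-aware scoring setup could be wired with the Hugging Face `transformers` library: the question forms one input segment, and the candidate sentence plus its local (paragraph) and global (document) context form the other, joined with the tokenizer's SEP token. The checkpoint name `roberta-base`, the `score_candidate` helper, and the exact input layout are illustrative assumptions, not details taken from the paper; the classification head here is untrained and would in practice be fine-tuned on answer sentence selection data.

```python
# Hedged sketch: scoring a candidate answer sentence together with its
# local (paragraph) and global (document) context using a generic
# pre-trained cross-encoder. The input layout is an assumption made
# for illustration, not the paper's exact architecture.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_NAME = "roberta-base"  # placeholder checkpoint, not the paper's model
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=2)
model.eval()

def score_candidate(question: str, candidate: str,
                    local_ctx: str, global_ctx: str) -> float:
    """Return a probability-like score that `candidate` answers `question`,
    conditioning on the surrounding paragraph and document-level context."""
    sep = tokenizer.sep_token
    # Segment A: the question. Segment B: candidate sentence followed by
    # its local and global context, separated by SEP tokens.
    second_segment = f"{candidate} {sep} {local_ctx} {sep} {global_ctx}"
    inputs = tokenizer(question, second_segment,
                       truncation=True, max_length=512, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    # Probability of the "correct answer" class (index 1, by convention here).
    return torch.softmax(logits, dim=-1)[0, 1].item()
```

In use, every sentence of a retrieved document would be scored this way and the highest-scoring one returned as the answer sentence; long documents would need the global context to be truncated or summarized to fit the encoder's 512-token limit.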