Paper Title

Active entailment encoding for explanation tree construction using parsimonious generation of hard negatives

Paper Authors

Alex Bogatu, Zili Zhou, Dónal Landers, André Freitas

Paper Abstract

Entailment trees have been proposed to simulate the human reasoning process of explanation generation in the context of open-domain textual question answering. However, in practice, manually constructing these explanation trees proves to be a laborious process that requires active human involvement. Given the complexity of capturing the line of reasoning from question to answer, or from claim to premises, the issue arises of how to assist the user in efficiently constructing multi-level entailment trees given a large set of available facts. In this paper, we frame the construction of entailment trees as a sequence of active premise selection steps, i.e., for each intermediate node in an explanation tree, the expert needs to annotate positive and negative examples of premise facts from a large candidate list. We then iteratively fine-tune pre-trained Transformer models with the resulting positive and tightly controlled negative samples, aiming to balance the encoding of semantic relationships and explanatory entailment relationships. Experimental evaluation confirms the measurable efficiency gains of the proposed active fine-tuning method in facilitating entailment tree construction: up to 20% improvement in explanatory premise selection when compared against several alternatives.
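The active premise selection step described in the abstract can be sketched as a ranking loop: for each intermediate tree node, candidate facts are scored by an encoder, the top-ranked ones are shown to the expert for positive/negative annotation, and those labels drive the next fine-tuning round. The sketch below is illustrative only, assuming a toy bag-of-words scorer in place of the paper's fine-tuned Transformer; names such as `select_premises` and the sample facts are hypothetical, not from the paper.

```python
# Minimal sketch of one active premise-selection step.
# The bag-of-words cosine scorer is a stand-in (an assumption) for the
# paper's fine-tuned Transformer encoder.
from collections import Counter
import math

def embed(text):
    """Toy bag-of-words 'embedding' standing in for a Transformer encoder."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def select_premises(node, candidates, k=3):
    """Rank candidate facts by similarity to the current tree node."""
    scored = sorted(candidates,
                    key=lambda c: cosine(embed(node), embed(c)),
                    reverse=True)
    return scored[:k]

# One active step: the top-k ranked facts would be shown to the expert,
# who labels them as entailing premises (positives) or hard negatives;
# those labels then drive the next fine-tuning round of the real model.
node = "an eclipse occurs when the moon blocks sunlight"
facts = [
    "the moon orbits the earth",
    "the moon can block light from the sun",
    "plants need sunlight to grow",
    "an eclipse is an astronomical event",
]
top = select_premises(node, facts, k=2)
```

In the actual method, the scorer is retrained after each annotation round on the accumulated positives and tightly controlled hard negatives, so the ranking improves as the tree is built.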
