Paper Title

Learning Non-Autoregressive Models from Search for Unsupervised Sentence Summarization

Paper Authors

Puyuan Liu, Chenyang Huang, Lili Mou

Paper Abstract

Text summarization aims to generate a short summary for an input text. In this work, we propose a Non-Autoregressive Unsupervised Summarization (NAUS) approach, which does not require parallel data for training. Our NAUS first performs edit-based search towards a heuristically defined score, and generates a summary as pseudo-groundtruth. Then, we train an encoder-only non-autoregressive Transformer based on the search result. We also propose a dynamic programming approach for length-control decoding, which is important for the summarization task. Experiments on two datasets show that NAUS achieves state-of-the-art performance for unsupervised summarization, yet largely improving inference efficiency. Further, our algorithm is able to perform explicit length-transfer summary generation.
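The dynamic-programming idea for length-control decoding mentioned in the abstract can be illustrated with a minimal sketch: given per-position scores from an encoder-only model, decide which source positions emit a token and which stay blank, so that exactly the desired number of tokens is produced with the highest total score. This is a simplified illustration under assumptions, not the paper's exact procedure; the function length_control_decode and the inputs emit_logprob and blank_logprob are hypothetical stand-ins for precomputed model scores.

```python
import numpy as np

def length_control_decode(emit_logprob, blank_logprob, target_len):
    """Dynamic-programming sketch of length-controlled decoding.

    emit_logprob[i]  : log-prob of the best non-blank token at position i
                       (hypothetical precomputed score).
    blank_logprob[i] : log-prob of emitting nothing ("blank") at position i
                       (hypothetical precomputed score).
    target_len       : desired number of tokens in the output summary.

    Returns the emitting positions and the best total log-probability.
    """
    n = len(emit_logprob)
    NEG = -1e9
    # dp[i][k] = best score after the first i positions with k tokens emitted.
    dp = np.full((n + 1, target_len + 1), NEG)
    dp[0][0] = 0.0
    emit_choice = np.zeros((n + 1, target_len + 1), dtype=bool)

    for i in range(1, n + 1):
        for k in range(0, min(i, target_len) + 1):
            skip = dp[i - 1][k] + blank_logprob[i - 1]
            emit = dp[i - 1][k - 1] + emit_logprob[i - 1] if k > 0 else NEG
            if emit > skip:
                dp[i][k] = emit
                emit_choice[i][k] = True
            else:
                dp[i][k] = skip

    # Backtrack to recover which positions emit a token.
    positions, k = [], target_len
    for i in range(n, 0, -1):
        if emit_choice[i][k]:
            positions.append(i - 1)
            k -= 1
    return positions[::-1], dp[n][target_len]

# Tiny usage example with random scores (illustrative only).
rng = np.random.default_rng(0)
emit = np.log(rng.random(10))
blank = np.log(rng.random(10))
print(length_control_decode(emit, blank, target_len=4))
```

In the paper's setting, the per-position scores would come from the non-autoregressive Transformer's output distributions, and the table can be extended to handle repeated tokens and other decoding constraints; the sketch above omits those details for brevity.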
