Paper Title

Knowledge Augmented BERT Mutual Network in Multi-turn Spoken Dialogues

Paper Authors

Ting-Wei Wu, Biing-Hwang Juang

Paper Abstract

Modern spoken language understanding (SLU) systems rely on sophisticated semantic notions revealed in single utterances to detect intents and slots. However, they lack the capability of modeling multi-turn dynamics within a dialogue, particularly in long-term slot contexts. Without external knowledge, relying on the limited linguistic legitimacy within a word sequence may overlook deep semantic information across dialogue turns. In this paper, we propose to equip a BERT-based joint model with a knowledge attention module to mutually leverage dialogue contexts between the two SLU tasks. A gating mechanism is further utilized to filter out irrelevant knowledge triples and to circumvent distracting comprehension. Experimental results on two complicated multi-turn dialogue datasets demonstrate that, by mutually modeling the two SLU tasks with filtered knowledge and dialogue contexts, our approach achieves considerable improvements over several competitive baselines.
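
The abstract does not spell out the architecture, but the described mechanism (attending from BERT token states over embedded knowledge triples, then gating the attended knowledge before fusing it back) can be sketched as below. This is a minimal, hypothetical PyTorch sketch, not the authors' implementation; all names (`GatedKnowledgeAttention`, `key_proj`, `gate`) and tensor shapes are assumptions for illustration.

```python
import torch
import torch.nn as nn


class GatedKnowledgeAttention(nn.Module):
    """Hypothetical sketch of a gated knowledge-attention layer:
    tokens attend over knowledge-triple embeddings, and a per-token
    gate filters out irrelevant knowledge before a residual fusion."""

    def __init__(self, hidden_dim: int, kb_dim: int):
        super().__init__()
        self.key_proj = nn.Linear(kb_dim, hidden_dim)    # triples -> attention keys
        self.value_proj = nn.Linear(kb_dim, hidden_dim)  # triples -> attention values
        self.gate = nn.Linear(2 * hidden_dim, 1)         # scalar gate per token

    def forward(self, token_states: torch.Tensor, triple_embs: torch.Tensor) -> torch.Tensor:
        # token_states: (batch, seq_len, hidden_dim), e.g. BERT outputs
        # triple_embs:  (batch, n_triples, kb_dim), embedded knowledge triples
        keys = self.key_proj(triple_embs)                       # (batch, n_triples, hidden)
        values = self.value_proj(triple_embs)                   # (batch, n_triples, hidden)
        scores = torch.bmm(token_states, keys.transpose(1, 2))  # (batch, seq_len, n_triples)
        attn = torch.softmax(scores, dim=-1)
        knowledge = torch.bmm(attn, values)                     # (batch, seq_len, hidden)
        # Gate in [0, 1] suppresses knowledge that does not fit the token context.
        g = torch.sigmoid(self.gate(torch.cat([token_states, knowledge], dim=-1)))
        return token_states + g * knowledge                     # gated residual fusion
```

In this reading, a gate value near zero for a token effectively drops all attended triples at that position, which matches the abstract's goal of filtering out irrelevant knowledge and avoiding distracting comprehension.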
