Paper Title
MZET: Memory Augmented Zero-Shot Fine-grained Named Entity Typing
Paper Authors
Paper Abstract
Named entity typing (NET) is a classification task that assigns an entity mention in context to given semantic types. However, as the size and granularity of the entity type set grow, little previous research has addressed newly emerging entity types. In this paper, we propose MZET, a novel memory-augmented FNET (Fine-grained NET) model, to tackle unseen types in a zero-shot manner. MZET incorporates character-level, word-level, and context-level information to learn the entity mention representation. In addition, MZET incorporates both the semantic meaning and the hierarchical structure of types into the entity type representation. Finally, through a memory component that models the relationship between entity mentions and entity types, MZET transfers knowledge from seen entity types to zero-shot ones. Extensive experiments on three public datasets show that MZET achieves prominent performance, surpassing state-of-the-art FNET neural network models by up to 7% in Micro-F1 and Macro-F1 scores.
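
To make the memory-based zero-shot transfer described above more concrete, the following is a minimal sketch (in PyTorch), not the authors' implementation: all dimensions, layer choices, and names (e.g. MemoryZeroShotTyper, memory_keys, type_proj) are illustrative assumptions. It shows one plausible way a memory of seen-type slots can be read by a mention representation and scored against semantic embeddings of candidate types, including types never seen in training.

import torch
import torch.nn as nn
import torch.nn.functional as F

class MemoryZeroShotTyper(nn.Module):
    # Hypothetical memory component for zero-shot fine-grained typing.
    def __init__(self, mention_dim, type_dim, num_seen_types):
        super().__init__()
        # One learned memory slot per seen type, addressed by the mention.
        self.memory_keys = nn.Parameter(torch.randn(num_seen_types, mention_dim))
        # Project type-label embeddings into the mention space for scoring.
        self.type_proj = nn.Linear(type_dim, mention_dim)

    def forward(self, mention_vec, type_embs):
        # mention_vec: (batch, mention_dim), e.g. a fusion of character-,
        # word-, and context-level features produced by a mention encoder.
        # type_embs: (num_types, type_dim), semantic embeddings of type labels,
        # either seen types (training) or unseen types (zero-shot inference).
        attn = F.softmax(mention_vec @ self.memory_keys.t(), dim=-1)   # (batch, num_seen_types)
        # Read a mention "prototype" from memory, then score it against
        # every candidate type's projected label embedding.
        read = attn @ self.memory_keys                                  # (batch, mention_dim)
        scores = read @ self.type_proj(type_embs).t()                   # (batch, num_types)
        return scores

# Usage: score a batch of mentions against unseen fine-grained types.
model = MemoryZeroShotTyper(mention_dim=256, type_dim=300, num_seen_types=50)
mentions = torch.randn(4, 256)       # placeholder mention representations
unseen_types = torch.randn(10, 300)  # placeholder label embeddings of unseen types
print(model(mentions, unseen_types).shape)  # torch.Size([4, 10])

Because candidate types enter only through their label embeddings, the same scoring path applies to types absent from training, which is the essence of the zero-shot transfer the abstract describes.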