Paper Title

ECOLA: Enhanced Temporal Knowledge Embeddings with Contextualized Language Representations

Paper Authors

Zhen Han, Ruotong Liao, Jindong Gu, Yao Zhang, Zifeng Ding, Yujia Gu, Heinz Köppl, Hinrich Schütze, Volker Tresp

Abstract

Since conventional knowledge embedding models cannot take full advantage of the abundant textual information, there have been extensive research efforts in enhancing knowledge embeddings using texts. However, existing enhancement approaches cannot apply to temporal knowledge graphs (tKGs), which contain time-dependent event knowledge with complex temporal dynamics. Specifically, existing enhancement approaches often assume knowledge embeddings are time-independent. In contrast, entity embeddings in tKG models usually evolve over time, which poses the challenge of aligning temporally relevant texts with entities. To this end, in this paper we propose to study enhancing temporal knowledge embeddings with textual data. As an approach to this task, we propose Enhanced Temporal Knowledge Embeddings with Contextualized Language Representations (ECOLA), which takes the temporal aspect into account and injects textual information into temporal knowledge embeddings. We further introduce three new datasets for training and evaluating ECOLA. Extensive experiments show that ECOLA significantly enhances temporal KG embedding models, with up to 287% relative improvement in Hits@1 on the link prediction task. The code and models are publicly available at https://anonymous.4open.science/r/ECOLA.
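The abstract reports results in terms of Hits@1 on link prediction, i.e., the fraction of test queries for which the ground-truth entity is ranked first among all candidates. As a minimal sketch (the function name and the rank values below are illustrative, not from the paper), the metric can be computed from the rank of the true entity for each test quadruple:

```python
import numpy as np

def hits_at_k(ranks, k=1):
    """Fraction of test queries whose ground-truth entity is ranked within the top k.

    `ranks` holds the 1-based rank of the true entity for each test query,
    e.g. for each (subject, relation, ?, timestamp) quadruple in a tKG.
    """
    ranks = np.asarray(ranks)
    return float(np.mean(ranks <= k))

# Hypothetical ranks of the true entity for five test quadruples
ranks = [1, 3, 1, 10, 2]
print(hits_at_k(ranks, k=1))   # two of five queries ranked first -> 0.4
print(hits_at_k(ranks, k=3))   # four of five within the top three -> 0.8
```

A "287% relative improvement" then means the enhanced model's Hits@1 is 3.87 times the baseline's (e.g., 0.31 versus 0.08), not an absolute gain of 287 percentage points.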
