Paper Title
Multilingual Relation Classification via Efficient and Effective Prompting
Paper Authors
Paper Abstract
Prompting pre-trained language models has achieved impressive performance on various NLP tasks, especially in low-data regimes. Despite the success of prompting in monolingual settings, applying prompt-based methods in multilingual scenarios has been limited to a narrow set of tasks, due to the high cost of handcrafting multilingual prompts. In this paper, we present the first work on prompt-based multilingual relation classification (RC), introducing an efficient and effective method that constructs prompts from relation triples and involves only minimal translation of the class labels. We evaluate its performance in fully supervised, few-shot, and zero-shot scenarios, and analyze its effectiveness across 14 languages, prompt variants, and English-task training in cross-lingual settings. We find that in both fully supervised and few-shot scenarios, our prompt method beats competitive baselines: fine-tuning XLM-R_EM and null prompts. It also outperforms the random baseline by a large margin in zero-shot experiments. Our method requires little in-language knowledge and can be used as a strong baseline for similar multilingual classification tasks.
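To make the abstract's core idea concrete, below is a minimal sketch of constructing a cloze-style prompt from a relation triple, where only the class label words are translated per language and a multilingual masked language model (e.g., XLM-R) scores them at the mask position. The template, label-word table, and function names are illustrative assumptions, not the paper's exact implementation.

```python
# A minimal sketch (not the authors' exact method): build a cloze prompt
# from a (head, relation, tail) triple and score minimally translated
# label words at the mask with a multilingual MLM.

# Hypothetical label-word table: only class labels are translated per
# language; the prompt template itself stays language-agnostic.
LABEL_WORDS = {
    "en": {"per:place_of_birth": "birthplace", "org:founded_by": "founder"},
    "de": {"per:place_of_birth": "Geburtsort", "org:founded_by": "Gründer"},
}


def build_prompt(sentence: str, head: str, tail: str,
                 mask_token: str = "<mask>") -> str:
    """Concatenate the context sentence with a head-[MASK]-tail pattern;
    the model is asked to fill in the relation word at the mask."""
    return f"{sentence} {head} {mask_token} {tail}."


# Usage: the prompt would be fed to a multilingual MLM, and the label
# whose word receives the highest mask probability is predicted.
print(build_prompt("Albert Einstein wurde in Ulm geboren.",
                   head="Albert Einstein", tail="Ulm"))
```

Keeping the template fixed and translating only the label words is what keeps the per-language cost minimal, in line with the abstract's claim that the method requires little in-language knowledge.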