Paper Title
DomBERT: Domain-oriented Language Model for Aspect-based Sentiment Analysis
Paper Authors
Paper Abstract
This paper focuses on learning domain-oriented language models driven by end tasks, which aims to combine the worlds of both general-purpose language models (such as ELMo and BERT) and domain-specific language understanding. We propose DomBERT, an extension of BERT that learns from both an in-domain corpus and relevant domain corpora. This helps in learning domain language models with low resources. Experiments are conducted on an assortment of tasks in aspect-based sentiment analysis, demonstrating promising results.
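To make the core idea concrete, below is a minimal sketch of domain-oriented continued pretraining in the spirit described by the abstract: masked-language-model training of BERT on a mixture of a small in-domain corpus and relevant-domain corpora. The corpus sentences, the fixed `related_weight` mixing ratio, and the `MixedDomainCorpus` helper are illustrative assumptions; the paper's actual procedure for selecting and weighting relevant domains is not specified here.

```python
# Sketch: continued MLM pretraining on a mix of in-domain and related-domain text.
# Mixing weights and corpora below are hypothetical, not DomBERT's exact method.
import random
import torch
from torch.utils.data import DataLoader, Dataset
from transformers import (BertForMaskedLM, BertTokenizerFast,
                          DataCollatorForLanguageModeling)

class MixedDomainCorpus(Dataset):
    """Samples sentences from the target domain and from related domains
    according to a fixed mixing weight (an assumption for illustration)."""
    def __init__(self, in_domain, related, related_weight=0.3, size=256):
        self.pool = [
            random.choice(related) if related and random.random() < related_weight
            else random.choice(in_domain)
            for _ in range(size)
        ]

    def __len__(self):
        return len(self.pool)

    def __getitem__(self, idx):
        return self.pool[idx]

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")

# Toy corpora standing in for real review data (e.g., laptop reviews as the
# target domain, other electronics reviews as relevant domains).
in_domain = ["The battery life of this laptop is great.",
             "The keyboard feels mushy after a month."]
related = ["This phone's screen is bright and sharp.",
           "The camera autofocus is slow in low light."]

dataset = MixedDomainCorpus(in_domain, related, related_weight=0.3, size=256)
mlm_collator = DataCollatorForLanguageModeling(tokenizer=tokenizer,
                                               mlm_probability=0.15)

def collate(batch):
    # Tokenize raw sentences, then let the MLM collator apply random masking.
    enc = tokenizer(batch, truncation=True, padding=True, max_length=128,
                    return_special_tokens_mask=True)
    features = [{k: enc[k][i] for k in enc} for i in range(len(batch))]
    return mlm_collator(features)

loader = DataLoader(dataset, batch_size=8, shuffle=True, collate_fn=collate)
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-5)

model.train()
for batch in loader:
    outputs = model(**batch)   # masked-LM loss over the mixed-domain batch
    outputs.loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```

The resulting domain-adapted checkpoint would then be fine-tuned on the end tasks (e.g., aspect extraction or aspect sentiment classification) in the usual way.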