Paper Title

Dual Contrastive Learning: Text Classification via Label-Aware Data Augmentation

Authors

Qianben Chen, Richong Zhang, Yaowei Zheng, Yongyi Mao

Abstract


Contrastive learning has achieved remarkable success in representation learning via self-supervision in unsupervised settings. However, effectively adapting contrastive learning to supervised learning tasks remains a challenge in practice. In this work, we introduce a dual contrastive learning (DualCL) framework that simultaneously learns the features of input samples and the parameters of classifiers in the same space. Specifically, DualCL regards the classifier parameters as augmented samples associated with different labels and then exploits contrastive learning between the input samples and these augmented samples. Empirical studies on five benchmark text classification datasets and their low-resource versions demonstrate improvements in classification accuracy and confirm DualCL's ability to learn discriminative representations.
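The abstract's central idea — treating each class's classifier parameters as label-aware "augmented samples" living in the same space as the input features, and scoring an input against them by dot product — can be sketched as below. This is a minimal illustration under assumptions of my own (array shapes, function names, and a plain softmax cross-entropy standing in for the paper's dual contrastive objectives), not the authors' exact formulation.

```python
import numpy as np

def dual_logits(z, theta):
    """Score each sample against its own label-aware classifier.

    z:     (N, d)    input sample features
    theta: (N, K, d) per-sample parameters, one row per class label
                     (the "augmented samples" in DualCL's terminology)
    Returns (N, K) logits, logits[i, k] = z_i . theta_i^k.
    """
    return np.einsum("nd,nkd->nk", z, theta)

def dual_contrastive_loss(z, theta, y):
    """Softmax cross-entropy over the dual logits: pulls z_i toward the
    true label's representation theta_i^{y_i} and pushes it away from the
    other labels' representations. A simplified stand-in for the paper's
    two contrastive terms, assumed here for illustration only."""
    logits = dual_logits(z, theta)
    logits = logits - logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(y)), y].mean()
```

Because the logits are plain dot products between features and label representations in one shared space, minimizing this loss makes same-label features and label representations more similar, which is the discriminative-representation effect the abstract reports.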
