Paper Title
Class-incremental Learning with Rectified Feature-Graph Preservation
Paper Authors
Paper Abstract
In this paper, we address the problem of distillation-based class-incremental learning with a single head. A central theme of this task is to learn new classes that arrive in sequential phases over time while retaining the model's ability to recognize seen classes, with only limited memory for preserving seen data samples. Many regularization strategies have been proposed to mitigate the phenomenon of catastrophic forgetting. To better understand the essence of these regularizations, we introduce a feature-graph preservation perspective. Insights into their merits and faults motivate our weighted-Euclidean regularization for old knowledge preservation. We further propose rectified cosine normalization and show how it can work with binary cross-entropy to increase class separation for effective learning of new classes. Experimental results on both the CIFAR-100 and ImageNet datasets demonstrate that our method outperforms state-of-the-art approaches in reducing classification error, easing catastrophic forgetting, and encouraging evenly balanced accuracy over different classes. Our project page is at: https://github.com/yhchen12101/FGP-ICL.
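To make the two components named in the abstract concrete, below is a minimal PyTorch sketch of (a) a cosine-normalized classifier head trained with per-class binary cross-entropy and (b) a weighted-Euclidean feature-distillation penalty. The abstract does not spell out the exact formulations, so several details here are assumptions: rectifying features with ReLU before normalization as the reading of "rectified", the fixed `scale` factor, and the `weights` argument of the distillation term are all hypothetical placeholders, not the authors' definitive method (see the paper and project page for the actual formulation).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class RectifiedCosineClassifier(nn.Module):
    """Sketch of a cosine-normalized classifier head.

    Logits are scaled cosine similarities between unit-normalized features
    and unit-normalized class weights. Applying ReLU to features before
    normalization is an assumed reading of "rectified"; the scale factor
    is likewise an assumption (some implementations learn it).
    """
    def __init__(self, feat_dim: int, num_classes: int, scale: float = 10.0):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(num_classes, feat_dim) * 0.01)
        self.scale = scale

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        f = F.normalize(F.relu(feats), dim=1)       # rectify, then L2-normalize
        w = F.normalize(self.weight, dim=1)         # L2-normalize class weights
        return self.scale * f @ w.t()               # cosine-similarity logits

def new_class_loss(logits: torch.Tensor, labels: torch.Tensor,
                   num_classes: int) -> torch.Tensor:
    """Per-class binary cross-entropy (one-vs-rest sigmoids) instead of
    softmax cross-entropy, pairing BCE with the normalized logits as the
    abstract suggests."""
    targets = F.one_hot(labels, num_classes).float()
    return F.binary_cross_entropy_with_logits(logits, targets)

def weighted_euclidean_distill(old_feats: torch.Tensor,
                               new_feats: torch.Tensor,
                               weights: torch.Tensor) -> torch.Tensor:
    """Weighted squared-Euclidean penalty pulling the new model's features
    toward the frozen old model's features to preserve old knowledge.
    How `weights` is chosen is not stated in the abstract; it is left as
    a hypothetical per-dimension (or per-sample) weighting here."""
    return (weights * (new_feats - old_feats).pow(2)).sum(dim=1).mean()
```

In an incremental phase, a training step under these assumptions would combine the two terms, e.g. `loss = new_class_loss(head(feats), labels, C) + lam * weighted_euclidean_distill(old_feats, feats, w)`, with the trade-off coefficient `lam` tuned per dataset.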