Paper Title


Singular Value Fine-tuning: Few-shot Segmentation requires Few-parameters Fine-tuning

Authors

Yanpeng Sun, Qiang Chen, Xiangyu He, Jian Wang, Haocheng Feng, Junyu Han, Errui Ding, Jian Cheng, Zechao Li, Jingdong Wang

Abstract


Freezing the pre-trained backbone has become a standard paradigm to avoid overfitting in few-shot segmentation. In this paper, we rethink the paradigm and explore a new regime: fine-tuning a small part of parameters in the backbone. We present a solution to overcome the overfitting problem, leading to better model generalization on learning novel classes. Our method decomposes backbone parameters into three successive matrices via the Singular Value Decomposition (SVD), then only fine-tunes the singular values and keeps others frozen. The above design allows the model to adjust feature representations on novel classes while maintaining semantic clues within the pre-trained backbone. We evaluate our Singular Value Fine-tuning (SVF) approach on various few-shot segmentation methods with different backbones. We achieve state-of-the-art results on both Pascal-5^i and COCO-20^i across 1-shot and 5-shot settings. Hopefully, this simple baseline will encourage researchers to rethink the role of backbone fine-tuning in few-shot settings. The source code and models will be available at https://github.com/syp2ysy/SVF.
