Paper Title

Fine-Tuning Graph Neural Networks via Graph Topology induced Optimal Transport

Paper Authors

Jiying Zhang, Xi Xiao, Long-Kai Huang, Yu Rong, Yatao Bian

Paper Abstract

Recently, the pretrain-finetuning paradigm has attracted considerable attention in the graph learning community due to its ability to alleviate the label scarcity problem in many real-world applications. Current studies apply existing techniques derived from image or text data, such as weight constraints and representation constraints, to transfer invariant knowledge from the pre-training stage to the fine-tuning stage. However, these methods fail to preserve the invariances of graph structure and Graph Neural Network (GNN) style models. In this paper, we present a novel optimal transport-based fine-tuning framework called GTOT-Tuning, namely Graph Topology induced Optimal Transport fine-Tuning, for GNN-style backbones. GTOT-Tuning leverages the properties of graph data to enhance the preservation of the representations produced by the fine-tuned network. Toward this goal, we formulate graph local knowledge transfer as an Optimal Transport (OT) problem with a structural prior and construct the GTOT regularizer to constrain the behavior of the fine-tuned model. By using the adjacency relationships among nodes, the GTOT regularizer performs node-level optimal transport and reduces redundant transport, resulting in efficient knowledge transfer from the pre-trained model. We evaluate GTOT-Tuning on eight downstream tasks with various GNN backbones and demonstrate that it achieves state-of-the-art fine-tuning performance for GNNs.
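To make the abstract's idea concrete, below is a minimal, hedged sketch of an adjacency-masked optimal-transport regularizer in PyTorch. It is not the authors' implementation: the function names (`masked_sinkhorn`, `gtot_style_regularizer`), the cosine-distance cost, the entropic Sinkhorn solver, and all hyperparameters are illustrative assumptions. The sketch penalizes an OT distance between the node embeddings of the frozen pre-trained GNN and those of the fine-tuned GNN, where the adjacency matrix (with self-loops) acts as the structural prior restricting which node pairs may exchange mass.

```python
# Illustrative sketch of a GTOT-style regularizer (not the paper's code).
# Assumptions: cosine-distance cost, entropic Sinkhorn with uniform marginals,
# adjacency-plus-self-loop mask as the structural prior.
import torch

def masked_sinkhorn(C, mask, eps=0.5, n_iters=50):
    """Entropic OT restricted to entries where mask == 1.

    C:    (n, n) cost between pre-trained and fine-tuned node embeddings.
    mask: (n, n) binary adjacency mask (with self-loops); transport is
          forbidden where mask == 0, encoding the structural prior.
    Returns the (n, n) transport plan. For very small eps, a log-domain
    solver would be numerically safer.
    """
    n = C.size(0)
    # Uniform marginals over nodes.
    a = torch.full((n,), 1.0 / n, device=C.device)
    b = torch.full((n,), 1.0 / n, device=C.device)
    # Kernel is zeroed outside the mask, so those pairs receive no mass.
    K = torch.exp(-C / eps) * mask
    u = torch.ones_like(a)
    for _ in range(n_iters):
        v = b / (K.t() @ u + 1e-9)
        u = a / (K @ v + 1e-9)
    return u.unsqueeze(1) * K * v.unsqueeze(0)  # diag(u) K diag(v)

def gtot_style_regularizer(h_pre, h_ft, adj):
    """OT discrepancy between pre-trained (h_pre) and fine-tuned (h_ft)
    node embeddings, with adjacency `adj` as the structural prior."""
    h_pre = torch.nn.functional.normalize(h_pre, dim=-1)
    h_ft = torch.nn.functional.normalize(h_ft, dim=-1)
    # Cosine distance: small when a fine-tuned node stays close to the
    # corresponding pre-trained neighborhood.
    C = 1.0 - h_pre @ h_ft.t()
    mask = (adj + torch.eye(adj.size(0), device=adj.device)).clamp(max=1.0)
    T = masked_sinkhorn(C.detach(), mask)  # plan computed without gradients
    return (T * C).sum()

# Toy usage: 5 nodes on a ring graph, 16-dim embeddings.
n, d = 5, 16
h_pre = torch.randn(n, d)                       # frozen pre-trained embeddings
h_ft = torch.randn(n, d, requires_grad=True)    # fine-tuned embeddings
adj = torch.zeros(n, n)
for i in range(n):
    adj[i, (i + 1) % n] = adj[(i + 1) % n, i] = 1.0
loss = gtot_style_regularizer(h_pre, h_ft, adj)
loss.backward()  # gradients reach h_ft through the weighted cost term
print(float(loss))
```

Detaching the cost when computing the transport plan is a common design choice in OT-based regularizers: gradients then flow only through the weighted cost term, which keeps the regularizer cheap and stable. The paper's actual solver and masking scheme may differ from this sketch.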
