Paper Title

Permutohedral-GCN: Graph Convolutional Networks with Global Attention

Paper Authors

Hesham Mostafa, Marcel Nassar

Paper Abstract

Graph convolutional networks (GCNs) update a node's feature vector by aggregating features from its neighbors in the graph. This ignores potentially useful contributions from distant nodes. Identifying such useful distant contributions is challenging due to scalability issues (too many nodes can potentially contribute) and oversmoothing (aggregating features from too many nodes risks swamping out relevant information and may result in nodes having different labels but indistinguishable features). We introduce a global attention mechanism where a node can selectively attend to, and aggregate features from, any other node in the graph. The attention coefficients depend on the Euclidean distance between learnable node embeddings, and we show that the resulting attention-based global aggregation scheme is analogous to high-dimensional Gaussian filtering. This makes it possible to use efficient approximate Gaussian filtering techniques to implement our attention-based global aggregation scheme. By employing an approximate filtering method based on the permutohedral lattice, the time complexity of our proposed global aggregation scheme only grows linearly with the number of nodes. The resulting GCNs, which we term permutohedral-GCNs, are differentiable and trained end-to-end, and they achieve state-of-the-art performance on several node classification benchmarks.
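The following is a minimal sketch, not the paper's implementation, of the attention-based global aggregation the abstract describes: each node attends to every other node with weights given by a Gaussian of the Euclidean distance between learnable node embeddings (equivalently, a softmax over negative squared distances). The function name, embedding dimension, and NumPy formulation are illustrative assumptions; the naive computation below is O(N²), which the paper replaces with an O(N) permutohedral-lattice approximation.

```python
import numpy as np

def gaussian_global_attention(features, embeddings):
    """Naive O(N^2) reference for Gaussian-kernel global attention.

    features:   (N, F) node feature matrix
    embeddings: (N, D) learnable node embeddings

    Returns the (N, F) matrix of globally aggregated features, where node i's
    attention over node j is a Gaussian of ||e_i - e_j||.
    """
    # Pairwise squared Euclidean distances between node embeddings.
    diff = embeddings[:, None, :] - embeddings[None, :, :]   # (N, N, D)
    sq_dist = np.sum(diff ** 2, axis=-1)                     # (N, N)

    # Gaussian weights = softmax over negative squared distances.
    logits = -0.5 * sq_dist
    logits -= logits.max(axis=1, keepdims=True)              # numerical stability
    weights = np.exp(logits)
    weights /= weights.sum(axis=1, keepdims=True)

    # Every node aggregates features from all nodes, weighted by attention.
    return weights @ features


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(size=(5, 8))   # 5 nodes, 8-dim features
    e = rng.normal(size=(5, 3))   # 3-dim learnable embeddings (hypothetical)
    print(gaussian_global_attention(x, e).shape)  # (5, 8)
```

Because the weight matrix is a Gaussian kernel of the embeddings, this aggregation is exactly a high-dimensional Gaussian filter over the embedding space, which is what lets approximate filtering methods such as the permutohedral lattice bring the cost down to linear in the number of nodes.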
