Paper Title
Preventing Over-Smoothing for Hypergraph Neural Networks
Paper Authors
Paper Abstract
In recent years, hypergraph learning has attracted great attention due to its capacity for representing complex and high-order relationships. However, current neural network approaches designed for hypergraphs are mostly shallow, thus limiting their ability to extract information from high-order neighbors. In this paper, we show, both theoretically and empirically, that the performance of hypergraph neural networks does not improve as the number of layers increases, which is known as the over-smoothing problem. To avoid this issue, we develop a new deep hypergraph convolutional network called Deep-HGCN, which can maintain the heterogeneity of node representations in deep layers. Specifically, we prove that a $k$-layer Deep-HGCN simulates a polynomial filter of order $k$ with arbitrary coefficients, which alleviates the over-smoothing problem. Experimental results on various datasets demonstrate the superior performance of the proposed model compared with state-of-the-art hypergraph learning approaches.
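To make the polynomial-filter claim concrete, here is a minimal sketch in standard hypergraph notation; the operator $P$ below is the usual HGNN-style normalized hypergraph convolution, and this notation is an assumption for illustration rather than a quotation from the paper. With incidence matrix $H$, hyperedge weight matrix $W$, vertex and hyperedge degree matrices $D_v$ and $D_e$, and node features $X$,

$$ P = D_v^{-1/2} H W D_e^{-1} H^\top D_v^{-1/2}, \qquad \mathrm{filter}_k(X) = \sum_{i=0}^{k} \theta_i \, P^i X. $$

The abstract's claim is that $k$ stacked Deep-HGCN layers can realize $\mathrm{filter}_k$ with arbitrary coefficients $\theta_i$, in contrast to naively stacking $k$ plain hypergraph convolutions, whose fixed effective coefficients drive node representations toward an indistinguishable smoothed state.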