Paper Title

Beyond Graph Neural Networks with Lifted Relational Neural Networks

Paper Authors

Gustav Sourek, Filip Zelezny, Ondrej Kuzelka

Paper Abstract

We demonstrate a declarative differentiable programming framework based on the language of Lifted Relational Neural Networks, where small parameterized logic programs are used to encode relational learning scenarios. When presented with relational data, such as various forms of graphs, the program interpreter dynamically unfolds differentiable computational graphs to be used for program parameter optimization by standard means. Owing to the declarative Datalog abstraction used, this results in compact and elegant learning programs, in contrast with existing procedural approaches operating directly on the computational-graph level. We illustrate how this idea can be used for efficient encoding of a diverse range of existing advanced neural architectures, with a particular focus on Graph Neural Networks (GNNs). Additionally, we show how contemporary GNN models can easily be extended towards higher relational expressiveness. In the experiments, we demonstrate correctness and computational efficiency through comparison against specialized GNN deep learning frameworks, while shedding some light on the learning performance of existing GNN models.
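To make the core idea concrete, the following is a minimal, hypothetical sketch (not the authors' implementation or the actual LRNN/NeuraLogic API) of how a parameterized logic rule of the form `h(X) :- W, edge(X, Y), feature(Y)` could be unfolded over a concrete graph into a differentiable computation, which corresponds to one GNN message-passing layer. The names `unfold_rule`, `edges`, and `features` are illustrative assumptions.

```python
# Hypothetical sketch: unfolding a parameterized logic rule over a small graph
# into a differentiable computation (one GNN-style message-passing step).
import torch

# Relational data (assumed example): edge(X, Y) facts and feature(Y) facts.
edges = [(0, 1), (0, 2), (1, 2), (2, 0)]
features = torch.eye(3)  # one-hot node features

# Learnable parameters of the rule (the "W" attached to the rule body).
W = torch.nn.Parameter(torch.randn(3, 3) * 0.1)

def unfold_rule(edges, features, W):
    """Ground the rule for every node X: aggregate W @ feature(Y) over all
    neighbours Y satisfying edge(X, Y), then apply a nonlinearity."""
    h = []
    for x in range(features.shape[0]):
        neighbours = [y for (src, y) in edges if src == x]
        if neighbours:
            msgs = torch.stack([W @ features[y] for y in neighbours])
            h.append(torch.relu(msgs.sum(dim=0)))
        else:
            h.append(torch.zeros(features.shape[1]))
    return torch.stack(h)

# The unfolded computation is differentiable w.r.t. W, so the rule's
# parameters can be optimized by standard gradient-based means.
out = unfold_rule(edges, features, W)
loss = out.sum()
loss.backward()
print(W.grad)
```

In the framework described by the paper, this unfolding is performed automatically by the program interpreter from the declarative template, rather than being written out procedurally as above.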
