Paper Title

Multi-hop Attention Graph Neural Network

Paper Authors

Guangtao Wang, Rex Ying, Jing Huang, Jure Leskovec

Paper Abstract

The self-attention mechanism in graph neural networks (GNNs) has led to state-of-the-art performance on many graph representation learning tasks. Currently, at every layer, attention is computed between connected pairs of nodes and depends solely on the representations of the two nodes. However, such an attention mechanism does not account for nodes that are not directly connected but that provide important network context. Here we propose the Multi-hop Attention Graph Neural Network (MAGNA), a principled way to incorporate multi-hop context information into every layer of attention computation. MAGNA diffuses the attention scores across the network, which increases the receptive field of every layer of the GNN. Unlike previous approaches, MAGNA uses a diffusion prior on attention values to efficiently account for all paths between pairs of disconnected nodes. We demonstrate in theory and in experiments that MAGNA captures large-scale structural information in every layer and has a low-pass effect that eliminates noisy high-frequency information from graph data. Experimental results on node classification as well as knowledge graph completion benchmarks show that MAGNA achieves state-of-the-art results: MAGNA achieves up to a 5.7% relative error reduction over the previous state of the art on Cora, Citeseer, and PubMed. MAGNA also obtains the best performance on a large-scale Open Graph Benchmark dataset. On knowledge graph completion, MAGNA advances the state of the art on WN18RR and FB15k-237 across four different performance metrics.
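
The attention-diffusion step described in the abstract can be sketched compactly. The snippet below is a minimal NumPy illustration, not the authors' implementation: it forms a multi-hop attention matrix as the truncated sum of weighted powers of the one-hop attention matrix A, with geometric hop weights theta_i = alpha * (1 - alpha)^i (a personalized-PageRank-style choice consistent with the diffusion prior mentioned above). The values alpha = 0.15 and the truncation depth K = 6, as well as the dense-matrix formulation, are illustrative assumptions; in practice such a sum would be computed approximately rather than via explicit matrix powers.

```python
import numpy as np

def attention_diffusion(A, alpha=0.15, K=6):
    """Diffuse one-hop attention scores over multi-hop paths.

    A:     row-stochastic (n x n) attention matrix from a single GNN layer.
    alpha: decay factor; hop i is weighted by theta_i = alpha * (1 - alpha)**i.
    K:     truncation depth approximating the infinite geometric series.
    Returns the diffused (multi-hop) attention matrix.
    """
    n = A.shape[0]
    diffused = np.zeros((n, n))
    A_power = np.eye(n)              # A**0: each node attends to itself at hop 0
    for i in range(K + 1):
        diffused += alpha * (1 - alpha) ** i * A_power
        A_power = A_power @ A        # advance to the next hop, A**(i+1)
    return diffused

# Toy 3-node path graph: row-normalized adjacency as a stand-in for attention scores.
A = np.array([[0.0, 1.0, 0.0],
              [0.5, 0.0, 0.5],
              [0.0, 1.0, 0.0]])
print(attention_diffusion(A))        # nonzero entries now connect 2-hop neighbors
```

One way to see the low-pass claim: in the limit K -> infinity, the geometric sum maps an eigenvalue lambda of A to alpha / (1 - (1 - alpha) * lambda), which preserves the smooth component (lambda near 1) while attenuating oscillatory high-frequency components (lambda near -1).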
