Paper Title

Simple and Efficient Heterogeneous Graph Neural Network

Authors

Xiaocheng Yang, Mingyu Yan, Shirui Pan, Xiaochun Ye, Dongrui Fan

Abstract

Heterogeneous graph neural networks (HGNNs) have a powerful capability to embed the rich structural and semantic information of a heterogeneous graph into node representations. Existing HGNNs inherit many mechanisms from graph neural networks (GNNs) designed for homogeneous graphs, especially the attention mechanism and the multi-layer structure. These mechanisms bring excessive complexity, but few works study whether they are really effective on heterogeneous graphs. This paper conducts an in-depth and detailed study of these mechanisms and proposes Simple and Efficient Heterogeneous Graph Neural Network (SeHGNN). To easily capture structural information, SeHGNN pre-computes the neighbor aggregation using a light-weight mean aggregator, which reduces complexity by removing overused neighbor attention and avoiding repeated neighbor aggregation in every training epoch. To better utilize semantic information, SeHGNN adopts a single-layer structure with long metapaths to extend the receptive field, as well as a transformer-based semantic fusion module to fuse features from different metapaths. As a result, SeHGNN exhibits a simple network structure, high prediction accuracy, and fast training speed. Extensive experiments on five real-world heterogeneous graphs demonstrate the superiority of SeHGNN over the state of the art in both accuracy and training speed.
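
The abstract names two design choices: a parameter-free mean aggregator whose neighbor aggregation is pre-computed once per metapath before training, and a single-layer, transformer-based module that fuses the resulting per-metapath semantic vectors. The snippet below is a minimal PyTorch sketch of these two ideas only, not the authors' implementation; the names (`precompute_metapath_features`, `SemanticFusion`) and the assumption that all metapath features share one dimension are illustrative.

```python
import torch
import torch.nn as nn

def precompute_metapath_features(feat_dict, metapath_adjs):
    """Pre-compute mean-aggregated neighbor features once, before training.

    feat_dict: {node_type: [num_nodes, feat_dim]} raw features.
    metapath_adjs: {metapath_name: (row-normalized sparse adjacency from the
                    target nodes to the metapath's end node type, end_type)}.
    Because the aggregator is a parameter-free mean, this step is never
    repeated during training epochs.
    """
    out = {}
    for name, (adj, end_type) in metapath_adjs.items():
        # Row-normalized sparse matmul = mean aggregation along the metapath.
        out[name] = torch.sparse.mm(adj, feat_dict[end_type])
    return out

class SemanticFusion(nn.Module):
    """Transformer-style self-attention over each node's per-metapath vectors."""
    def __init__(self, num_metapaths, in_dim, hidden, num_classes, heads=4):
        super().__init__()  # hidden must be divisible by heads
        self.proj = nn.ModuleList([nn.Linear(in_dim, hidden) for _ in range(num_metapaths)])
        self.attn = nn.MultiheadAttention(hidden, heads, batch_first=True)
        self.classify = nn.Linear(num_metapaths * hidden, num_classes)

    def forward(self, metapath_feats):  # list of [N, in_dim] tensors, one per metapath
        # Stack into [N, num_metapaths, hidden]; each metapath attends to the others.
        h = torch.stack([p(x) for p, x in zip(self.proj, metapath_feats)], dim=1)
        h, _ = self.attn(h, h, h)
        # Single-layer prediction head over the concatenated semantic vectors.
        return self.classify(h.flatten(1))

# Illustrative usage (metapath names are hypothetical):
# model = SemanticFusion(num_metapaths=3, in_dim=128, hidden=64, num_classes=5)
# logits = model([feats_PAP, feats_PSP, feats_PTP])
```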
