Paper Title

GNN Transformation Framework for Improving Efficiency and Scalability

Paper Authors

Seiji Maekawa, Yuya Sasaki, George Fletcher, Makoto Onizuka

Paper Abstract

We propose a framework that automatically transforms non-scalable GNNs into precomputation-based GNNs, which are efficient and scalable for large-scale graphs. The advantages of our framework are two-fold: 1) it transforms various non-scalable GNNs to scale well to large-scale graphs by separating local feature aggregation from weight learning in their graph convolution, and 2) it efficiently executes precomputation on GPU for large-scale graphs by decomposing their edges into small, disjoint, and balanced sets. Through extensive experiments with large-scale graphs, we demonstrate that the transformed GNNs train faster than existing GNNs while achieving accuracy competitive with state-of-the-art GNNs. Consequently, our transformation framework provides simple and efficient baselines for future research on scalable GNNs.
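The first idea in the abstract, separating local feature aggregation from weight learning, means the graph propagation can be run once as an offline precomputation, after which training only fits a feature-based model. The sketch below illustrates this with dense NumPy arrays; the function names (`normalize_adj`, `precompute_features`, `partition_edges`) and the simple chunking used to mimic "disjoint and balanced" edge sets are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def normalize_adj(A):
    # Symmetrically normalized adjacency with self-loops:
    # D^{-1/2} (A + I) D^{-1/2}, as commonly used in graph convolution.
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    return A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def precompute_features(A, X, num_hops):
    # Local feature aggregation, done once and offline: propagate node
    # features over the graph for 0..num_hops hops and stack the results.
    # Weight learning then happens separately on these fixed features.
    A_norm = normalize_adj(A)
    feats = [X]
    for _ in range(num_hops):
        feats.append(A_norm @ feats[-1])
    return np.concatenate(feats, axis=1)

def partition_edges(edges, num_parts):
    # Toy stand-in for decomposing edges into small disjoint, roughly
    # balanced sets, so each chunk's aggregation fits in GPU memory.
    return np.array_split(np.asarray(edges), num_parts)
```

Because the propagation result depends only on the graph and input features, each training epoch touches just the precomputed feature matrix, which is what makes this family of models scale to large graphs.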
