Paper Title

On the Trade-off between Over-smoothing and Over-squashing in Deep Graph Neural Networks

Paper Authors

Giraldo, Jhony H., Skianis, Konstantinos, Bouwmans, Thierry, Malliaros, Fragkiskos D.

Paper Abstract

Graph Neural Networks (GNNs) have succeeded in various computer science applications, yet deep GNNs underperform their shallow counterparts despite deep learning's success in other domains. Over-smoothing and over-squashing are key challenges when stacking graph convolutional layers, hindering deep representation learning and information propagation from distant nodes. Our work reveals that over-smoothing and over-squashing are intrinsically related to the spectral gap of the graph Laplacian, resulting in an inevitable trade-off between these two issues, as they cannot be alleviated simultaneously. To achieve a suitable compromise, we propose adding and removing edges as a viable approach. We introduce the Stochastic Jost and Liu Curvature Rewiring (SJLR) algorithm, which is computationally efficient and preserves fundamental properties compared to previous curvature-based methods. Unlike existing approaches, SJLR performs edge addition and removal during GNN training while maintaining the graph unchanged during testing. Comprehensive comparisons demonstrate SJLR's competitive performance in addressing over-smoothing and over-squashing.
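
The abstract's central claim, that over-smoothing and over-squashing pull the spectral gap of the graph Laplacian in opposite directions, can be made concrete with a small numerical sketch. The snippet below is an illustration, not code from the paper: the barbell topology, the specific edges added and removed, and the use of networkx and numpy are all assumptions chosen for demonstration. It computes the second-smallest eigenvalue of the normalized Laplacian (the spectral gap) before and after rewiring a bottlenecked graph.

```python
import numpy as np
import networkx as nx

def spectral_gap(G: nx.Graph) -> float:
    """Second-smallest eigenvalue of the normalized graph Laplacian.

    Per the abstract's claim, a larger gap eases over-squashing
    (weaker bottlenecks) but worsens over-smoothing (faster mixing),
    and vice versa.
    """
    L = nx.normalized_laplacian_matrix(G).toarray()
    return float(np.linalg.eigvalsh(L)[1])  # eigvalsh sorts ascending

# Two 5-cliques joined by a single bridge: a classic bottleneck
# topology where over-squashing occurs.
G = nx.barbell_graph(5, 0)
print(f"original gap:        {spectral_gap(G):.4f}")

# Adding an edge across the bottleneck widens the gap ...
G_add = G.copy()
G_add.add_edge(0, 9)
print(f"after edge addition: {spectral_gap(G_add):.4f}")

# ... while removing an intra-clique edge tends to narrow it.
G_rem = G.copy()
G_rem.remove_edge(0, 1)
print(f"after edge removal:  {spectral_gap(G_rem):.4f}")
```

On this toy graph, the cross-community edge raises the gap while the intra-community removal lowers it, which mirrors the compromise the paper describes: SJLR's edge additions and removals during training navigate between the two failure modes rather than eliminating either one.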
