Paper Title
Sheaf Neural Networks with Connection Laplacians
Paper Authors
Paper Abstract
A Sheaf Neural Network (SNN) is a type of Graph Neural Network (GNN) that operates on a sheaf, an object that equips a graph with vector spaces over its nodes and edges and linear maps between these spaces. SNNs have been shown to have useful theoretical properties that help tackle issues arising from heterophily and over-smoothing. One complication intrinsic to these models is finding a good sheaf for the task to be solved. Previous works proposed two diametrically opposed approaches: manually constructing the sheaf based on domain knowledge and learning the sheaf end-to-end using gradient-based methods. However, domain knowledge is often insufficient, while learning a sheaf could lead to overfitting and significant computational overhead. In this work, we propose a novel way of computing sheaves drawing inspiration from Riemannian geometry: we leverage the manifold assumption to compute manifold-and-graph-aware orthogonal maps, which optimally align the tangent spaces of neighbouring data points. We show that this approach achieves promising results with less computational overhead when compared to previous SNN models. Overall, this work provides an interesting connection between algebraic topology and differential geometry, and we hope that it will spark future research in this direction.
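To make the tangent-space alignment described above concrete, here is a minimal NumPy sketch: it estimates a tangent basis at each node via local PCA on neighbouring features and then solves an orthogonal Procrustes problem via SVD to align the bases of neighbouring nodes. The toy data, neighbourhood structure, and stalk dimension `d` are illustrative assumptions, not the authors' exact construction of the connection Laplacian.

```python
# A minimal sketch (assumed toy setup) of the idea in the abstract:
# local PCA for tangent spaces + orthogonal Procrustes alignment.
import numpy as np

def local_tangent_basis(X, neighbours, i, d):
    """Estimate a d-dimensional tangent basis at node i from displacements to its neighbours."""
    diffs = X[neighbours[i]] - X[i]           # displacements to neighbouring points
    # Top-d right singular vectors span the estimated tangent space (local PCA).
    _, _, Vt = np.linalg.svd(diffs, full_matrices=False)
    return Vt[:d].T                           # shape: (ambient_dim, d)

def align_tangent_spaces(T_i, T_j):
    """Orthogonal d x d map that best aligns basis T_i with basis T_j (Procrustes via SVD)."""
    U, _, Vt = np.linalg.svd(T_i.T @ T_j)
    return U @ Vt

# Toy usage with random features and an arbitrary neighbourhood assignment.
rng = np.random.default_rng(0)
X = rng.normal(size=(10, 5))                  # 10 nodes, 5-dimensional features
neighbours = {i: [j for j in range(10) if j != i][:4] for i in range(10)}
d = 2                                         # assumed stalk / tangent dimension

bases = {i: local_tangent_basis(X, neighbours, i, d) for i in range(10)}
O_01 = align_tangent_spaces(bases[0], bases[1])
print(np.allclose(O_01 @ O_01.T, np.eye(d)))  # True: the alignment map is orthogonal
```

Maps of this form can then serve as the restriction maps of a sheaf whose Laplacian is a connection Laplacian, avoiding the overhead of learning the sheaf end-to-end.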