Paper Title
Neural Stochastic Block Model & Scalable Community-Based Graph Learning
Paper Authors
Paper Abstract
This paper proposes a novel, scalable, community-based neural framework for graph learning. The framework learns the graph topology through the tasks of community detection and link prediction, optimizing our proposed joint SBM loss function, which results from a non-trivial adaptation of the likelihood function of the classic Stochastic Block Model (SBM). Compared with the SBM, our framework is flexible: it naturally allows soft labels and the digestion of complex node attributes. The main goal is efficient evaluation of complex graph data, so our design carefully aims at accommodating large data and ensures that a single forward pass suffices for efficient evaluation. For large graphs, it remains an open problem how to efficiently leverage the underlying structure for various graph learning tasks, and previously this could require heavy work. With our community-based framework, this becomes less difficult: task models can essentially plug in and play and perform joint training. We currently look into two particular applications, graph alignment and anomalous correlation detection, and discuss how to use our framework to tackle both problems. Extensive experiments are conducted to demonstrate the effectiveness of our approach. We also contribute tweaks of classic techniques that we find helpful for performance and scalability, for example: 1) GAT+, an improved design of GAT (Graph Attention Network); 2) the scaled-cosine similarity; and 3) a unified implementation of the convolution/attention-based and the random-walk-based neural graph models.
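For reference, the likelihood function of the classic SBM that the abstract refers to is commonly written as follows for an unweighted, undirected graph with adjacency matrix A, hard block assignments z, and block connectivity matrix B; the paper's joint SBM loss is described as a non-trivial adaptation of this, and its exact form is not reproduced here:

\[
\log P(A \mid z, B) \;=\; \sum_{i < j} \Big[ A_{ij} \,\log B_{z_i z_j} \;+\; \big(1 - A_{ij}\big)\,\log\big(1 - B_{z_i z_j}\big) \Big]
\]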
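The scaled-cosine similarity mentioned among the tweaks is not defined in the abstract; below is a minimal, hypothetical sketch of one plausible form, assuming it means cosine similarity between node embeddings multiplied by a single learnable positive scale (the paper's actual definition may differ):

import torch

def scaled_cosine_similarity(u: torch.Tensor, v: torch.Tensor,
                             log_scale: torch.Tensor) -> torch.Tensor:
    # Cosine similarity between corresponding rows of u and v,
    # scaled by exp(log_scale) so the scale stays positive.
    u_n = torch.nn.functional.normalize(u, dim=-1)
    v_n = torch.nn.functional.normalize(v, dim=-1)
    cos = (u_n * v_n).sum(dim=-1)          # values in [-1, 1]
    return torch.exp(log_scale) * cos

# Example usage: similarity scores for a batch of embedding pairs.
u = torch.randn(8, 64)
v = torch.randn(8, 64)
log_scale = torch.nn.Parameter(torch.zeros(()))  # hypothetical learnable scalar
scores = scaled_cosine_similarity(u, v, log_scale)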