Paper Title

FastGCL: Fast Self-Supervised Learning on Graphs via Contrastive Neighborhood Aggregation

Paper Authors

Yuansheng Wang, Wangbin Sun, Kun Xu, Zulun Zhu, Liang Chen, Zibin Zheng

Paper Abstract

Graph contrastive learning (GCL), as a popular approach to graph self-supervised learning, has recently achieved non-negligible results. To reach superior performance, most existing GCL methods rely on elaborate graph data augmentation to construct appropriate contrastive pairs. However, these methods emphasize complex graph data augmentation, which incurs extra time overhead, while paying less attention to developing contrastive schemes specific to the characteristics of the encoder. We argue that a better contrastive scheme should be tailored to the characteristics of graph neural networks (e.g., neighborhood aggregation) and propose a simple yet effective method named FastGCL. Specifically, by constructing weighted-aggregated and non-aggregated neighborhood information as positive and negative samples respectively, FastGCL identifies the latent semantic information of the data without disturbing the graph topology or node attributes, resulting in faster training and convergence. Extensive experiments on node classification and graph classification tasks show that FastGCL achieves competitive classification performance and a significant training speedup compared to existing state-of-the-art methods.
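The contrastive scheme described above — treating each node's weighted-aggregated neighborhood representation as its positive sample and non-aggregated representations as negatives under an InfoNCE-style objective — can be sketched as follows. This is a minimal illustration based only on the abstract, not the authors' released code; the function name fastgcl_style_loss, the normalized-adjacency input, the temperature hyperparameter, and the use of other nodes' embeddings as in-batch negatives are all assumptions.

```python
import torch
import torch.nn.functional as F

def fastgcl_style_loss(h, adj_norm, tau=0.5):
    """InfoNCE-style sketch of the contrast described in the abstract.

    h:        (N, d) node embeddings from a GNN encoder
    adj_norm: (N, N) normalized adjacency used as aggregation weights
    tau:      temperature (assumed hyperparameter)
    """
    n = h.size(0)
    z = F.normalize(h, dim=1)                 # non-aggregated view
    z_agg = F.normalize(adj_norm @ h, dim=1)  # weighted-aggregated neighborhood view
    # Negatives: similarities between a node and other nodes'
    # non-aggregated embeddings (off-diagonal entries).
    logits = z @ z.t() / tau
    # Positive: the node's own weighted-aggregated neighborhood,
    # placed on the diagonal so row i's target class is i.
    idx = torch.arange(n, device=h.device)
    logits[idx, idx] = (z * z_agg).sum(dim=1) / tau
    return F.cross_entropy(logits, idx)
```

Under this sketch, each training step reduces to one encoder pass plus a single aggregation, with no augmented graph views to generate, which is consistent with the abstract's claim of faster training and convergence.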
