Paper Title

Graph Convolutional Neural Networks with Node Transition Probability-based Message Passing and DropNode Regularization

Paper Authors

Tien Huu Do, Duc Minh Nguyen, Giannis Bekoulis, Adrian Munteanu, Nikos Deligiannis

Paper Abstract

Graph convolutional neural networks (GCNNs) have received much attention recently, owing to their capability in handling graph-structured data. Among the existing GCNNs, many methods can be viewed as instances of a neural message-passing motif: features of nodes are passed around their neighbors, aggregated, and transformed to produce better node representations. Nevertheless, these methods seldom use node transition probabilities, a measure that has been found useful in exploring graphs. Furthermore, when transition probabilities are used, their transition direction is often improperly considered in the feature aggregation step, resulting in an inefficient weighting scheme. In addition, although a great number of GCNN models with increasing levels of complexity have been introduced, GCNNs often suffer from over-fitting when trained on small graphs. Another issue of GCNNs is over-smoothing, which tends to make node representations indistinguishable. This work presents a new method to improve the message-passing process based on node transition probabilities by properly considering the transition direction, leading to a better weighting scheme in node feature aggregation compared to the existing counterpart. Moreover, we propose a novel regularization method, termed DropNode, to address the over-fitting and over-smoothing issues simultaneously. DropNode randomly discards part of a graph, thus creating multiple deformed versions of the graph and yielding a data-augmentation regularization effect. Additionally, DropNode lessens the connectivity of the graph, mitigating the effect of over-smoothing in deep GCNNs. Extensive experiments on eight benchmark datasets for node and graph classification tasks demonstrate the effectiveness of the proposed methods in comparison with the state of the art.
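The two ideas in the abstract can be sketched in a few lines. The following is a minimal, illustrative NumPy sketch (not the authors' implementation): it row-normalizes an adjacency matrix into a random-walk transition matrix, aggregates neighbor features using the incoming transition direction, and applies a DropNode-style mask that removes a random subset of nodes together with their incident edges. All function names and the exact weighting choice here are assumptions for illustration; the paper's precise scheme may differ.

```python
import numpy as np

def transition_matrix(adj):
    """Row-normalize the adjacency matrix: P[i, j] is the probability of a
    random walk stepping from node i to node j."""
    deg = adj.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0  # guard isolated nodes against division by zero
    return adj / deg

def aggregate(features, adj):
    """Aggregate neighbor features weighted by transition probabilities.
    Using P.T weights neighbor j of node i by the probability of moving
    *from j to i* (the incoming direction) -- an illustrative reading of
    'properly considering the transition direction'."""
    P = transition_matrix(adj)
    return P.T @ features

def drop_node(adj, features, drop_prob, rng):
    """DropNode-style sketch: discard each node with probability drop_prob,
    deleting its feature row and all incident edges. Each call yields a
    deformed version of the graph (a data-augmentation effect)."""
    keep = rng.random(adj.shape[0]) >= drop_prob
    idx = np.where(keep)[0]
    return adj[np.ix_(idx, idx)], features[idx], idx
```

A typical use would apply `drop_node` independently at each training epoch, so the model sees a different deformed graph every time, while `aggregate` replaces the uniform or symmetric-normalized neighbor averaging of a standard GCN layer.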
