Paper Title

Dismantling Complex Networks by a Neural Model Trained from Tiny Networks

Paper Authors

Jiazheng Zhang and Bang Wang

Paper Abstract

Can we employ one neural model to efficiently dismantle many complex yet unique networks? This article provides an affirmative answer. Diverse real-world systems can be abstracted as complex networks, each consisting of many functional nodes and edges. Percolation theory indicates that removing only a few vital nodes can cause the collapse of the whole network. However, finding the least number of such vital nodes is a rather challenging task for large networks due to its NP-hardness. Previous studies have proposed many centrality measures and heuristic algorithms to tackle this network dismantling (ND) problem. Unlike them, this article approaches the ND task by designing a neural model that can be trained on tiny synthetic networks yet applied to various real-world networks. This seems a discouraging mission at first sight, as network sizes and topologies differ greatly across real-world networks. Nonetheless, this article makes an insightful effort to design and train a neural influence ranking model (NIRM). Experiments on fifteen real-world networks validate its effectiveness: it mostly requires fewer vital nodes to dismantle a network than state-of-the-art competitors. The key to its success is that NIRM efficiently encodes both local structural and global topological signals for ranking nodes, in addition to our innovative labelling method for training dataset construction.
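
To make the ND objective concrete, below is a minimal sketch of network dismantling using the classic adaptive highest-degree heuristic, one of the baselines that neural rankers such as NIRM are typically compared against. It is not the paper's NIRM model: the graph generator, collapse threshold, and function names here are illustrative assumptions, not details from the paper.

```python
# Minimal sketch of the network-dismantling (ND) objective: repeatedly remove
# the currently highest-degree node and track how fast the giant connected
# component (GCC) collapses. A baseline heuristic, NOT the paper's NIRM model.
import networkx as nx


def giant_component_size(G):
    """Size of the largest connected component, or 0 for an empty graph."""
    if G.number_of_nodes() == 0:
        return 0
    return max(len(c) for c in nx.connected_components(G))


def dismantle_by_degree(G, target_fraction=0.05):
    """Remove highest-degree nodes until the GCC falls below
    target_fraction of the original size; return the removed nodes."""
    G = G.copy()
    n = G.number_of_nodes()
    removed = []
    while giant_component_size(G) > target_fraction * n:
        # Adaptive heuristic: recompute degrees after every removal.
        node = max(G.degree, key=lambda pair: pair[1])[0]
        G.remove_node(node)
        removed.append(node)
    return removed


if __name__ == "__main__":
    # A tiny synthetic network, echoing the paper's idea of learning from
    # small graphs; the generator and parameters are illustrative only.
    G = nx.barabasi_albert_graph(n=200, m=2, seed=42)
    removed = dismantle_by_degree(G)
    print(f"Removed {len(removed)} of 200 nodes to dismantle the network.")
```

A neural approach like the one described in the abstract would replace the degree-based selection step with a learned node ranking, aiming to dismantle the network with fewer removals.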
