Paper Title
Disentangle-based Continual Graph Representation Learning
Paper Authors
Paper Abstract
Graph embedding (GE) methods embed nodes (and/or edges) of a graph into a low-dimensional semantic space, and have shown their effectiveness in modeling multi-relational data. However, existing GE models are impractical in real-world applications since they overlook the streaming nature of incoming data. To address this issue, we study the problem of continual graph representation learning, which aims to continually train a GE model on new data so that it learns incessantly emerging multi-relational data while avoiding catastrophic forgetting of previously learned knowledge. Moreover, we propose a disentangle-based continual graph representation learning (DiCGRL) framework inspired by humans' ability to learn procedural knowledge. The experimental results show that DiCGRL can effectively alleviate the catastrophic forgetting problem and outperforms state-of-the-art continual learning models.
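The core idea the abstract describes, splitting each embedding into disentangled components and updating only the components relevant to newly arriving triples, can be illustrated with a toy sketch. This is not the authors' DiCGRL implementation; the TransE-style scoring, the component count `K`, and the top-component selection rule are all simplifying assumptions chosen for illustration:

```python
import numpy as np

# Toy sketch (not the DiCGRL paper's implementation): each entity embedding
# is split into K disentangled components. When a new triple (h, r, t)
# streams in, only the component(s) most compatible with it are updated,
# limiting interference with knowledge stored in the other components.

K, D = 4, 8  # hypothetical: number of components, dimension per component
rng = np.random.default_rng(0)

entities = {e: rng.normal(size=(K, D)) for e in ["a", "b", "c"]}
relations = {r: rng.normal(size=D) for r in ["r1", "r2"]}

def component_scores(h, r, t):
    # TransE-style score per component: -||h_k + r - t_k||
    return -np.linalg.norm(entities[h] + relations[r] - entities[t], axis=1)

def continual_update(h, r, t, top=1, lr=0.1, steps=50):
    # Select the `top` components already most compatible with the new
    # triple, then run gradient descent on 0.5 * ||h_k + r - t_k||^2
    # for those components only.
    comp = np.argsort(component_scores(h, r, t))[-top:]
    for _ in range(steps):
        grad = entities[h][comp] + relations[r] - entities[t][comp]
        entities[h][comp] -= lr * grad
        entities[t][comp] += lr * grad

before = entities["a"].copy()
continual_update("a", "r1", "b", top=1)
changed = np.any(entities["a"] != before, axis=1)
print(int(changed.sum()))  # only one of "a"'s K components was modified
```

The point of the sketch is the locality of the update: the other `K - 1` components of each entity are left untouched, which is the mechanism by which a disentangled representation can reduce catastrophic forgetting when old data is no longer revisited.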