Paper Title

Quantifying Privacy Leakage in Graph Embedding

Authors

Duddu, Vasisht, Boutet, Antoine, Shejwalkar, Virat

Abstract


Graph embeddings have been proposed to map graph data to a low-dimensional space for downstream processing (e.g., node classification or link prediction). With the increasing collection of personal data, graph embeddings can be trained on private and sensitive data. For the first time, we quantify the privacy leakage in graph embeddings through three inference attacks targeting Graph Neural Networks. We propose a membership inference attack to infer whether the graph node corresponding to an individual user's data was a member of the model's training data. We consider a blackbox setting, where the adversary exploits the output prediction scores, and a whitebox setting, where the adversary also has access to the released node embeddings. By exploiting the distinguishable footprint between training and test data records left by the graph embedding, this attack achieves accuracy up to 28% (blackbox) and 36% (whitebox) beyond random guessing. We also propose a Graph Reconstruction attack in which the adversary aims to reconstruct the target graph given the corresponding graph embeddings. Here, the adversary can reconstruct the graph with more than 80% accuracy and infer links between two nodes with around 30% higher confidence than a random guess. Finally, we propose an attribute inference attack in which the adversary aims to infer a sensitive attribute. We show that graph embeddings are strongly correlated with node attributes, allowing the adversary to infer sensitive information (e.g., gender or location).
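Blackbox membership inference attacks of the kind described above commonly exploit the fact that models tend to output sharper, more confident predictions on training (member) records than on unseen ones. The following is a minimal sketch of such a confidence-thresholding attack; the threshold value and toy prediction scores are illustrative assumptions, not the paper's exact method or data:

```python
import numpy as np

def membership_score(prediction_probs):
    """Confidence-based membership signal: take the top softmax
    probability for each record as the attack score."""
    return np.max(prediction_probs, axis=1)

def infer_membership(prediction_probs, threshold=0.9):
    """Flag records whose top prediction confidence exceeds the
    threshold as likely training-set members (hypothetical rule)."""
    return membership_score(prediction_probs) >= threshold

# Toy prediction scores (hypothetical): members tend to receive
# sharper output distributions than non-members.
member_preds = np.array([[0.95, 0.03, 0.02],
                         [0.92, 0.05, 0.03]])
nonmember_preds = np.array([[0.55, 0.25, 0.20],
                            [0.40, 0.35, 0.25]])

print(infer_membership(member_preds))     # sharp outputs flagged as members
print(infer_membership(nonmember_preds))  # diffuse outputs flagged as non-members
```

In practice the attacker calibrates the threshold (or trains a small attack classifier) on a shadow model's outputs rather than fixing it by hand; in the whitebox setting the released node embeddings would be appended to the attack's feature vector.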
