Paper Title


OrphicX: A Causality-Inspired Latent Variable Model for Interpreting Graph Neural Networks

Authors

Lin, Wanyu, Lan, Hao, Wang, Hao, Li, Baochun

Abstract


This paper proposes a new explanation framework, called OrphicX, for generating causal explanations for any graph neural network (GNN) based on learned latent causal factors. Specifically, we construct a distinct generative model and design an objective function that encourages the generative model to produce causal, compact, and faithful explanations. This is achieved by isolating the causal factors in the latent space of graphs via maximizing the information flow measurements. We theoretically analyze the cause-effect relationships in the proposed causal graph, identify node attributes as confounders between graphs and GNN predictions, and circumvent such confounding effects by leveraging the backdoor adjustment formula. Our framework is compatible with any GNN, and it does not require access to the process by which the target GNN produces its predictions. In addition, it does not rely on the linear-independence assumption of the explained features, nor does it require prior knowledge of the graph learning tasks. We show a proof of concept of OrphicX on canonical classification problems on graph data. In particular, we analyze the explanatory subgraphs obtained from explanations for molecular graphs (i.e., Mutag) and quantitatively evaluate the explanation performance with frequently occurring subgraph patterns. Empirically, we show that OrphicX can effectively identify the causal semantics for generating causal explanations, significantly outperforming its alternatives.
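The backdoor adjustment mentioned in the abstract is a standard tool from causal inference. As a general illustration only (the abstract does not give the paper's exact derivation, where node attributes play the role of the confounder Z), the adjustment replaces the observational conditional with an interventional one by marginalizing over the confounder:

```latex
% Backdoor adjustment: the causal effect of X on Y given a confounder Z
% that satisfies the backdoor criterion (illustrative form, not the
% paper's specific instantiation).
P\bigl(Y \mid do(X = x)\bigr) \;=\; \sum_{z} P\bigl(Y \mid X = x,\, Z = z\bigr)\, P(Z = z)
```

Intuitively, conditioning on and summing out Z blocks the spurious path through the confounder, so the remaining dependence of Y on X is causal rather than correlational.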
