Title

DAG-WGAN: Causal Structure Learning With Wasserstein Generative Adversarial Networks

Authors

Hristo Petkov, Colin Hanley, Feng Dong

Abstract


The combinatorial search space presents a significant challenge to learning causality from data. Recently, the problem has been formulated into a continuous optimization framework with an acyclicity constraint, allowing for the exploration of deep generative models to better capture data sample distributions and support the discovery of Directed Acyclic Graphs (DAGs) that faithfully represent the underlying data distribution. However, so far no study has investigated the use of Wasserstein distance for causal structure learning via generative models. This paper proposes a new model named DAG-WGAN, which combines the Wasserstein-based adversarial loss, an auto-encoder architecture together with an acyclicity constraint. DAG-WGAN simultaneously learns causal structures and improves its data generation capability by leveraging the strength from the Wasserstein distance metric. Compared with other models, it scales well and handles both continuous and discrete data. Our experiments have evaluated DAG-WGAN against the state-of-the-art and demonstrated its good performance.
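The acyclicity constraint referenced in the abstract is, in continuous-optimization approaches of this kind, typically the trace-exponential penalty h(A) = tr(e^(A∘A)) − d, which is zero exactly when the weighted adjacency matrix A encodes a DAG. A minimal sketch (the function name and examples are illustrative, not from the paper):

```python
import numpy as np
from scipy.linalg import expm


def acyclicity_penalty(A: np.ndarray) -> float:
    """Trace-exponential acyclicity penalty h(A) = tr(e^{A∘A}) - d.

    h(A) == 0 iff the weighted adjacency matrix A encodes a DAG;
    any directed cycle makes the penalty strictly positive.
    """
    d = A.shape[0]
    return float(np.trace(expm(A * A)) - d)  # A * A is elementwise (Hadamard)


# A strictly upper-triangular adjacency (a DAG) yields zero penalty,
# while a 2-cycle yields a positive value.
dag = np.array([[0.0, 1.0],
                [0.0, 0.0]])
cyc = np.array([[0.0, 1.0],
                [1.0, 0.0]])
print(acyclicity_penalty(dag))  # ~0.0
print(acyclicity_penalty(cyc))  # > 0
```

In such frameworks the penalty is added to the training loss (here, the Wasserstein-based adversarial and reconstruction losses) via an augmented-Lagrangian scheme, so that the learned adjacency matrix is driven toward acyclicity as optimization proceeds.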
