Paper Title

Equivariant Hypergraph Diffusion Neural Operators

Authors

Peihao Wang, Shenghao Yang, Yunyu Liu, Zhangyang Wang, Pan Li

Abstract

Hypergraph neural networks (HNNs), which use neural networks to encode hypergraphs, provide a promising way to model higher-order relations in data and to solve prediction tasks built upon such relations. However, higher-order relations in practice contain complex patterns and are often highly irregular, so it is challenging to design an HNN that is expressive enough for those relations while remaining computationally efficient. Inspired by hypergraph diffusion algorithms, this work proposes a new HNN architecture named ED-HNN, which provably represents any continuous equivariant hypergraph diffusion operator; such operators can model a wide range of higher-order relations. ED-HNN can be implemented efficiently by combining star expansions of hypergraphs with standard message-passing neural networks. ED-HNN further shows great superiority in processing heterophilic hypergraphs and in constructing deep models. We evaluate ED-HNN for node classification on nine real-world hypergraph datasets. ED-HNN uniformly outperforms the best baselines over these nine datasets and improves prediction accuracy by more than 2\%$\uparrow$ on four of them.
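To make the star-expansion idea mentioned in the abstract concrete, below is a minimal sketch of bipartite message passing over a hypergraph's star expansion (nodes send messages to the hyperedges they belong to, and hyperedges send aggregated messages back). This is an illustrative assumption of how such a layer could look, not the authors' exact ED-HNN operator; the class name `StarExpansionLayer`, the MLP choices, and the residual update are hypothetical.

```python
import torch
import torch.nn as nn


class StarExpansionLayer(nn.Module):
    """One round of node -> hyperedge -> node message passing over the
    star expansion of a hypergraph (illustrative sketch, not ED-HNN itself)."""

    def __init__(self, dim):
        super().__init__()
        # MLP producing node -> hyperedge messages
        self.node_to_edge = nn.Sequential(nn.Linear(dim, dim), nn.ReLU(), nn.Linear(dim, dim))
        # MLP combining a node's state with its incident hyperedge features
        self.edge_to_node = nn.Sequential(nn.Linear(2 * dim, dim), nn.ReLU(), nn.Linear(dim, dim))

    def forward(self, x, incidence):
        # x: (num_nodes, dim) node features
        # incidence: (2, num_incidences) pairs (node_idx, hyperedge_idx)
        node_idx, edge_idx = incidence
        num_edges = int(edge_idx.max()) + 1

        # node -> hyperedge messages, aggregated with a permutation-invariant sum
        msg = self.node_to_edge(x)[node_idx]
        edge_feat = torch.zeros(num_edges, x.size(1), device=x.device)
        edge_feat.index_add_(0, edge_idx, msg)

        # hyperedge -> node messages, combined with each node's own state
        back = edge_feat[edge_idx]
        out = torch.zeros_like(x)
        out.index_add_(0, node_idx, self.edge_to_node(torch.cat([x[node_idx], back], dim=-1)))
        return x + out  # residual update, loosely analogous to one diffusion step


# Toy usage: 4 nodes, 2 hyperedges {0, 1, 2} and {2, 3}
incidence = torch.tensor([[0, 1, 2, 2, 3],
                          [0, 0, 0, 1, 1]])
layer = StarExpansionLayer(dim=8)
h = layer(torch.randn(4, 8), incidence)
```

Stacking several such layers yields a deep model in which each layer refines node states using hyperedge-level aggregates, which is the general mechanism the abstract attributes to ED-HNN.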
