Paper Title
Extending Answer Set Programs with Neural Networks
Paper Authors
Paper Abstract
The integration of low-level perception with high-level reasoning is one of the oldest problems in Artificial Intelligence. Recently, several proposals have been made to implement the reasoning process in complex neural network architectures. While these works aim to extend neural networks with reasoning capabilities, a natural question we consider is: can we extend answer set programs with neural networks to allow complex, high-level reasoning on neural network outputs? As a preliminary result, we propose NeurASP -- a simple extension of answer set programs that embraces neural networks by treating their outputs as probability distributions over atomic facts in answer set programs. We show that NeurASP can not only improve the perception accuracy of a pre-trained neural network, but also help to train a neural network better by imposing restrictions expressed as logic rules. However, training with NeurASP takes much more time than pure neural network training because it uses a symbolic reasoning engine internally. For future work, we plan to investigate potential ways to address the scalability issue of NeurASP. One promising direction is to embed logic programs directly in neural networks. Along this route, we plan to first design a SAT solver using neural networks and then extend such a solver to handle logic programs.
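To make the abstract's central idea concrete -- treating neural network outputs as probability distributions over atomic facts -- the following is a minimal, solver-free Python sketch, not the NeurASP implementation (which internally runs a symbolic ASP reasoning engine). Two illustrative softmax vectors stand in for digit classifiers applied to two images; each complete assignment of digit values plays the role of an answer set, and the probability of a query such as "the two digits sum to 4" is the total probability of the assignments that satisfy it. All names and numbers below are hypothetical.

    import itertools

    import numpy as np

    # Illustrative softmax outputs of a digit classifier on two images
    # (hypothetical numbers; in NeurASP these would come from the network).
    p_img1 = np.array([0.01, 0.80, 0.05, 0.02, 0.02, 0.02, 0.02, 0.02, 0.02, 0.02])
    p_img2 = np.array([0.02, 0.02, 0.02, 0.70, 0.10, 0.04, 0.04, 0.02, 0.02, 0.02])

    def query_probability(distributions, holds):
        """Sum the probabilities of all value assignments (playing the role of
        answer sets in this toy setting) in which the query `holds` is true.
        An assignment's probability is the product of its atoms' probabilities."""
        total = 0.0
        for values in itertools.product(*(range(len(d)) for d in distributions)):
            weight = float(np.prod([d[v] for d, v in zip(distributions, values)]))
            if holds(values):
                total += weight
        return total

    # Probability that the two depicted digits sum to 4; with a real ASP solver,
    # the lambda would be replaced by logic rules over the neural digit atoms.
    print(query_probability([p_img1, p_img2], lambda vs: vs[0] + vs[1] == 4))

Roughly speaking, the negative log of such a query probability can also serve as a training loss whose gradient flows back into the classifiers, which is the sense in which logic rules restrict the networks during learning; this sketch omits that step.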