Paper title
On Representing (Anti)Symmetric Functions
Paper authors
Paper abstract
Permutation-invariant, -equivariant, and -covariant functions and anti-symmetric functions are important in quantum physics, computer vision, and other disciplines. Applications often require most or all of the following properties: (a) a large class of such functions can be approximated, e.g. all continuous functions, (b) only the (anti)symmetric functions can be represented, (c) a fast algorithm for computing the approximation, (d) the representation itself is continuous or differentiable, (e) the architecture is suitable for learning the function from data. (Anti)symmetric neural networks have recently been developed and applied with great success. A few theoretical approximation results have been proven, but many questions are still open, especially for particles in more than one dimension and the anti-symmetric case, which this work focusses on. More concretely, we derive natural polynomial approximations in the symmetric case, and approximations based on a single generalized Slater determinant in the anti-symmetric case. Unlike some previous super-exponential and discontinuous approximations, these seem to be a more promising basis for future tighter bounds. We provide a complete and explicit universality proof of the Equivariant MultiLayer Perceptron, which implies universality of symmetric MLPs and the FermiNet.
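The two symmetry types discussed in the abstract can be illustrated concretely. Below is a minimal sketch (not from the paper itself): power sums as a toy stand-in for the polynomial approximations of symmetric functions, and a plain Slater determinant of hypothetical single-particle orbitals for the anti-symmetric case. The paper's *generalized* Slater determinant is richer, allowing orbitals that depend on all particles in a permutation-equivariant way; this sketch only demonstrates the defining invariance and sign-flip properties.

```python
import numpy as np

def sym_features(X, max_deg=3):
    """Power sums sum_i x_i^k for k = 1..max_deg: invariant under any
    permutation of the particles (toy symmetric polynomial basis)."""
    return np.array([np.sum(X**k) for k in range(1, max_deg + 1)])

def slater(orbitals, X):
    """Anti-symmetric function via a Slater determinant.
    orbitals: list of n single-particle functions phi_j.
    X: array of n particle coordinates.
    Swapping two particles swaps two rows of M, flipping det's sign."""
    n = len(X)
    M = np.array([[orb(X[i]) for orb in orbitals] for i in range(n)])
    return np.linalg.det(M)

# Three toy polynomial orbitals and three 1-d particle positions
# (hypothetical example data, chosen only for illustration).
orbitals = [lambda x: 1.0, lambda x: x, lambda x: x**2]
X = np.array([0.3, -1.2, 2.0])

# Symmetric: features unchanged under a cyclic permutation of particles.
assert np.allclose(sym_features(X), sym_features(X[[2, 0, 1]]))

# Anti-symmetric: exchanging particles 0 and 1 flips the sign.
assert np.isclose(slater(orbitals, X), -slater(orbitals, X[[1, 0, 2]]))
```

The sign flip holds for any transposition, and hence any odd permutation, which is exactly the anti-symmetry property the paper's approximations must preserve by construction (property (b) in the abstract).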