Title
On neural network kernels and the storage capacity problem
Authors
Abstract
In this short note, we reify the connection between work on the storage capacity problem in wide two-layer treelike neural networks and the rapidly-growing body of literature on kernel limits of wide neural networks. Concretely, we observe that the "effective order parameter" studied in the statistical mechanics literature is exactly equivalent to the infinite-width Neural Network Gaussian Process Kernel. This correspondence connects the expressivity and trainability of wide two-layer neural networks.
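As an illustration of the kernel the abstract refers to (this sketch is not from the paper itself): for a two-layer network with ReLU activations, the infinite-width NNGP kernel is the expectation E_{w~N(0,I)}[relu(w·x) relu(w·x')], which has a known closed form, the order-1 arc-cosine kernel of Cho & Saul. The snippet below compares that closed form against a wide finite network's hidden-layer overlap; function names and the choice of width are illustrative.

```python
import numpy as np

def relu_nngp_exact(x, xp):
    # Closed form of E_{w ~ N(0, I)}[relu(w.x) relu(w.x')]:
    # (||x|| ||x'|| / (2*pi)) * (sin(theta) + (pi - theta) * cos(theta)),
    # where theta is the angle between x and x' (arc-cosine kernel, order 1).
    nx, nxp = np.linalg.norm(x), np.linalg.norm(xp)
    cos_t = np.clip(x @ xp / (nx * nxp), -1.0, 1.0)
    theta = np.arccos(cos_t)
    return nx * nxp / (2 * np.pi) * (np.sin(theta) + (np.pi - theta) * cos_t)

def relu_nngp_mc(x, xp, n_hidden=200_000, seed=0):
    # Monte Carlo estimate: the hidden-layer overlap of a width-n_hidden
    # two-layer ReLU network with i.i.d. standard Gaussian input weights.
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((n_hidden, x.shape[0]))
    h, hp = np.maximum(W @ x, 0.0), np.maximum(W @ xp, 0.0)
    return h @ hp / n_hidden

rng = np.random.default_rng(1)
x, xp = rng.standard_normal(5), rng.standard_normal(5)
exact, mc = relu_nngp_exact(x, xp), relu_nngp_mc(x, xp)
print(exact, mc)  # the finite-width estimate approaches the closed form
```

At width 200,000 the two values typically agree to within about a percent; the statistical-mechanics "effective order parameter" discussed in the note is exactly this large-width limit of the hidden-layer overlap.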