Paper Title
Self-Organized Operational Neural Networks with Generative Neurons
Paper Authors
Paper Abstract
Operational Neural Networks (ONNs) have recently been proposed to address the well-known limitations and drawbacks of conventional Convolutional Neural Networks (CNNs), such as network homogeneity with the sole linear neuron model. ONNs are heterogeneous networks with a generalized neuron model that can encapsulate any set of non-linear operators to boost diversity and to learn highly complex and multi-modal functions or spaces with minimal network complexity and training data. However, the Greedy Iterative Search (GIS) method used to find the optimal operators in ONNs requires many training sessions to find a single operator set per layer. This is not only computationally demanding, but it also limits network heterogeneity, since the same set of operators is then used for all neurons in each layer. Moreover, the performance of ONNs directly depends on the operator set library used, which introduces a certain risk of performance degradation, especially when the optimal operator set required for a particular task is missing from the library. In order to address these issues and achieve the ultimate level of heterogeneity to boost network diversity along with computational efficiency, in this study we propose Self-organized ONNs (Self-ONNs) with generative neurons that have the ability to adapt (optimize) the nodal operator of each connection during the training process. Therefore, Self-ONNs can achieve the utmost level of heterogeneity required by the learning problem at hand. Moreover, this ability obviates the need for a fixed operator set library and for a prior operator search within the library to find the best possible set of operators. We further formulate the training method to back-propagate the error through the operational layers of Self-ONNs.
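To make the idea of a generative neuron concrete, below is a minimal Python (numpy) sketch, assuming the per-connection nodal operator is realized as a learnable polynomial (Maclaurin-style) expansion of the input, so the usual linear product w*y of a CNN neuron becomes w_1*y + w_2*y^2 + ... + w_Q*y^Q with trainable coefficients per order. The function name, shapes, and 1-D setting are illustrative assumptions, not the authors' implementation.

import numpy as np

def generative_conv1d(x, weights, bias=0.0):
    """Slide a generative-neuron kernel over a 1-D signal.

    x       : (N,) input signal
    weights : (Q, K) one K-tap kernel per polynomial order q = 1..Q,
              so each connection applies a learnable Q-term polynomial
              of its input instead of a fixed linear product.
    """
    Q, K = weights.shape
    out_len = len(x) - K + 1
    y = np.full(out_len, bias, dtype=float)
    for i in range(out_len):
        patch = x[i:i + K]
        # Sum the polynomial response of every tap in the receptive field.
        for q in range(1, Q + 1):
            y[i] += np.dot(weights[q - 1], patch ** q)
    return y

# Toy usage: Q = 3 polynomial terms, K = 5 taps.
rng = np.random.default_rng(0)
x = rng.standard_normal(32)
w = 0.1 * rng.standard_normal((3, 5))
print(generative_conv1d(x, w).shape)  # (28,)

Since the output is differentiable in the coefficients (the gradient with respect to weights[q-1] is simply patch ** q), such an operator can be optimized by standard back-propagation, which is consistent with the training formulation the abstract describes; setting Q = 1 recovers an ordinary convolutional neuron.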