Paper Title

Activation Functions: Do They Represent A Trade-Off Between Modular Nature of Neural Networks And Task Performance

Paper Authors

Himanshu Pradeep Aswani, Amit Sethi

Paper Abstract

Current research suggests that the key factors in designing neural network architectures involve choosing the number of filters for every convolutional layer, the number of hidden neurons for every fully connected layer, dropout, and pruning. The default activation function in most cases is the ReLU, as it has empirically shown faster training convergence. We explore whether ReLU is the best choice if one desires a more modular structure within a neural network.
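
As a minimal illustration of the design choice the abstract raises, the Python sketch below (using PyTorch) treats the activation function as a swappable parameter of an otherwise identical network, which is the kind of ReLU-versus-alternatives comparison the paper motivates. The helper name make_mlp and all layer dimensions are hypothetical; this is an assumption-laden sketch, not the authors' experimental setup.

import torch.nn as nn

def make_mlp(in_dim, hidden_dim, out_dim, activation=nn.ReLU):
    # Build a small fully connected network whose activation function
    # is a constructor argument, so the same architecture can be
    # instantiated with different nonlinearities for comparison.
    return nn.Sequential(
        nn.Linear(in_dim, hidden_dim),
        activation(),
        nn.Linear(hidden_dim, hidden_dim),
        activation(),
        nn.Linear(hidden_dim, out_dim),
    )

# The same architecture with two different activation choices
# (dimensions here are arbitrary, e.g. MNIST-sized inputs):
relu_net = make_mlp(784, 256, 10, activation=nn.ReLU)
tanh_net = make_mlp(784, 256, 10, activation=nn.Tanh)

Under this setup, only the nonlinearity differs between the two models, so any difference in trained structure or task performance can be attributed to the activation function alone.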
