Paper Title

Convolutional Network Fabric Pruning With Label Noise

Paper Authors

Ilias Benjelloun, Bart Lamiroy, Efoevi Koudou

Abstract

This paper presents an iterative pruning strategy for Convolutional Network Fabrics (CNF) in the presence of noisy training and testing data. With the continuous increase in the size of neural network models, various authors have developed pruning approaches to build more compact network structures that require fewer resources while preserving performance. As we show in this paper, because of their intrinsic structure and function, Convolutional Network Fabrics are ideal candidates for pruning. We present a series of pruning strategies that can significantly reduce both the final network size and the required training time by pruning either entire convolutional filters or individual weights, so that the grid remains visually understandable while overall execution quality stays within controllable boundaries. Our approach can be applied iteratively during training so that network complexity decreases rapidly, saving computational time. The paper addresses both data-dependent and data-independent strategies, and also experimentally establishes the most efficient approaches when training or testing data contain annotation errors.
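To make the filter-level pruning idea from the abstract concrete, below is a minimal, hypothetical sketch of one common data-independent criterion: dropping the filters with the smallest L1 norm. The function names and the L1 criterion are illustrative assumptions for this sketch; the paper's actual pruning strategies and scoring criteria may differ.

```python
# Hypothetical sketch of magnitude-based (L1-norm) filter pruning,
# one possible data-independent strategy of the kind described above.
# All names and the pruning criterion are illustrative assumptions.

def filter_l1_norm(f):
    """L1 norm of one filter, given as a (possibly nested) list of weights."""
    if isinstance(f, list):
        return sum(filter_l1_norm(v) for v in f)
    return abs(f)

def prune_filters(filters, fraction):
    """Remove the `fraction` of filters with the smallest L1 norm.

    In an iterative scheme this would be called once per pruning round,
    interleaved with further training of the remaining filters.
    """
    norms = [filter_l1_norm(f) for f in filters]
    k = int(len(filters) * fraction)  # number of filters to drop
    # Indices sorted by ascending norm; keep everything past the first k.
    keep = sorted(range(len(filters)), key=lambda i: norms[i])[k:]
    return [filters[i] for i in sorted(keep)]

# Example: a layer with four 2x2 filters; prune the weakest 50%.
layer = [
    [[0.1, -0.1], [0.0, 0.05]],    # small-magnitude filter
    [[1.0, -2.0], [0.5, 1.5]],
    [[0.01, 0.0], [0.02, -0.01]],  # near-zero filter
    [[-1.2, 0.8], [2.0, -0.3]],
]
pruned = prune_filters(layer, 0.5)
print(len(pruned))  # 2 filters remain
```

Weight-level (unstructured) pruning would instead zero out individual entries below a magnitude threshold; filter-level pruning as sketched here has the practical advantage that whole channels disappear, directly shrinking the network and its training cost.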
