Paper Title

UniformAugment: A Search-free Probabilistic Data Augmentation Approach

Paper Authors

Tom Ching LingChen, Ava Khonsari, Amirreza Lashkari, Mina Rafi Nazari, Jaspreet Singh Sambee, Mario A. Nascimento

Paper Abstract

Augmenting training datasets has been shown to improve the learning effectiveness for several computer vision tasks. A good augmentation produces an augmented dataset that adds variability while retaining the statistical properties of the original dataset. Some techniques, such as AutoAugment and Fast AutoAugment, have introduced a search phase to find a set of suitable augmentation policies for a given model and dataset. This comes at the cost of great computational overhead, adding up to several thousand GPU hours. More recently, RandAugment was proposed to substantially speed up the search phase by approximating the search space with a couple of hyperparameters, though it still incurs a non-negligible cost for tuning them. In this paper we show that, under the assumption that the augmentation space is approximately distribution invariant, uniform sampling over the continuous space of augmentation transformations is sufficient to train highly effective models. Based on that result, we propose UniformAugment, an automated data augmentation approach that completely avoids a search phase. In addition to discussing the theoretical underpinning supporting our approach, we use standard datasets and established image classification models to show that UniformAugment's effectiveness is comparable to the aforementioned methods, while still being highly efficient by virtue of not requiring any search.
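
The sampling scheme described in the abstract is simple enough to sketch directly. Below is a minimal illustration in Python using PIL: for each of `num_ops` slots it draws an operation, an application probability, and a magnitude, all uniformly at random, with no tuned hyperparameters. The operation set, the magnitude ranges, and the default of two operations per image are assumptions chosen for this example, not the exact configuration from the paper.

```python
import random
from PIL import Image, ImageEnhance, ImageOps

# Illustrative operation set (an assumption, not the paper's exact list).
# Each op maps a uniform magnitude m in [0, 1) onto a concrete transform range.
OPS = [
    lambda img, m: ImageEnhance.Brightness(img).enhance(0.5 + m),  # factor in [0.5, 1.5)
    lambda img, m: ImageEnhance.Contrast(img).enhance(0.5 + m),    # factor in [0.5, 1.5)
    lambda img, m: ImageEnhance.Sharpness(img).enhance(0.5 + m),   # factor in [0.5, 1.5)
    lambda img, m: img.rotate(60.0 * m - 30.0),                    # angle in [-30, 30) degrees
    lambda img, m: ImageOps.solarize(img, int(255 * (1.0 - m))),   # threshold in (0, 255]
    lambda img, m: ImageOps.posterize(img, 4 + int(4 * m)),        # keep 4..7 bits per channel
]

def uniform_augment(img: Image.Image, num_ops: int = 2) -> Image.Image:
    """Apply `num_ops` uniformly sampled transformations, each with a
    uniformly sampled application probability and magnitude."""
    for _ in range(num_ops):
        op = random.choice(OPS)             # operation ~ uniform over the op set
        p = random.random()                 # application probability ~ U(0, 1)
        if random.random() < p:             # apply the chosen op with probability p
            img = op(img, random.random())  # magnitude ~ U(0, 1)
    return img
```

Because nothing in this transform is fitted to a particular dataset or model, it can be dropped into a training pipeline as-is, which is the source of the method's efficiency claim relative to search-based approaches.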
