Paper Title
Semi-supervised Task-driven Data Augmentation for Medical Image Segmentation
Paper Authors
Paper Abstract
Supervised learning-based segmentation methods typically require a large amount of annotated training data to generalize well at test time. In medical applications, curating such datasets is not a favourable option, because acquiring a large number of annotated samples from experts is time-consuming and expensive. Consequently, numerous methods have been proposed in the literature for learning with limited annotated examples. Unfortunately, these approaches have not yet yielded significant gains over random data augmentation for image segmentation, and random augmentations themselves do not yield high accuracy. In this work, we propose a novel task-driven data augmentation method for learning with limited labeled data, in which the synthetic data generator is optimized for the segmentation task. The generator models intensity and shape variations using two sets of transformations: additive intensity transformations and deformation fields. Both transformations are optimized using labeled as well as unlabeled examples in a semi-supervised framework. Our experiments on three medical datasets, namely cardiac, prostate and pancreas, show that the proposed approach significantly outperforms standard augmentation and semi-supervised approaches for image segmentation in the limited annotation setting. The code is publicly available at https://github.com/krishnabits001/task_driven_data_augmentation.
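To make the two kinds of transformations concrete, the sketch below applies a smooth additive intensity field and a smooth random deformation field to a 2D image with numpy/scipy. This is a minimal, random-parameter illustration of the transformation types only; in the paper these transformations are produced by learned generators optimized for the segmentation task, and all function names and parameter values here (`augment`, `intensity_scale`, `deform_scale`, `smooth`) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, map_coordinates

def augment(image, rng, intensity_scale=0.3, deform_scale=5.0, smooth=8.0):
    """Illustrative sketch: additive intensity transformation followed by
    a smooth deformation field. Not the learned generators of the paper."""
    h, w = image.shape
    # Additive intensity transformation: a smooth random field added pixel-wise.
    delta_i = gaussian_filter(rng.standard_normal((h, w)), smooth) * intensity_scale
    out = image + delta_i
    # Deformation field: smooth random per-pixel displacements (dx, dy).
    dx = gaussian_filter(rng.standard_normal((h, w)), smooth) * deform_scale
    dy = gaussian_filter(rng.standard_normal((h, w)), smooth) * deform_scale
    ys, xs = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    coords = np.stack([ys + dy, xs + dx])
    # Resample the image at the displaced coordinates (bilinear interpolation).
    return map_coordinates(out, coords, order=1, mode="reflect")

rng = np.random.default_rng(0)
img = rng.random((64, 64))
aug = augment(img, rng)
```

In the proposed method, the random fields above would instead be the outputs of conditional generator networks, and the same deformation field would be applied to the segmentation mask so that image-label pairs stay consistent.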