Paper Title
Deep Neural Network Pruning for Nuclei Instance Segmentation in Hematoxylin & Eosin-Stained Histological Images
Paper Authors
Paper Abstract
Recently, pruning deep neural networks (DNNs) has received a lot of attention for improving accuracy and generalization power, reducing network size, and increasing inference speed on specialized hardware. Although pruning has mainly been tested on computer vision tasks, its application in the context of medical image analysis has hardly been explored. This work investigates the impact of well-known pruning techniques, namely layer-wise and network-wide magnitude pruning, on nuclei instance segmentation performance in histological images. Our instance segmentation model consists of two main branches: (1) a semantic segmentation branch, and (2) a deep regression branch. We investigate the impact of weight pruning on the performance of both branches separately and on the final nuclei instance segmentation result. Evaluated on two publicly available datasets, our results show that layer-wise pruning delivers slightly better performance than network-wide pruning for small compression ratios (CRs), while for large CRs, network-wide pruning yields superior performance. For semantic segmentation, deep regression, and final instance segmentation, 93.75%, 95%, and 80% of the model weights can be pruned by layer-wise pruning with less than a 2% reduction in the performance of the respective models.
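To make the distinction between the two pruning strategies concrete, the following is a minimal sketch using PyTorch's `torch.nn.utils.prune` utilities. The small stand-in CNN, the 80% compression ratio, and helper names such as `make_model` and `conv_params` are illustrative assumptions only; this is not the paper's two-branch segmentation/regression model or its training pipeline.

```python
import torch.nn as nn
import torch.nn.utils.prune as prune


def make_model():
    # Hypothetical stand-in CNN; the paper's model is a two-branch
    # architecture (semantic segmentation + deep regression), not shown here.
    return nn.Sequential(
        nn.Conv2d(3, 16, kernel_size=3, padding=1),
        nn.ReLU(),
        nn.Conv2d(16, 32, kernel_size=3, padding=1),
        nn.ReLU(),
        nn.Conv2d(32, 1, kernel_size=1),
    )


def conv_params(model):
    # (module, parameter name) pairs eligible for magnitude pruning.
    return [(m, "weight") for m in model.modules() if isinstance(m, nn.Conv2d)]


def report_sparsity(model, label):
    for i, (module, _) in enumerate(conv_params(model)):
        zeros = float((module.weight == 0).sum())
        print(f"{label} conv {i}: {zeros / module.weight.numel():.2%} pruned")


amount = 0.80  # fraction of weights removed, e.g. a CR of 80%

# Layer-wise magnitude pruning: each layer independently removes its own
# smallest-magnitude weights, so every layer ends up with the same sparsity.
layerwise_model = make_model()
for module, name in conv_params(layerwise_model):
    prune.l1_unstructured(module, name=name, amount=amount)
report_sparsity(layerwise_model, "layer-wise")

# Network-wide (global) magnitude pruning: weights from all layers are ranked
# together, so per-layer sparsity can vary while the overall CR stays the same.
global_model = make_model()
prune.global_unstructured(
    conv_params(global_model), pruning_method=prune.L1Unstructured, amount=amount
)
report_sparsity(global_model, "network-wide")
```

In this sketch the layer-wise variant prints identical sparsity for every convolution, whereas the network-wide variant typically concentrates pruning in the larger layers; the abstract's comparison of the two strategies across CRs follows this same distinction.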