Paper Title
MultiCheXNet: A Multi-Task Learning Deep Network For Pneumonia-like Diseases Diagnosis From X-ray Scans
Paper Authors
Paper Abstract
We present MultiCheXNet, an end-to-end multi-task learning model that is able to take advantage of different X-ray data sets of pneumonia-like diseases in one neural architecture, performing three tasks at the same time: diagnosis, segmentation, and localization. The common encoder in our architecture can capture useful common features present across the different tasks. The common encoder has the additional advantage of efficient computation, which speeds up inference time compared to separate models. The specialized decoder heads can then capture task-specific features. We employ teacher forcing to address the issue of negative samples that hurt segmentation and localization performance. Finally, we employ transfer learning to fine-tune the classifier on unseen pneumonia-like diseases. The MTL architecture can be trained on jointly or disjointly labeled data sets. The training of the architecture follows a carefully designed protocol that pre-trains the different sub-models on specialized data sets before they are integrated into the joint MTL model. Our experimental setup involves a variety of data sets, where the baseline performance on the 3 tasks is compared to the MTL architecture's performance. Moreover, we evaluate the transfer learning mode on a COVID-19 data set, both from the individual classifier model and from the MTL architecture's classification head.
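The core architectural idea in the abstract — one shared encoder computed once per image, feeding three specialized decoder heads (diagnosis, segmentation, localization) — can be sketched in a few lines. This is a minimal toy illustration, not the paper's actual network: the encoder here is a single fixed linear projection rather than a deep CNN, and all dimensions and head designs are hypothetical placeholders chosen only to show the shared-feature / multi-head data flow.

```python
import numpy as np

FEAT_DIM = 64  # hypothetical shared-feature size, for illustration only

def shared_encoder(x):
    """Toy stand-in for the shared encoder. In the paper this is a deep
    network; a fixed linear map suffices to show that the expensive
    feature extraction runs once and is reused by every head."""
    W = np.full((x.shape[-1], FEAT_DIM), 0.01)  # fixed weights, deterministic
    return np.tanh(x @ W)

def classifier_head(feats):
    # Diagnosis head: one logit per image (toy reduction).
    return feats.mean(axis=-1, keepdims=True)

def segmentation_head(feats):
    # Segmentation head: a coarse 8x8 mask per image (toy decoder).
    return feats[:, :64].reshape(-1, 8, 8)

def detection_head(feats):
    # Localization head: one (x, y, w, h) box per image (toy regressor).
    return feats[:, :4]

def multi_task_forward(x):
    feats = shared_encoder(x)  # computed ONCE, shared by all three heads
    return (classifier_head(feats),
            segmentation_head(feats),
            detection_head(feats))

# Two fake flattened "X-ray" inputs of length 128.
batch = np.ones((2, 128))
logits, masks, boxes = multi_task_forward(batch)
print(logits.shape, masks.shape, boxes.shape)  # (2, 1) (2, 8, 8) (2, 4)
```

The efficiency claim in the abstract corresponds to calling `shared_encoder` a single time per input instead of once per task; the per-head decoders are small relative to the encoder, which is what speeds up joint inference compared to three separate models.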