Paper Title

Automatic CAD-RADS Scoring Using Deep Learning

Paper Authors

Denzinger, Felix, Wels, Michael, Breininger, Katharina, Gülsün, Mehmet A., Schöbinger, Max, André, Florian, Buß, Sebastian, Görich, Johannes, Sühling, Michael, Maier, Andreas

Paper Abstract

Coronary CT angiography (CCTA) has established its role as a non-invasive modality for the diagnosis of coronary artery disease (CAD). The CAD-Reporting and Data System (CAD-RADS) has been developed to standardize communication and aid in decision making based on CCTA findings. The CAD-RADS score is determined by manual assessment of all coronary vessels and the grading of lesions within the coronary artery tree. We propose a bottom-up approach for fully-automated prediction of this score using deep-learning operating on a segment-wise representation of the coronary arteries. The method relies solely on a prior fully-automated centerline extraction and segment labeling and predicts the segment-wise stenosis degree and the overall calcification grade as auxiliary tasks in a multi-task learning setup. We evaluate our approach on a data collection consisting of 2,867 patients. On the task of identifying patients with a CAD-RADS score indicating the need for further invasive investigation our approach reaches an area under curve (AUC) of 0.923 and an AUC of 0.914 for determining whether the patient suffers from CAD. This level of performance enables our approach to be used in a fully-automated screening setup or to assist diagnostic CCTA reading, especially due to its neural architecture design -- which allows comprehensive predictions.
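
The abstract describes a multi-task setup in which a network operating on a segment-wise representation of the coronary arteries predicts the patient-level CAD-RADS score together with the segment-wise stenosis degree and an overall calcification grade as auxiliary tasks. The sketch below is a minimal illustration of such a shared-encoder, multi-head design in PyTorch; the layer sizes, class counts, per-segment feature inputs, and max-pooling aggregation are assumptions for illustration, not the architecture from the paper.

```python
# A minimal sketch of a shared-encoder multi-task network, assuming
# per-segment feature vectors as input; layer sizes, class counts and the
# max-pooling aggregation are illustrative choices, not the paper's design.
import torch
import torch.nn as nn

class MultiTaskCADRADSNet(nn.Module):
    def __init__(self, feat_dim=128, n_cadrads_classes=6,
                 n_stenosis_grades=5, n_calcium_grades=4):
        super().__init__()
        # Shared encoder applied independently to every coronary segment.
        self.segment_encoder = nn.Sequential(
            nn.Linear(feat_dim, 256), nn.ReLU(),
            nn.Linear(256, 128), nn.ReLU(),
        )
        # Auxiliary task: per-segment stenosis degree.
        self.stenosis_head = nn.Linear(128, n_stenosis_grades)
        # Patient-level tasks operate on aggregated segment features.
        self.cadrads_head = nn.Linear(128, n_cadrads_classes)
        self.calcium_head = nn.Linear(128, n_calcium_grades)

    def forward(self, x):
        # x: (batch, n_segments, feat_dim) per-segment representations.
        seg_feat = self.segment_encoder(x)                 # (B, S, 128)
        stenosis_logits = self.stenosis_head(seg_feat)     # (B, S, grades)
        patient_feat = seg_feat.max(dim=1).values          # bottom-up pooling
        return (self.cadrads_head(patient_feat),
                self.calcium_head(patient_feat),
                stenosis_logits)

# Hypothetical training step: the total loss is a weighted sum of
# cross-entropies over the three tasks (weights chosen arbitrarily here).
model = MultiTaskCADRADSNet()
x = torch.randn(2, 24, 128)                    # 2 patients, 24 segments each
cadrads_logits, calcium_logits, stenosis_logits = model(x)
cadrads_y = torch.randint(0, 6, (2,))
calcium_y = torch.randint(0, 4, (2,))
stenosis_y = torch.randint(0, 5, (2, 24))
ce = nn.CrossEntropyLoss()
loss = (ce(cadrads_logits, cadrads_y)
        + 0.5 * ce(calcium_logits, calcium_y)
        + 0.5 * ce(stenosis_logits.reshape(-1, 5), stenosis_y.reshape(-1)))
loss.backward()
```

Aggregating per-segment features into a patient-level representation mirrors the bottom-up strategy described in the abstract, where the segment-wise predictions also keep the final score interpretable.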
