Paper Title

NEEDED: Introducing Hierarchical Transformer to Eye Diseases Diagnosis

Paper Authors

Xu Ye, Meng Xiao, Zhiyuan Ning, Weiwei Dai, Wenjuan Cui, Yi Du, Yuanchun Zhou

Paper Abstract

With the development of natural language processing (NLP) techniques, automatic diagnosis of eye diseases using ophthalmology electronic medical records (OEMR) has become possible. It aims to evaluate the condition of each of a patient's eyes separately, and we formulate it as a particular multi-label classification task in this paper. Although there are a few related studies on other diseases, automatic diagnosis of eye diseases exhibits unique characteristics. First, descriptions of both eyes are mixed up in OEMR documents, with both free text and templated asymptomatic descriptions, resulting in sparsity and clutter of information. Second, OEMR documents contain multiple parts of descriptions and have long document lengths. Third, it is critical to provide explainability to the disease diagnosis model. To overcome these challenges, we present NEEDED, an effective automatic eye disease diagnosis framework. In this framework, a preprocessing module is integrated to improve the density and quality of information. Then, we design a hierarchical transformer structure for learning contextualized representations of each sentence in the OEMR document. For the diagnosis part, we propose an attention-based predictor that enables traceable diagnosis by obtaining disease-specific information. Experiments on a real-world dataset and comparisons with several baseline models show the advantages and explainability of our framework.
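
To make the architecture described in the abstract more concrete, below is a minimal, hypothetical PyTorch sketch of the general idea: a two-level (sentence-level then document-level) transformer encoder, followed by a predictor in which one learnable query per disease attends over the sentence representations. This is not the authors' implementation; all module names, dimensions, the mean-pooling step, and the toy input sizes are assumptions added purely for illustration.

```python
# Illustrative sketch only (not the NEEDED paper's code).
# Assumed components: token embedding, sentence-level and document-level
# transformer encoders, and a per-disease attention predictor.
import torch
import torch.nn as nn


class HierarchicalEncoder(nn.Module):
    """Sentence-level encoder followed by a document-level encoder."""

    def __init__(self, vocab_size, d_model=128, nhead=4, num_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model, padding_idx=0)
        sent_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        doc_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.sent_encoder = nn.TransformerEncoder(sent_layer, num_layers)
        self.doc_encoder = nn.TransformerEncoder(doc_layer, num_layers)

    def forward(self, token_ids):
        # token_ids: (batch, num_sentences, num_tokens)
        b, s, t = token_ids.shape
        x = self.embed(token_ids.view(b * s, t))      # (b*s, t, d)
        x = self.sent_encoder(x).mean(dim=1)          # pool tokens -> sentence vectors
        x = self.doc_encoder(x.view(b, s, -1))        # contextualize sentences
        return x                                      # (b, s, d)


class AttentionPredictor(nn.Module):
    """One learnable query per disease attends over sentence representations,
    giving disease-specific evidence vectors and traceable attention weights."""

    def __init__(self, d_model, num_diseases):
        super().__init__()
        self.queries = nn.Parameter(torch.randn(num_diseases, d_model))
        self.attn = nn.MultiheadAttention(d_model, num_heads=1, batch_first=True)
        self.classifier = nn.Linear(d_model, 1)

    def forward(self, sent_repr):
        # sent_repr: (batch, num_sentences, d_model)
        q = self.queries.unsqueeze(0).expand(sent_repr.size(0), -1, -1)
        evidence, weights = self.attn(q, sent_repr, sent_repr)
        logits = self.classifier(evidence).squeeze(-1)  # (batch, num_diseases)
        return logits, weights                          # weights support traceability


if __name__ == "__main__":
    # Toy forward pass: 2 documents, 6 sentences each, 12 tokens per sentence.
    encoder = HierarchicalEncoder(vocab_size=1000)
    predictor = AttentionPredictor(d_model=128, num_diseases=5)
    tokens = torch.randint(1, 1000, (2, 6, 12))
    logits, attn = predictor(encoder(tokens))
    print(logits.shape, attn.shape)  # torch.Size([2, 5]) torch.Size([2, 5, 6])
```

In a sketch like this, the per-disease attention weights returned by the predictor are what would let a diagnosis be traced back to specific sentences in the OEMR document, mirroring the explainability claim in the abstract.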
