Paper Title
Comparison of biomedical relationship extraction methods and models for knowledge graph creation
Paper Authors
Abstract
Biomedical research is growing at such an exponential pace that scientists, researchers, and practitioners are no longer able to cope with the amount of literature published in the domain. The knowledge presented in the literature needs to be systematized in such a way that claims and hypotheses can be easily found, accessed, and validated. Knowledge graphs can provide such a framework for semantic knowledge representation from literature. However, in order to build a knowledge graph, it is necessary to extract knowledge as relationships between biomedical entities and to normalize both entities and relationship types. In this paper, we present and compare several rule-based and machine-learning-based methods (Naive Bayes and Random Forests as examples of traditional machine learning, and DistilBERT-, PubMedBERT-, T5-, and SciFive-based models as examples of modern deep-learning transformers) for scalable relationship extraction from biomedical literature and for integration into knowledge graphs. We examine how resilient these methods are to unbalanced and fairly small datasets. Our experiments show that transformer-based models handle both small (owing to pre-training on large datasets) and unbalanced datasets well. The best-performing model was the PubMedBERT-based model fine-tuned on balanced data, with a reported F1-score of 0.92. The DistilBERT-based model followed with an F1-score of 0.89, while running faster and with lower resource requirements. BERT-based models performed better than T5-based generative models.
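As a concrete illustration of the rule-based approach mentioned in the abstract, the sketch below shows pattern-based relation extraction over a single sentence. The trigger verbs, the normalized relation labels, and the `extract_relation` helper are illustrative assumptions for this sketch, not the paper's actual rule set.

```python
import re

# Illustrative rules only: map trigger verbs to normalized relation types.
# The verb lists and relation names are assumptions, not the paper's rules.
RULES = [
    (re.compile(r"(\w[\w-]*)\s+(?:inhibits|suppresses)\s+(.+)", re.I), "INHIBITS"),
    (re.compile(r"(\w[\w-]*)\s+(?:activates|upregulates)\s+(.+)", re.I), "ACTIVATES"),
]

def extract_relation(sentence):
    """Return a (head_entity, relation, tail_entity) triple, or None."""
    for pattern, relation in RULES:
        match = pattern.search(sentence)
        if match:
            # The extracted triple can be loaded into a knowledge graph
            # as an edge between two normalized entity nodes.
            return (match.group(1), relation, match.group(2).rstrip("."))
    return None

print(extract_relation("Aspirin inhibits COX-1 activity."))
# -> ('Aspirin', 'INHIBITS', 'COX-1 activity')
```

Such hand-written patterns are transparent and cheap to run, which is why rule-based methods serve as a baseline against the transformer models compared in the paper, but they generalize poorly to unseen phrasings.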