Title
Evidence-aware Fake News Detection with Graph Neural Networks
Authors
Abstract
The prevalence and perniciousness of fake news has been a critical issue on the Internet, which in turn stimulates the development of automatic fake news detection. In this paper, we focus on evidence-based fake news detection, where several pieces of evidence are utilized to probe the veracity of news (i.e., a claim). Most previous methods first employ sequential models to embed the semantic information and then capture the claim-evidence interaction via different attention mechanisms. Despite their effectiveness, they still suffer from two main weaknesses. Firstly, due to the inherent drawbacks of sequential models, they fail to integrate relevant information that is scattered far apart in evidence for veracity checking. Secondly, they neglect the redundant information contained in evidence, which may be useless or even harmful. To solve these problems, we propose a unified Graph-based sEmantic sTructure mining framework, GET for short. Specifically, unlike existing work that treats claims and evidence as sequences, we model them as graph-structured data and capture the long-distance semantic dependency among dispersed relevant snippets via neighborhood propagation. After obtaining contextual semantic information, our model reduces information redundancy by performing graph structure learning. Finally, the fine-grained semantic representations are fed into the downstream claim-evidence interaction module for prediction. Comprehensive experiments have demonstrated the superiority of GET over state-of-the-art methods.
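The neighborhood propagation mentioned in the abstract can be illustrated with a minimal numpy sketch: word nodes in a claim or evidence graph aggregate features from their graph neighbors, so snippets that are far apart in the original sequence can still exchange information in one hop. This is a generic message-passing sketch for illustration only, not the paper's actual GET architecture; the graph, embedding size, and weight matrix are hypothetical.

```python
import numpy as np

def propagate(adj: np.ndarray, feats: np.ndarray, weight: np.ndarray) -> np.ndarray:
    """One round of neighborhood propagation: each node averages
    its own and its neighbors' features, projects them, and applies
    a ReLU nonlinearity (a standard GCN-style update)."""
    a_hat = adj + np.eye(adj.shape[0])           # add self-loops
    norm = a_hat / a_hat.sum(axis=1, keepdims=True)  # row-normalize
    return np.maximum(norm @ feats @ weight, 0)  # aggregate, project, ReLU

# Toy evidence graph: 4 word nodes, edges linking co-occurring words.
adj = np.array([[0, 1, 0, 1],
                [1, 0, 1, 0],
                [0, 1, 0, 0],
                [1, 0, 0, 0]], dtype=float)
rng = np.random.default_rng(0)
feats = rng.normal(size=(4, 8))    # initial word embeddings (dim 8)
weight = rng.normal(size=(8, 8))   # projection matrix (random here, learned in practice)

out = propagate(adj, feats, weight)
print(out.shape)  # one propagation step keeps the node-feature shape
```

Stacking several such rounds lets information travel across multi-hop paths, which is how dispersed relevant snippets become linked despite long sequential distance.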