Paper Title
Red Dragon AI at TextGraphs 2020 Shared Task: LIT : LSTM-Interleaved Transformer for Multi-Hop Explanation Ranking
Paper Authors
Paper Abstract
Explainable question answering for science questions is a challenging task that requires multi-hop inference over a large set of fact sentences. To counter the limitations of methods that view each query-document pair in isolation, we propose the LSTM-Interleaved Transformer (LIT), which incorporates cross-document interactions for improved multi-hop ranking. The LIT architecture can leverage prior ranking positions in the re-ranking setting. Our model is competitive on the current leaderboard for the TextGraphs 2020 shared task, achieving a test-set MAP of 0.5607, and would have gained third place had we submitted before the competition deadline. Our code implementation is made available at https://github.com/mdda/worldtree_corpus/tree/textgraphs_2020.
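The abstract describes the architecture only at a high level; the following is a minimal, hypothetical PyTorch sketch of one way such cross-document re-ranking could be wired up, not the authors' implementation (see the linked repository for that). The class name CrossDocReRanker, all dimensions, the first-token pooling, and the bidirectional LSTM placement are illustrative assumptions.

# A minimal sketch of the cross-document re-ranking idea described in the
# abstract; all names and design choices here are hypothetical assumptions.
# Each query-document pair is first encoded by a shared transformer, then an
# LSTM runs over the pooled document vectors in their prior ranking order, so
# each document's score can depend on its neighbours in the ranked list.
import torch
import torch.nn as nn

class CrossDocReRanker(nn.Module):
    def __init__(self, hidden_dim: int = 256, num_heads: int = 4,
                 num_layers: int = 2):
        super().__init__()
        # Shared encoder over the token sequence of one query-document pair.
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=hidden_dim, nhead=num_heads, batch_first=True)
        self.pair_encoder = nn.TransformerEncoder(encoder_layer, num_layers)
        # LSTM over the *sequence of documents*, in prior ranking order,
        # providing the cross-document interaction the abstract describes.
        self.cross_doc_lstm = nn.LSTM(hidden_dim, hidden_dim,
                                      batch_first=True, bidirectional=True)
        self.scorer = nn.Linear(2 * hidden_dim, 1)

    def forward(self, pair_embeddings: torch.Tensor) -> torch.Tensor:
        # pair_embeddings: (num_docs, seq_len, hidden_dim), one row per
        # query-document pair, ordered by the previous ranking stage.
        encoded = self.pair_encoder(pair_embeddings)   # token-level encoding
        doc_vecs = encoded[:, 0, :]                    # first-token pooling
        # Treat the ranked document list as one sequence for the LSTM.
        contextual, _ = self.cross_doc_lstm(doc_vecs.unsqueeze(0))
        scores = self.scorer(contextual.squeeze(0)).squeeze(-1)
        return scores                                  # one score per document

# Usage: re-rank 50 candidate fact sentences for one question.
model = CrossDocReRanker()
pairs = torch.randn(50, 32, 256)   # 50 docs, 32 tokens each, dim 256
new_order = model(pairs).argsort(descending=True)

Because the LSTM consumes the documents in the order produced by an earlier ranking stage, the model can exploit prior ranking positions, matching the re-ranking setting the abstract mentions.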