Paper Title


Molecule Attention Transformer

Authors

Łukasz Maziarka, Tomasz Danel, Sławomir Mucha, Krzysztof Rataj, Jacek Tabor, Stanisław Jastrzębski

Abstract

Designing a single neural network architecture that performs competitively across a range of molecule property prediction tasks remains largely an open challenge, and its solution may unlock a widespread use of deep learning in the drug discovery industry. To move towards this goal, we propose Molecule Attention Transformer (MAT). Our key innovation is to augment the attention mechanism in Transformer using inter-atomic distances and the molecular graph structure. Experiments show that MAT performs competitively on a diverse set of molecular prediction tasks. Most importantly, with a simple self-supervised pretraining, MAT requires tuning of only a few hyperparameter values to achieve state-of-the-art performance on downstream tasks. Finally, we show that attention weights learned by MAT are interpretable from the chemical point of view.
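The augmented attention described above can be read as a weighted mix of standard scaled dot-product self-attention, a matrix derived from inter-atomic distances, and the molecular graph's adjacency matrix. The sketch below illustrates this idea with NumPy; the mixing weights `lambda_attn`, `lambda_dist`, `lambda_adj` and the distance kernel (a softmax over negated distances) are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def molecule_attention(Q, K, V, D, A,
                       lambda_attn=0.5, lambda_dist=0.25, lambda_adj=0.25):
    """Sketch of MAT-style attention: blend dot-product attention with
    an inter-atomic distance term D and the adjacency matrix A.
    The mixing weights and the distance kernel are assumed for illustration."""
    d_k = Q.shape[-1]
    attn = softmax(Q @ K.T / np.sqrt(d_k))   # standard self-attention weights
    dist = softmax(-D)                       # closer atoms get larger weight (assumed kernel)
    mix = lambda_attn * attn + lambda_dist * dist + lambda_adj * A
    return mix @ V                           # weighted sum of atom value vectors

# Toy molecule with 3 atoms and 4-dimensional atom features
rng = np.random.default_rng(0)
n, d = 3, 4
Q, K, V = rng.normal(size=(3, n, d))
D = np.array([[0., 1., 2.],
              [1., 0., 1.],
              [2., 1., 0.]])                 # inter-atomic distances
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])                 # molecular graph adjacency
out = molecule_attention(Q, K, V, D, A)
print(out.shape)  # (3, 4): one updated feature vector per atom
```

Setting `lambda_dist` and `lambda_adj` to zero recovers plain Transformer self-attention, which makes the role of the two structural terms easy to probe.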
