Paper Title
Attention-based Neural Bag-of-Features Learning for Sequence Data
Paper Authors
Paper Abstract
In this paper, we propose 2D-Attention (2DA), a generic attention formulation for sequence data, which acts as a complementary computation block that can detect and focus on relevant sources of information for the given learning objective. The proposed attention module is incorporated into the recently proposed Neural Bag-of-Features (NBoF) model to enhance its learning capacity. Since 2DA acts as a plug-in layer, injecting it into different computation stages of the NBoF model results in different 2DA-NBoF architectures, each of which possesses a unique interpretation. We conducted extensive experiments on financial forecasting, audio analysis, and medical diagnosis problems to benchmark the proposed formulations against existing methods, including the widely used Gated Recurrent Units. Our empirical analysis shows that the proposed attention formulations not only improve the performance of NBoF models but also make them resilient to noisy data.
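To make the "plug-in attention over 2D sequence representations" idea concrete, below is a minimal NumPy sketch of one plausible such block. It is an illustration under stated assumptions, not the paper's exact 2DA formulation: the learnable weight matrix W, its square (D, D) shape, the softmax axis, and the element-wise re-weighting are all assumptions introduced here for exposition.

```python
import numpy as np

def softmax(z, axis):
    """Numerically stable softmax along the given axis."""
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def two_d_attention(X, W, axis=1):
    """Hypothetical attention block over a 2D sequence representation.

    X    : (D, T) array -- D features observed at T time steps.
    W    : (D, D) learnable matrix (assumed shape, for illustration).
    axis : 0 to weight the feature dimensions, 1 to weight time steps.

    Returns a re-weighted array of the same shape as X, so the block
    can be dropped between existing layers -- the "plug-in" property
    the abstract attributes to 2DA.
    """
    A = softmax(W @ X, axis=axis)  # attention mask, same shape as X
    return A * X                   # element-wise emphasis of relevant entries

# Usage: emphasize informative time steps of a toy 4-feature sequence.
rng = np.random.default_rng(0)
D, T = 4, 10
X = rng.standard_normal((D, T))
W = 0.1 * rng.standard_normal((D, D))
Y = two_d_attention(X, W, axis=1)
print(Y.shape)  # (4, 10)
```

Because the output shape matches the input, such a block could in principle be inserted before or after different stages of an NBoF-style pipeline, which is consistent with the abstract's claim that different injection points yield different 2DA-NBoF architectures.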