Paper Title

Attentive Task Interaction Network for Multi-Task Learning

Paper Authors

Dimitrios Sinodinos, Narges Armanfard

Paper Abstract

Multitask learning (MTL) has recently gained a lot of popularity as a learning paradigm that can lead to improved per-task performance while also using fewer per-task model parameters compared to single-task learning. One of the biggest challenges regarding MTL networks involves how to share features across tasks. To address this challenge, we propose the Attentive Task Interaction Network (ATI-Net). ATI-Net employs knowledge distillation of the latent features for each task, then combines the feature maps to provide improved contextualized information to the decoder. This novel approach to introducing knowledge distillation into an attention-based multitask network outperforms state-of-the-art MTL baselines such as the standalone MTAN and PAD-Net, with roughly the same number of model parameters.
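The abstract describes two stages: per-task distillation of latent features from a shared backbone, followed by a combination of the task feature maps that gives each decoder contextualized information. Below is a minimal PyTorch-style sketch of how such a module could be wired up. It is an illustration based only on the abstract, not the authors' implementation: the class name `AttentiveTaskInteraction`, the per-task convolutional heads, and the sigmoid attention masks are all assumptions.

```python
import torch
import torch.nn as nn


class AttentiveTaskInteraction(nn.Module):
    """Illustrative sketch (not the authors' code): per-task heads produce
    intermediate latent features, which are pooled and gated by per-task
    attention masks before being passed to the task decoders."""

    def __init__(self, feat_ch: int, num_tasks: int):
        super().__init__()
        # Stage 1: one lightweight head per task that distills
        # task-specific latent features from the shared backbone output.
        self.task_heads = nn.ModuleList(
            nn.Conv2d(feat_ch, feat_ch, kernel_size=3, padding=1)
            for _ in range(num_tasks)
        )
        # Stage 2: per-task attention over the concatenation of all task
        # latents, producing a mask that gates each task's features.
        self.attention = nn.ModuleList(
            nn.Sequential(
                nn.Conv2d(feat_ch * num_tasks, feat_ch, kernel_size=1),
                nn.Sigmoid(),
            )
            for _ in range(num_tasks)
        )

    def forward(self, shared_feat: torch.Tensor) -> list[torch.Tensor]:
        # Task-specific latent features from the shared representation.
        latents = [head(shared_feat) for head in self.task_heads]
        # Pool all task latents so each task can attend to the others,
        # yielding contextualized features for its decoder.
        pooled = torch.cat(latents, dim=1)
        return [lat * attn(pooled) for lat, attn in zip(latents, self.attention)]


if __name__ == "__main__":
    module = AttentiveTaskInteraction(feat_ch=64, num_tasks=3)
    x = torch.randn(2, 64, 32, 32)   # shared backbone features
    outs = module(x)
    print([o.shape for o in outs])   # 3 tensors of shape (2, 64, 32, 32)
```

The key design point this sketch tries to capture is that each task's decoder input is not just its own distilled features but those features modulated by attention over every task's latents, which is one plausible reading of "combines the feature maps to provide improved contextualized information to the decoder."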
