Paper Title
Importance Driven Continual Learning for Segmentation Across Domains
Paper Authors
Paper Abstract
The ability of neural networks to continuously learn and adapt to new tasks while retaining prior knowledge is crucial for many applications. However, current neural networks tend to forget previously learned tasks when trained on new ones, i.e., they suffer from Catastrophic Forgetting (CF). The objective of Continual Learning (CL) is to alleviate this problem, which is particularly relevant for medical applications, where it may not be feasible to store and access previously used sensitive patient data. In this work, we propose a Continual Learning approach for brain segmentation, where a single network is consecutively trained on samples from different domains. We build upon an importance-driven approach and adapt it for medical image segmentation. In particular, we introduce learning rate regularization to prevent the loss of the network's knowledge. Our results demonstrate that directly restricting the adaptation of important network parameters clearly reduces Catastrophic Forgetting for segmentation across domains.
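To make the idea concrete, the sketch below shows one way importance-driven learning rate regularization can be realized in PyTorch. This is a minimal illustration, not the authors' implementation: the Memory Aware Synapses-style importance estimate (gradient magnitude of the squared output norm), the function names `estimate_importance` and `regularized_step`, and the scaling factor `lam` are assumptions introduced here for illustration.

```python
import torch

def estimate_importance(model, loader, device="cpu"):
    """Accumulate a per-parameter importance score (MAS-style):
    the average magnitude of the gradient of the squared L2 norm
    of the network output with respect to each parameter."""
    importance = {n: torch.zeros_like(p) for n, p in model.named_parameters()}
    model.eval()
    for x, _ in loader:
        model.zero_grad()
        out = model(x.to(device))
        out.pow(2).sum().backward()  # d/dw of ||f(x; w)||^2
        for n, p in model.named_parameters():
            if p.grad is not None:
                importance[n] += p.grad.abs() / len(loader)
    return importance

def regularized_step(model, base_lr, importance, lam=1.0):
    """SGD step with an element-wise learning rate that shrinks
    for parameters deemed important for previously seen domains,
    directly restricting their adaptation."""
    with torch.no_grad():
        for n, p in model.named_parameters():
            if p.grad is None:
                continue
            # lr is a tensor: high importance -> small effective step.
            lr = base_lr / (1.0 + lam * importance[n])
            p -= lr * p.grad
```

A typical usage under these assumptions would be to train normally on the first domain, call `estimate_importance` on that domain's data, and then replace the optimizer step with `regularized_step` while training on the next domain, so that parameters important for earlier domains change only slowly.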