Paper title
Memory Population in Continual Learning via Outlier Elimination
Paper authors
Paper abstract
Catastrophic forgetting, the phenomenon of forgetting previously learned tasks when learning a new one, is a major hurdle in developing continual learning algorithms. A popular method to alleviate forgetting is to use a memory buffer, which stores a subset of previously learned task examples for use during training on new tasks. The de facto method of filling memory is by randomly selecting previous examples. However, this process could introduce outliers or noisy samples that could hurt the generalization of the model. This paper introduces Memory Outlier Elimination (MOE), a method for identifying and eliminating outliers in the memory buffer by choosing samples from label-homogeneous subpopulations. We show that a space with a high homogeneity is related to a feature space that is more representative of the class distribution. In practice, MOE removes a sample if it is surrounded by samples from different labels. We demonstrate the effectiveness of MOE on CIFAR-10, CIFAR-100, and CORe50, outperforming previous well-known memory population methods.
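The abstract's core rule ("MOE removes a sample if it is surrounded by samples from different labels") can be illustrated with a small sketch. This is not the paper's implementation; it is a minimal k-nearest-neighbor interpretation of label homogeneity, where the function name `moe_filter`, the neighbor count `k`, and the `threshold` parameter are all assumptions for illustration.

```python
import numpy as np

def moe_filter(features, labels, k=3, threshold=0.5):
    """Keep samples whose k nearest neighbors are label-homogeneous.

    features: (N, D) array of feature embeddings
    labels:   (N,) array of class labels
    A sample is kept if at least `threshold` of its k nearest neighbors
    (Euclidean distance) share its label; otherwise it is treated as an
    outlier and dropped from the memory buffer.

    NOTE: illustrative sketch only, not the authors' implementation.
    """
    # Pairwise squared Euclidean distances between all samples
    d2 = ((features[:, None, :] - features[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)            # a sample is not its own neighbor
    nn = np.argsort(d2, axis=1)[:, :k]      # indices of k nearest neighbors
    # Fraction of neighbors that agree with each sample's own label
    same = (labels[nn] == labels[:, None]).mean(axis=1)
    return same >= threshold

# Toy example: two 1-D clusters; the last point is labeled 1
# but sits inside cluster 0, so it is flagged as an outlier.
feats = np.array([[0.0], [0.1], [0.2], [0.3], [5.0], [5.1], [5.2], [0.15]])
labs = np.array([0, 0, 0, 0, 1, 1, 1, 1])
keep = moe_filter(feats, labs, k=3, threshold=0.5)
```

Here `keep` is a boolean mask that is `True` for the seven consistent points and `False` for the mislabeled one; a memory buffer would then be populated only from the retained samples.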