Paper Title


Storing Encoded Episodes as Concepts for Continual Learning

Authors

Ali Ayub, Alan R. Wagner

Abstract


The two main challenges faced by continual learning approaches are catastrophic forgetting and memory limitations on the storage of data. To cope with these challenges, we propose a novel, cognitively-inspired approach which trains autoencoders with Neural Style Transfer to encode and store images. Reconstructed images from encoded episodes are replayed when training the classifier model on a new task to avoid catastrophic forgetting. The loss function for the reconstructed images is weighted to reduce its effect during classifier training to cope with image degradation. When the system runs out of memory, the encoded episodes are converted into centroids and covariance matrices, which are used to generate pseudo-images during classifier training, keeping classifier performance stable with less memory. Our approach increases classification accuracy by 13-17% over state-of-the-art methods on benchmark datasets, while requiring 78% less storage space.
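The memory-compaction step described above (collapsing stored encoded episodes into a centroid and covariance matrix, then sampling pseudo-representations from them) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the latent dimensionality, episode count, and sample count are arbitrary, and in the actual method the sampled pseudo-latents would be passed through the autoencoder's decoder to produce pseudo-images.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical encoded episodes for one class: latent vectors produced by
# the autoencoder (200 episodes in an 8-D latent space, purely illustrative).
latents = rng.normal(size=(200, 8))

# When the system runs out of memory, the stored episodes for a class are
# replaced by their centroid and covariance matrix.
centroid = latents.mean(axis=0)
covariance = np.cov(latents, rowvar=False)

# During later classifier training, pseudo-latents are drawn from the fitted
# Gaussian; decoding them would yield the pseudo-images used for replay.
pseudo_latents = rng.multivariate_normal(centroid, covariance, size=50)
print(pseudo_latents.shape)  # (50, 8)
```

Storing only a mean vector and covariance matrix per class is what allows memory usage to stay bounded as the number of seen episodes grows.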
