Prototype augmentation and self-supervision for incremental learning
Fei Zhu; Xu-Yao Zhang; Chuang Wang; Fei Yin; Cheng-Lin Liu
2021
Conference Date: June 19-25, 2021
Conference Location: Online (Nashville, United States)
Abstract

Despite impressive performance on many individual tasks, deep neural networks suffer from catastrophic forgetting when learning new tasks incrementally. Various incremental learning methods have recently been proposed, and some achieve acceptable performance by relying on stored data or complex generative models. However, storing data from previous tasks is limited by memory or privacy concerns, and generative models are usually unstable and inefficient to train. In this paper, we propose a simple non-exemplar-based method named PASS to address the catastrophic forgetting problem in incremental learning. On the one hand, we propose to memorize one class-representative prototype for each old class and adopt prototype augmentation (protoAug) in the deep feature space to maintain the decision boundaries of previous tasks. On the other hand, we employ self-supervised learning (SSL) to learn features that are more generalizable and transferable to other tasks, which demonstrates the effectiveness of SSL in incremental learning. Experimental results on benchmark datasets show that our approach significantly outperforms non-exemplar-based methods and achieves performance comparable to exemplar-based approaches.
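As a rough illustration of the protoAug idea described above, here is a minimal NumPy sketch: one class-mean prototype is stored per old class, and pseudo-features for old classes are sampled around each prototype with isotropic Gaussian noise. The helper names (`class_prototypes`, `proto_aug`) and the scalar `radius` parameterization are assumptions for illustration, not the paper's reference implementation.

```python
import numpy as np

def class_prototypes(features, labels):
    """One class-mean prototype per class, computed in the deep feature space."""
    return {c: features[labels == c].mean(axis=0) for c in np.unique(labels)}

def proto_aug(prototypes, radius, n_per_class, rng=None):
    """Sample pseudo-features for old classes: prototype + scaled Gaussian noise.

    `radius` scales the isotropic noise; deriving it from the average feature
    variance of earlier tasks is one plausible choice (an assumption here).
    """
    rng = rng or np.random.default_rng()
    feats, labs = [], []
    for c, proto in prototypes.items():
        noise = rng.standard_normal((n_per_class, proto.shape[0]))
        feats.append(proto + radius * noise)   # augmented old-class features
        labs.append(np.full(n_per_class, c))   # matching class labels
    return np.concatenate(feats), np.concatenate(labs)
```

In an incremental step, such augmented old-class features would be mixed with the current task's real features when training the classifier head, which is how prototype augmentation can preserve old decision boundaries without storing raw exemplars.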

Content Type: Conference Paper
Source URL: http://ir.ia.ac.cn/handle/173211/47480
Collection: Institute of Automation, National Laboratory of Pattern Recognition, Pattern Analysis and Learning Group
Author Affiliation: Institute of Automation, Chinese Academy of Sciences
Recommended Citation (GB/T 7714):
Fei Zhu, Xu-Yao Zhang, Chuang Wang, et al. Prototype augmentation and self-supervision for incremental learning[C]. Online (Nashville, United States), June 19-25, 2021.