Few-Shot Learning via Feature Hallucination with Variational Inference
Luo QX(罗沁轩)1,3; Wang LF(汪凌峰)1,2; Lv JG(吕京国)4; Xiang SM(向世明)1,3; Pan CH(潘春洪)1
2021-01
Conference Date | 2021-01
Conference Venue | Online
Abstract | Deep learning has achieved huge success in the field of artificial intelligence, but its performance depends heavily on labeled data. Few-shot learning aims to make a model adapt rapidly to unseen classes with few labeled samples after training on a base dataset, which is useful for tasks lacking labeled data, such as medical image processing. Since the core problem of few-shot learning is the lack of samples, a straightforward solution is data augmentation. This paper proposes a generative model (VI-Net) built on a cosine-classifier baseline. Specifically, we construct a framework that learns to define a generating space for each category in the latent space from a few support samples. New feature vectors can then be generated to sharpen the classifier's decision boundary during fine-tuning. To evaluate the effectiveness of the proposed approach, we conduct comparative experiments and ablation studies on mini-ImageNet and CUB. Experimental results show that VI-Net improves performance over the baseline and achieves state-of-the-art results among augmentation-based methods.
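The abstract's core idea (sample synthetic feature vectors from a per-class distribution learned from a few support samples, then classify with a cosine classifier) can be illustrated with a minimal sketch. This is not the paper's VI-Net: the helper names, the diagonal-Gaussian hallucinator, and the toy episode below are illustrative assumptions, standing in for the learned variational generator described in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def hallucinate_features(support, n_new, rng):
    """Fit a diagonal Gaussian to one class's few support features and
    sample synthetic feature vectors (reparameterization: mu + sigma * eps)."""
    mu = support.mean(axis=0)
    sigma = support.std(axis=0) + 1e-6
    eps = rng.standard_normal((n_new, support.shape[1]))
    return mu + sigma * eps

def cosine_scores(x, prototypes):
    """Cosine-similarity classifier: score each feature vector against
    L2-normalized per-class prototypes."""
    x = x / np.linalg.norm(x, axis=1, keepdims=True)
    p = prototypes / np.linalg.norm(prototypes, axis=1, keepdims=True)
    return x @ p.T

# Toy 2-way 5-shot episode with 16-dimensional features.
d, shots = 16, 5
support_a = rng.normal(1.0, 0.5, (shots, d))
support_b = rng.normal(-1.0, 0.5, (shots, d))

# Augment each class's support set with hallucinated features.
aug_a = np.vstack([support_a, hallucinate_features(support_a, 20, rng)])
aug_b = np.vstack([support_b, hallucinate_features(support_b, 20, rng)])

# Build class prototypes from the augmented sets and classify a query
# feature drawn near class A's distribution.
prototypes = np.stack([aug_a.mean(axis=0), aug_b.mean(axis=0)])
query = rng.normal(1.0, 0.5, (1, d))
pred = cosine_scores(query, prototypes).argmax(axis=1)
print(pred)
```

The augmented support set gives the classifier more points per class than the raw shots, which is the mechanism the paper uses to tighten the decision boundary during fine-tuning.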
Language | English
Content Type | Conference Paper
Source URL | [http://ir.ia.ac.cn/handle/173211/44310]
Collection | Institute of Automation, National Laboratory of Pattern Recognition, Remote Sensing Image Processing Team
Author Affiliations | 1. NLPR, Institute of Automation, Chinese Academy of Sciences; 2. Key Laboratory of Knowledge Automation for Industrial Processes, Ministry of Education; 3. School of Artificial Intelligence, University of Chinese Academy of Sciences; 4. School of Geomatics and Urban Spatial Informatics, Beijing University of Civil Engineering and Architecture
Recommended Citation (GB/T 7714) | Luo QX, Wang LF, Lv JG, et al. Few-Shot Learning via Feature Hallucination with Variational Inference[C]. Online conference, 2021-01.
Unless otherwise stated, all content in this system is protected by copyright, and all rights are reserved.