Information-density Masking Strategy for Masked Image Modeling
He, Zhu (3,4); Yang, Chen (4); Guyue, Hu (2); Shan, Yu (1,3,4)
2023-07
Conference Date: 2023-7-9
Conference Venue: Brisbane, Australia
Abstract (English)

Recent representation learning approaches mainly fall into two paradigms: contrastive learning (CL) and masked image modeling (MIM). Combining the two may boost performance, but the learning process still depends heavily on a random masking strategy. We conjecture that random masking may hinder learning the comprehensive relationships between concepts and visual patches. To overcome these limitations, we propose an information-density masking (IDM) strategy for general vision transformers. Specifically, IDM masks out visual patches according to the activation values of their attention maps. To obtain the attention maps before reconstruction, we further propose a self-supervised training framework, CAMAE. In addition, to reduce the redundancy among different attention maps, we introduce pattern-learning balance (PLB) sampling to adaptively adjust the learning progress across different attention spaces. Extensive experiments indicate that our method efficiently retains more comprehensive visual characteristics and achieves state-of-the-art performance.
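The paper's full method is not reproduced on this page. As an illustration only, the sketch below shows one way attention-guided patch masking could look in PyTorch: patches are ranked by a per-patch attention activation and the highest-activation ones are masked. All names (`information_density_mask`, `attn_scores`, `mask_ratio`) and the choice to mask the most-activated patches are assumptions for the sketch, not details taken from the paper.

```python
import torch

def information_density_mask(attn_scores: torch.Tensor,
                             mask_ratio: float = 0.75) -> torch.Tensor:
    """Sketch of attention-guided masking (hypothetical, not the paper's exact rule).

    attn_scores: (B, N) per-patch activation values taken from an attention map.
    Returns a boolean mask of shape (B, N), where True marks a masked patch.
    """
    B, N = attn_scores.shape
    num_masked = int(N * mask_ratio)
    # Rank patches by activation; masking the most-activated patches forces
    # the model to reconstruct information-dense regions.
    idx = attn_scores.argsort(dim=1, descending=True)
    mask = torch.zeros(B, N, dtype=torch.bool)
    mask.scatter_(1, idx[:, :num_masked], True)
    return mask

# Example: batch of 2 images, 196 patches (a 14x14 ViT-B/16 grid)
scores = torch.rand(2, 196)
mask = information_density_mask(scores, mask_ratio=0.75)
print(mask.shape, mask.sum(dim=1))  # torch.Size([2, 196]) tensor([147, 147])
```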

Proceedings Publisher: The IEEE International Conference on Multimedia & Expo (ICME)
Content Type: Conference Paper
Source URL: http://ir.ia.ac.cn/handle/173211/52165
Collection: Institute of Automation, Brainnetome Center
Corresponding Author: He, Zhu
Affiliations:
1. School of Artificial Intelligence, University of Chinese Academy of Sciences (UCAS)
2. School of Computer Science and Engineering, Nanyang Technological University
3. School of Future Technology, University of Chinese Academy of Sciences (UCAS)
4. Brainnetome Center, National Laboratory of Pattern Recognition (NLPR), Institute of Automation, Chinese Academy of Sciences (CASIA)
Recommended Citation
GB/T 7714
He, Zhu, Yang, Chen, Guyue, Hu, et al. Information-density Masking Strategy for Masked Image Modeling[C]. In: The IEEE International Conference on Multimedia & Expo (ICME). Brisbane, Australia, 2023-7-9.