ProxyMix: Proxy-based Mixup training with label refinery for source-free domain adaptation
Ding, Yuhe [5]; Sheng, Lijun [1,2,3]; Liang, Jian [1,2,6]; Zheng, Aihua [4]; He, Ran [1,2,6]
Journal: NEURAL NETWORKS
Date: 2023-10-01
Volume: 167, Pages: 92-103
Keywords: Source-free unsupervised domain adaptation; Pseudo labeling
ISSN: 0893-6080
DOI: 10.1016/j.neunet.2023.08.005
Corresponding author: Liang, Jian (liangjian92@gmail.com)
Abstract: Due to privacy concerns and data transmission issues, Source-free Unsupervised Domain Adaptation (SFDA) has gained popularity. It exploits pre-trained source models, rather than raw source data, for target learning, transferring knowledge from a labeled source domain to an unlabeled target domain. Existing methods typically solve this problem with additional parameters or noisy pseudo labels, and we propose an effective method named Proxy-based Mixup training with label refinery (ProxyMix) to avoid these drawbacks. To avoid additional parameters and leverage information in the source model, ProxyMix defines the classifier weights as class prototypes and creates a class-balanced proxy source domain from the nearest neighbors of the prototypes. To improve the reliability of pseudo labels, we further propose a frequency-weighted aggregation strategy that generates soft pseudo labels for unlabeled target data. Our strategy exploits the internal structure of target features, increases the weights of low-frequency class samples, and aligns the proxy and target domains using inter- and intra-domain mixup regularization, which mitigates the negative impact of noisy labels. Experiments on three 2D image and 3D point cloud object recognition benchmarks demonstrate that ProxyMix yields state-of-the-art performance for source-free UDA tasks. (c) 2023 Elsevier Ltd. All rights reserved.
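The three ideas in the abstract (classifier weights as prototypes feeding a class-balanced proxy source domain, frequency-weighted soft pseudo labels, and inter-domain mixup) can be sketched roughly as follows. This is a minimal NumPy illustration of the general techniques, not the paper's implementation; all shapes, variable names, and the Beta(0.3, 0.3) mixup coefficient are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (hypothetical): 100 target features, 3 classes, 8-dim features.
feats = rng.normal(size=(100, 8))
feats /= np.linalg.norm(feats, axis=1, keepdims=True)
W = rng.normal(size=(3, 8))                        # classifier weights = class prototypes
protos = W / np.linalg.norm(W, axis=1, keepdims=True)

# 1) Class-balanced proxy source domain: the K target samples
#    nearest each prototype, the same K for every class.
K = 5
sim = feats @ protos.T                             # cosine similarity, shape (100, 3)
proxy_idx = np.argsort(-sim, axis=0)[:K]           # top-K indices per class, shape (K, 3)

# 2) Frequency-weighted soft pseudo labels: softmax over prototype
#    similarities, up-weighted for low-frequency classes.
probs = np.exp(sim) / np.exp(sim).sum(axis=1, keepdims=True)
freq = probs.sum(axis=0)                           # estimated per-class frequency
weighted = probs / freq                            # inverse-frequency reweighting
soft_labels = weighted / weighted.sum(axis=1, keepdims=True)

# 3) Inter-domain mixup between a proxy-source sample (hard label)
#    and a target sample (soft pseudo label).
lam = rng.beta(0.3, 0.3)                           # mixup coefficient in (0, 1)
xs, ys = feats[proxy_idx[0, 0]], np.eye(3)[0]      # proxy sample, class-0 one-hot label
xt, yt = feats[42], soft_labels[42]                # target sample, soft pseudo label
x_mix = lam * xs + (1 - lam) * xt
y_mix = lam * ys + (1 - lam) * yt
```

In this sketch the mixed pair `(x_mix, y_mix)` would then be used as a training example, so supervision from the proxy source domain smooths the noisy target pseudo labels rather than being trusted outright.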
Funding projects: National Natural Science Foundation of China [62276256]; Beijing Nova Program, China [Z211100002121108]; University Synergy Innovation Program of Anhui Province [GXXT-2022-036]
WOS research areas: Computer Science; Neurosciences & Neurology
Language: English
Publisher: PERGAMON-ELSEVIER SCIENCE LTD
WOS accession number: WOS:001068310300001
Funding agencies: National Natural Science Foundation of China; Beijing Nova Program, China; University Synergy Innovation Program of Anhui Province
Content type: Journal article
Source URL: http://ir.ia.ac.cn/handle/173211/53089
Collection: State Key Laboratory of Multimodal Artificial Intelligence Systems
Author affiliations:
1. Chinese Acad Sci CASIA, Inst Automat, Ctr Res Intelligent Percept & Comp, Beijing, Peoples R China
2. Chinese Acad Sci CASIA, State Key Lab Multimodal Artificial Intelligence Syst, Beijing, Peoples R China
3. Univ Sci & Technol China, Hefei, Peoples R China
4. Anhui Univ, Sch Artificial Intelligence, Hefei, Peoples R China
5. Anhui Univ, Sch Comp Sci & Technol, Hefei, Peoples R China
6. Univ Chinese Acad Sci, Sch Artificial Intelligence, Beijing, Peoples R China
Recommended citation:
GB/T 7714
Ding, Yuhe, Sheng, Lijun, Liang, Jian, et al. ProxyMix: Proxy-based Mixup training with label refinery for source-free domain adaptation[J]. NEURAL NETWORKS, 2023, 167: 92-103.
APA: Ding, Yuhe, Sheng, Lijun, Liang, Jian, Zheng, Aihua, & He, Ran. (2023). ProxyMix: Proxy-based Mixup training with label refinery for source-free domain adaptation. NEURAL NETWORKS, 167, 92-103.
MLA: Ding, Yuhe, et al. "ProxyMix: Proxy-based Mixup training with label refinery for source-free domain adaptation". NEURAL NETWORKS 167 (2023): 92-103.