VLP: A Survey on Vision-language Pre-training
Author: Fei-Long Chen 2,3
Journal: Machine Intelligence Research
Year: 2023
Volume: 20, Issue: 1, Pages: 38-56
Keywords: Vision and language pre-training, transformers, multimodal learning, representation learning
ISSN: 2731-538X
DOI: 10.1007/s11633-022-1369-5
Abstract: In the past few years, the emergence of pre-training models has brought uni-modal fields such as computer vision (CV) and natural language processing (NLP) to a new era. Substantial work has shown that these models are beneficial for downstream uni-modal tasks and avoid the need to train a new model from scratch. So can such pre-trained models be applied to multi-modal tasks? Researchers have explored this problem and made significant progress. This paper surveys recent advances and new frontiers in vision-language pre-training (VLP), including image-text and video-text pre-training. To give readers a better overall grasp of VLP, we first review its recent advances in five aspects: feature extraction, model architecture, pre-training objectives, pre-training datasets, and downstream tasks. Then, we summarize the specific VLP models in detail. Finally, we discuss the new frontiers in VLP. To the best of our knowledge, this is the first survey focused on VLP. We hope that this survey can shed light on future research in the VLP field.
Content type: Journal article
Source URL: http://ir.ia.ac.cn/handle/173211/50899
Collection: Institute of Automation_Academic Journals_International Journal of Automation and Computing
Author affiliations:
1. School of Artificial Intelligence, University of Chinese Academy of Sciences, Beijing 100049, China
2. School of Future Technology, University of Chinese Academy of Sciences, Beijing 100049, China
3. Institute of Automation, Chinese Academy of Sciences, Beijing 100190, China
Recommended citation:
GB/T 7714
Fei-Long Chen. VLP: A Survey on Vision-language Pre-training[J]. Machine Intelligence Research, 2023, 20(1): 38-56.
APA: Fei-Long Chen. (2023). VLP: A Survey on Vision-language Pre-training. Machine Intelligence Research, 20(1), 38-56.
MLA: Fei-Long Chen. "VLP: A Survey on Vision-language Pre-training." Machine Intelligence Research 20.1 (2023): 38-56.