Distributed stochastic gradient tracking methods with momentum acceleration for non-convex optimization
Gao, Juan2; Liu, Xin-Wei3; Dai, Yu-Hong1,4; Huang, Yakui3; Gu, Junhua2
Journal | COMPUTATIONAL OPTIMIZATION AND APPLICATIONS
Publication Date | 2022-11-23
Pages | 42
Keywords | Distributed non-convex optimization ; Machine learning ; Momentum methods ; Optimization algorithms ; Convergence rate
ISSN | 0926-6003
DOI | 10.1007/s10589-022-00432-5 |
Abstract | We consider a distributed non-convex optimization problem of minimizing the sum of all local cost functions over a network of agents. This problem often appears in large-scale distributed machine learning, known as non-convex empirical risk minimization. In this paper, we propose two accelerated algorithms, named DSGT-HB and DSGT-NAG, which combine the distributed stochastic gradient tracking (DSGT) method with momentum acceleration techniques. Under appropriate assumptions, we prove that both algorithms sublinearly converge to a neighborhood of a first-order stationary point of the distributed non-convex optimization problem. Moreover, we derive the conditions under which DSGT-HB and DSGT-NAG achieve a network-independent linear speedup. Numerical experiments on a distributed non-convex logistic regression problem with real data sets and on a deep neural network trained on the MNIST database show the superiority of DSGT-HB and DSGT-NAG over DSGT.
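The momentum techniques named in the abstract (heavy-ball and Nesterov acceleration) can be illustrated with a minimal single-agent sketch. This is an assumption-laden toy example, not the paper's DSGT-HB/DSGT-NAG algorithms, which additionally combine these momentum terms with stochastic gradient tracking and averaging across a network of agents; all function names, step sizes, and the toy objective below are illustrative choices.

```python
import numpy as np

# Illustrative sketch only: classical heavy-ball (HB) and Nesterov (NAG)
# momentum updates on a toy quadratic f(x) = 0.5 * ||x||^2. NOT the paper's
# distributed DSGT-HB/DSGT-NAG, which also track and average gradients
# over a network of agents.

def grad(x):
    """Gradient of f(x) = 0.5 * ||x||^2."""
    return x

def heavy_ball(x0, lr=0.1, beta=0.9, steps=200):
    # HB adds a multiple of the previous displacement to the gradient step.
    x_prev, x = x0.copy(), x0.copy()
    for _ in range(steps):
        x_next = x - lr * grad(x) + beta * (x - x_prev)
        x_prev, x = x, x_next
    return x

def nesterov(x0, lr=0.1, beta=0.9, steps=200):
    # NAG evaluates the gradient at a look-ahead point x + beta * v.
    x, v = x0.copy(), np.zeros_like(x0)
    for _ in range(steps):
        v = beta * v - lr * grad(x + beta * v)
        x = x + v
    return x

x0 = np.array([5.0, -3.0])
print(np.linalg.norm(heavy_ball(x0)))  # both drive x toward the minimizer 0
print(np.linalg.norm(nesterov(x0)))
```

In the paper's distributed setting, each agent runs an update of this flavor on stochastic gradients of its local cost, plus a gradient-tracking correction exchanged with its network neighbors.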
Funding | National Natural Science Foundation of China [12071108 ; 11671116 ; 12021001 ; 11991021 ; 11991020 ; 11971372 ; 11701137] ; Strategic Priority Research Program of CAS [XDA27000000]
WOS Research Areas | Operations Research & Management Science ; Mathematics
Language | English
Publisher | SPRINGER
WOS Record | WOS:000886800900001
Content Type | Journal Article
Source URL | http://ir.amss.ac.cn/handle/2S8OKBNM/60592
Collection | Academy of Mathematics and Systems Science, Chinese Academy of Sciences
Corresponding Author | Liu, Xin-Wei
Affiliations | 1. Univ Chinese Acad Sci, Sch Math Sci, Beijing 100049, Peoples R China
2. Hebei Univ Technol, Sch Artificial Intelligence, Tianjin 300401, Peoples R China
3. Hebei Univ Technol, Inst Math, Tianjin 300401, Peoples R China
4. Chinese Acad Sci, Acad Math & Syst Sci, ICMSEC, LSEC, Beijing 100190, Peoples R China
Recommended Citation (GB/T 7714) | Gao, Juan, Liu, Xin-Wei, Dai, Yu-Hong, et al. Distributed stochastic gradient tracking methods with momentum acceleration for non-convex optimization[J]. COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2022: 42.
APA | Gao, Juan, Liu, Xin-Wei, Dai, Yu-Hong, Huang, Yakui, & Gu, Junhua. (2022). Distributed stochastic gradient tracking methods with momentum acceleration for non-convex optimization. COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 42.
MLA | Gao, Juan, et al. "Distributed stochastic gradient tracking methods with momentum acceleration for non-convex optimization". COMPUTATIONAL OPTIMIZATION AND APPLICATIONS (2022): 42.