Learning deep representations via extreme learning machines
Yu, Wenchao (1,2); Zhuang, Fuzhen (1,2); He, Qing (2); Shi, Zhongzhi (2)
Journal: NEUROCOMPUTING
Date: 2015-02-03
Volume: 149, Pages: 308-315
Keywords: Extreme learning machine; Deep learning; Representation learning; Stacked ELMs; Stacked generalization; DrELM
ISSN: 0925-2312
DOI: 10.1016/j.neucom.2014.03.077
Abstract: Extreme learning machine (ELM) as an emerging technology has achieved exceptional performance in large-scale settings, and is well suited to binary and multi-class classification, as well as regression tasks. However, existing ELM and its variants predominantly employ single hidden layer feedforward networks, leaving the popular and potentially powerful stacked generalization principle unexploited for seeking predictive deep representations of input data. Deep architectures can find higher-level representations, and thus can potentially capture relevant higher-level abstractions. However, most current deep learning methods require solving a difficult and non-convex optimization problem. In this paper, we propose a stacked model, DrELM, to learn deep representations via extreme learning machine according to the stacked generalization philosophy. The proposed model utilizes ELM as a base building block and incorporates random shift and kernelization as stacking elements. Specifically, in each layer, DrELM integrates a random projection of the predictions obtained by ELM into the original feature, and then applies kernel functions to generate the resultant feature. To verify the classification and regression performance of DrELM, we conduct experiments on both synthetic and real-world data sets. The experimental results show that DrELM outperforms ELM and kernel ELMs, which appears to demonstrate that DrELM can yield predictive features suitable for prediction tasks. The performance of deep models (i.e., stacked auto-encoders) is comparable; however, due to the utilization of ELM, DrELM is easier to learn and faster in testing. (C) 2014 Elsevier B.V. All rights reserved.
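The layer-wise construction described in the abstract can be made concrete with a short sketch. The Python code below is only an illustrative reading of that description: each layer trains an ELM, randomly projects its predictions back into the input space, adds the result to the original features (the "random shift"), and applies a nonlinear map before training the next layer. The layer sizes, number of layers, scaling factor `alpha`, and the use of tanh as the nonlinearity are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

def elm_fit(X, T, n_hidden, rng):
    """Train a single-hidden-layer ELM: random hidden weights, least-squares output weights.
    T is expected to be a 2-D target matrix (e.g. one-hot labels or regression outputs)."""
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)               # random hidden-layer feature map
    beta = np.linalg.pinv(H) @ T         # closed-form output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

def drelm_fit(X, T, n_layers=3, n_hidden=100, alpha=0.1, seed=0):
    """Stack ELM layers as sketched above; returns per-layer parameters and the final ELM.
    Hyper-parameter values here are illustrative assumptions."""
    rng = np.random.default_rng(seed)
    layers, Xi = [], X
    for _ in range(n_layers):
        W, b, beta = elm_fit(Xi, T, n_hidden, rng)
        O = elm_predict(Xi, W, b, beta)                 # this layer's predictions
        R = rng.normal(size=(O.shape[1], X.shape[1]))   # random projection of the predictions
        Xi = np.tanh(X + alpha * (O @ R))               # shifted, nonlinearly mapped features
        layers.append((W, b, beta, R))
    final = elm_fit(Xi, T, n_hidden, rng)               # last ELM on the deepest representation
    return layers, final

def drelm_predict(X, layers, final, alpha=0.1):
    Xi = X
    for W, b, beta, R in layers:
        O = elm_predict(Xi, W, b, beta)
        Xi = np.tanh(X + alpha * (O @ R))
    return elm_predict(Xi, *final)

# Usage sketch: a toy binary classification problem with one-hot targets.
X = np.random.default_rng(1).normal(size=(200, 5))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
T = np.eye(2)[y]
layers, final = drelm_fit(X, T)
pred = drelm_predict(X, layers, final).argmax(axis=1)
```

Because each ELM layer is solved in closed form and only the random projection is added between layers, the whole stack avoids the non-convex optimization that the abstract attributes to other deep learning methods.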
Funding: National Natural Science Foundation of China [61175052, 61203297]; National Natural Science Foundation of China [60933004]; National Natural Science Foundation of China [61035003]; National High-tech R&D Program of China (863 Program) [2013AA01A606]; National High-tech R&D Program of China (863 Program) [2012AA011003]; National Program on Key Basic Research Project (973 Program) [2013CB329502]
WOS Research Area: Computer Science
Language: English
Publisher: ELSEVIER SCIENCE BV
WOS Accession Number: WOS:000360028800037
Content Type: Journal Article
Source URL: http://119.78.100.204/handle/2XEOYT63/9398
Collection: Institute of Computing Technology, Chinese Academy of Sciences: Journal Papers (English)
Corresponding Author: Yu, Wenchao
Affiliations:
1. Univ Chinese Acad Sci, Beijing 100049, Peoples R China
2. Chinese Acad Sci, Key Lab Intelligent Informat Proc, Inst Comp Technol, Beijing 100190, Peoples R China
Recommended Citation
GB/T 7714
Yu, Wenchao, Zhuang, Fuzhen, He, Qing, et al. Learning deep representations via extreme learning machines[J]. NEUROCOMPUTING, 2015, 149: 308-315.
APA: Yu, Wenchao, Zhuang, Fuzhen, He, Qing, & Shi, Zhongzhi. (2015). Learning deep representations via extreme learning machines. NEUROCOMPUTING, 149, 308-315.
MLA: Yu, Wenchao, et al. "Learning deep representations via extreme learning machines". NEUROCOMPUTING 149 (2015): 308-315.