STT-RAM Buffer Design for Precision-Tunable General-Purpose Neural Network Accelerator
Song, Lili1,2; Wang, Ying1,2; Han, Yinhe1,2; Li, Huawei1,2; Cheng, Yuanqing1,2; Li, Xiaowei1,2
Journal: IEEE TRANSACTIONS ON VERY LARGE SCALE INTEGRATION (VLSI) SYSTEMS
Date: 2017-04-01
Volume: 25, Issue: 4, Pages: 1285-1296
Keywords: Approximate computing; machine learning; neural network; spin torque transfer RAM (STT-RAM)
ISSN: 1063-8210
DOI: 10.1109/TVLSI.2016.2644279
Abstract: Multilevel spin torque transfer RAM (STT-RAM) is a suitable storage device for energy-efficient neural network accelerators (NNAs), which rely on large-capacity on-chip memory to support brain-inspired large-scale learning models, from conventional artificial neural networks to today's popular deep convolutional neural networks. In this paper, we investigate the application of multilevel STT-RAM to general-purpose NNAs. First, the error-resilience feature of neural networks is leveraged to tolerate the read/write reliability issues of multilevel cell STT-RAM using approximate computing. The read/write failures incurred in exchange for higher storage density can be effectively masked by a wide spectrum of NN applications with intrinsic error forgiveness. Second, we present a precision-tunable STT-RAM buffer for the popular general-purpose NNA. The targeted STT-RAM memory design can switch between multiple working modes and adapt to the varying quality constraints of approximate applications. Lastly, the reconfigurable STT-RAM buffer not only enables precision scaling in the NNA but also adapts to the demands of different learning models with distinct working-set sizes. In particular, we demonstrate the concept of capacity/precision-tunable STT-RAM memory with the emerging reconfigurable deep NNA and elaborate on the data mapping and storage mode switching policy in STT-RAM memory to achieve the best energy efficiency of approximate computing.
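The abstract's central claim, that NN inference tolerates bit errors from dense multilevel STT-RAM cells, can be illustrated with a small sketch. The snippet below is not the paper's design; it is a hypothetical model that stores quantized weights as 8-bit codes, flips each stored bit with a given bit-error rate (standing in for MLC STT-RAM read/write failures), and measures how far a dot product drifts from the error-free result. The quantization scale, bit width, and error model are all illustrative assumptions.

```python
import random

BITS = 8       # assumed weight storage width
SCALE = 0.01   # assumed fixed-point quantization step

def quantize(w, bits=BITS, scale=SCALE):
    # Map a float weight to a signed fixed-point code, stored as an
    # unsigned bit pattern (two's complement in `bits` bits).
    q = max(-(1 << (bits - 1)), min((1 << (bits - 1)) - 1, round(w / scale)))
    return q & ((1 << bits) - 1)

def dequantize(code, bits=BITS, scale=SCALE):
    # Interpret the stored bit pattern back as a signed fixed-point value.
    if code >= 1 << (bits - 1):
        code -= 1 << bits
    return code * scale

def inject_bit_errors(code, ber, bits=BITS, rng=random):
    # Flip each stored bit independently with probability `ber`,
    # a toy stand-in for MLC STT-RAM read/write failures.
    for b in range(bits):
        if rng.random() < ber:
            code ^= 1 << b
    return code

def noisy_dot(weights, xs, ber, rng):
    # Dot product where every weight passes through the faulty buffer model.
    acc = 0.0
    for w, xi in zip(weights, xs):
        code = inject_bit_errors(quantize(w), ber, rng=rng)
        acc += dequantize(code) * xi
    return acc

if __name__ == "__main__":
    rng = random.Random(42)
    weights = [rng.uniform(-1, 1) for _ in range(256)]
    xs = [rng.uniform(-1, 1) for _ in range(256)]
    exact = noisy_dot(weights, xs, 0.0, rng)       # error-free quantized result
    approx = noisy_dot(weights, xs, 1e-3, rng)     # with injected bit errors
    print("relative deviation:", abs(approx - exact) / max(abs(exact), 1e-12))
```

At low bit-error rates the deviation stays small relative to the accumulated sum, which is the intuition behind trading MLC reliability for density in error-forgiving NN workloads; the paper's actual contribution is the precision-tunable buffer that controls this trade-off at runtime.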
Funding: National Natural Science Foundation of China [61504153, 61522406, 61532017, 61521092]
WOS Research Areas: Computer Science; Engineering
Language: English
Publisher: IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
WOS Accession Number: WOS:000398858800009
Content Type: Journal article
Source URL: [http://119.78.100.204/handle/2XEOYT63/7280]
Collection: Institute of Computing Technology, Chinese Academy of Sciences, Journal Papers (English)
Corresponding Author: Wang, Ying
Affiliations:
1. Chinese Acad Sci, Inst Comp Technol, State Key Lab Comp Architecture, Beijing 100190, Peoples R China
2. Univ Chinese Acad Sci, Beijing 100190, Peoples R China
Recommended Citation:
GB/T 7714: Song, Lili, Wang, Ying, Han, Yinhe, et al. STT-RAM Buffer Design for Precision-Tunable General-Purpose Neural Network Accelerator[J]. IEEE TRANSACTIONS ON VERY LARGE SCALE INTEGRATION (VLSI) SYSTEMS, 2017, 25(4): 1285-1296.
APA: Song, Lili, Wang, Ying, Han, Yinhe, Li, Huawei, Cheng, Yuanqing, & Li, Xiaowei. (2017). STT-RAM Buffer Design for Precision-Tunable General-Purpose Neural Network Accelerator. IEEE TRANSACTIONS ON VERY LARGE SCALE INTEGRATION (VLSI) SYSTEMS, 25(4), 1285-1296.
MLA: Song, Lili, et al. "STT-RAM Buffer Design for Precision-Tunable General-Purpose Neural Network Accelerator." IEEE TRANSACTIONS ON VERY LARGE SCALE INTEGRATION (VLSI) SYSTEMS 25.4 (2017): 1285-1296.