Symmetric-threshold ReLU for Fast and Nearly Lossless ANN-SNN Conversion
Jianing Han²
Journal: Machine Intelligence Research
Year: 2023
Volume: 20, Issue: 3, Pages: 435-446
Keywords: Symmetric-threshold rectified linear unit (stReLU), deep spiking neural networks, artificial neural network-spiking neural network (ANN-SNN) conversion, lossless conversion, double thresholds
ISSN: 2731-538X
DOI: 10.1007/s11633-022-1388-2
Abstract: Artificial neural network-spiking neural network (ANN-SNN) conversion, as an efficient algorithm for training deep SNNs, improves the performance of shallow SNNs and extends their application to various tasks. However, existing conversion methods still face the problem of large conversion error within low conversion time steps. In this paper, a heuristic symmetric-threshold rectified linear unit (stReLU) activation function for ANNs is proposed, based on the intrinsically different responses between the integrate-and-fire (IF) neurons in SNNs and the activation functions in ANNs. The negative threshold in stReLU guarantees the conversion of negative activations, and the symmetric thresholds enable positive errors to offset negative errors between the activation value and the spike firing rate, thus reducing the conversion error from ANNs to SNNs. The lossless conversion from ANNs with stReLU to SNNs is demonstrated by theoretical formulation. By contrasting stReLU with asymmetric-threshold LeakyReLU and threshold ReLU, the effectiveness of symmetric thresholds is further explored. The results show that ANNs with stReLU can decrease the conversion error and achieve nearly lossless conversion on the MNIST, Fashion-MNIST, and CIFAR10 datasets, with a 6× to 250× speedup compared with other methods. Moreover, a comparison of energy consumption between ANNs and SNNs indicates that this novel conversion algorithm can also significantly reduce energy consumption.
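
The paper's equations are not reproduced in this record, but the abstract's core mechanism can be illustrated with a minimal sketch: an stReLU-style activation clips inputs to a symmetric range [-theta, +theta], and an integrate-and-fire neuron with matching positive and negative thresholds produces a signed firing rate that approximates the clipped value, so quantization errors of opposite sign can cancel. The names st_relu and if_firing_rate and the constant-input simulation below are hypothetical illustrations based only on the abstract, not the paper's actual formulation.

def st_relu(x: float, theta: float = 1.0) -> float:
    """Sketch of a symmetric-threshold ReLU: clip x to [-theta, +theta],
    so negative activations are preserved up to the negative threshold."""
    return max(-theta, min(theta, x))

def if_firing_rate(x: float, theta: float = 1.0, steps: int = 32) -> float:
    """Signed spike rate of an integrate-and-fire neuron with symmetric
    thresholds +/-theta, driven by a constant input x; the rate is scaled
    by theta so it is directly comparable to st_relu(x)."""
    v, spikes = 0.0, 0
    for _ in range(steps):
        v += x                 # integrate the constant input
        if v >= theta:         # positive spike with soft reset
            v -= theta
            spikes += 1
        elif v <= -theta:      # negative spike with soft reset
            v += theta
            spikes -= 1
    return theta * spikes / steps

if __name__ == "__main__":
    for x in (-1.5, -0.4, 0.3, 2.0):
        print(f"x={x:+.2f}  stReLU={st_relu(x):+.3f}  "
              f"IF rate={if_firing_rate(x):+.3f}")

As the number of time steps grows, the signed rate converges to the clipped activation, which mirrors the abstract's claim that positive and negative errors between the activation value and the spike firing rate offset each other, yielding nearly lossless conversion.
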
Content Type: Journal Article
Source URL: http://ir.ia.ac.cn/handle/173211/51711
Collection: Institute of Automation_Academic Journals_International Journal of Automation and Computing
Author Affiliations:
1. Zhejiang Lab, Hangzhou 311121, China
2. College of Computer Science and Technology, Zhejiang University, Hangzhou 310027, China
Recommended Citation:
GB/T 7714
Jianing Han. Symmetric-threshold ReLU for Fast and Nearly Lossless ANN-SNN Conversion[J]. Machine Intelligence Research,2023,20(3):435-446.
APA: Jianing Han. (2023). Symmetric-threshold ReLU for Fast and Nearly Lossless ANN-SNN Conversion. Machine Intelligence Research, 20(3), 435-446.
MLA: Jianing Han. "Symmetric-threshold ReLU for Fast and Nearly Lossless ANN-SNN Conversion." Machine Intelligence Research 20.3 (2023): 435-446.