Signed Neuron with Memory: Towards Simple, Accurate and High-Efficient ANN-SNN Conversion
Yuchen Wang, Malu Zhang, Yi Chen, Hong Qu

Proceedings of the Thirty-First International Joint Conference on Artificial Intelligence
Main Track. Pages 2501-2508. https://doi.org/10.24963/ijcai.2022/347

Spiking Neural Networks (SNNs) are receiving increasing attention due to their biological plausibility and their potential for ultra-low-power, event-driven neuromorphic hardware implementation. Because of the complex temporal dynamics and the discontinuity of spikes, training SNNs directly usually demands substantial computing resources and long training times. As an alternative, an SNN can be converted from a pre-trained artificial neural network (ANN) to bypass the difficulty of SNN learning. However, existing ANN-to-SNN methods neglect the inconsistency of information transmission between synchronous ANNs and asynchronous SNNs. In this work, we first analyze how asynchronous spikes in SNNs may cause conversion errors between the ANN and the SNN. To address this problem, we propose a signed neuron with a memory function, which enables almost no accuracy loss during the conversion process and preserves the asynchronous-transmission property of the converted SNNs. We further propose a new normalization method, named neuron-wise normalization, to significantly shorten the inference latency of the converted SNNs. We conduct experiments on challenging datasets including CIFAR10 (95.44% top-1), CIFAR100 (78.3% top-1), and ImageNet (73.16% top-1). Experimental results demonstrate that the proposed method outperforms state-of-the-art works in terms of accuracy and inference time. The code is available at https://github.com/ppppps/ANN2SNNConversion_SNM_NeuronNorm.
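To make the idea of a signed neuron with memory concrete, the following is a minimal, assumption-based sketch. The abstract does not give the paper's exact update rule, so the class name, threshold parameter, and soft-reset dynamics here are illustrative: the neuron integrates input into a persistent membrane potential (the "memory"), emits a +1 spike when the potential crosses the positive threshold, and emits a -1 spike to correct overshoot when it crosses the negative threshold, retaining any residual charge across time steps.

```python
class SignedNeuronWithMemory:
    """Illustrative signed integrate-and-fire neuron (not the paper's exact model).

    The membrane potential persists across time steps, so charge that does
    not yet reach the threshold is remembered rather than discarded, and
    negative spikes can cancel earlier positive spikes that overshot.
    """

    def __init__(self, threshold=1.0):
        self.threshold = threshold
        self.v = 0.0  # persistent membrane potential (the "memory")

    def step(self, input_current):
        # Integrate the incoming current into the stored potential.
        self.v += input_current
        if self.v >= self.threshold:
            self.v -= self.threshold  # soft reset keeps the residual charge
            return 1                  # positive spike
        if self.v <= -self.threshold:
            self.v += self.threshold  # soft reset in the negative direction
            return -1                 # negative spike corrects overshoot
        return 0                      # no spike; charge is remembered


# Example: a transient negative input triggers a corrective -1 spike.
neuron = SignedNeuronWithMemory(threshold=1.0)
spikes = [neuron.step(x) for x in [0.6, 0.6, -1.5, 0.4]]
print(spikes)  # → [0, 1, -1, 0]
```

With the soft reset, the sum of the inputs equals the signed spike count times the threshold plus the residual potential, which is the property that lets spike rates approximate ANN activations without losing sub-threshold information.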
Keywords:
Humans and AI: Cognitive Modeling
Humans and AI: Applications
Humans and AI: Cognitive Systems