• Journal of the China Computer Federation (CCF)
  • Chinese Science and Technology Core Journal
  • Chinese Core Journal

Computer Engineering & Science ›› 2025, Vol. 47 ›› Issue (10): 1737-1744.

• High Performance Computing •

A spiking neural network accelerator based on approximate computing

XU Weikang, SUN Yan, ZHANG Jianmin

  1. (College of Computer Science and Technology, National University of Defense Technology, Changsha 410073, China)
  • Online: 2025-10-25  Published: 2025-10-28

Abstract: Spiking neural networks (SNNs) simulate biological neurons more closely than conventional artificial neural networks, and their high energy efficiency makes them exceptionally well suited to edge and end-device computing scenarios. However, in applications highly sensitive to power consumption, further reducing power remains a crucial objective. Approximate computing simplifies hardware design by tolerating a controlled degree of error, opening new opportunities for energy-efficient hardware design for fault-tolerant applications. This paper explores methods for applying approximate computing to SNN accelerators. First, through analysis and experiments tailored to the application characteristics of SNNs, the input-data distribution characteristics of the numerous adders in SNN accelerators are summarized. Based on these characteristics, an application-sensitive error evaluation metric for approximate arithmetic components, named AARE (application-aware approximation error), is proposed. Using this metric together with the optimal approximate adder selection strategy introduced in this paper, more appropriate approximate arithmetic components can be selected for a given application. Building on this, an approximate-computing-based SNN hardware accelerator, AxSpike, is implemented with open-source EDA tools and PDKs, along with a corresponding simulator developed with snnTorch. Experimental results show that the accelerator reduces power consumption by 37.32% and area by 31.26%, with only a 3.47 percentage-point decrease in accuracy, significantly improving the energy efficiency of SNN hardware accelerators.
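The abstract does not give the adder designs or the exact definition of AARE. As a rough illustration only, the sketch below pairs a classic lower-part-OR approximate adder (LOA, a well-known design from the approximate-computing literature) with an error metric weighted by an application's input-operand distribution, which is one plausible reading of an "application-aware" error measure. The names (`loa_add`, `weighted_error`) and the weighting scheme are assumptions, not the paper's definitions.

```python
# Hypothetical sketch, not the paper's AARE: a lower-part-OR adder (LOA)
# and an input-distribution-weighted mean absolute error.

def loa_add(a: int, b: int, k: int, width: int = 8) -> int:
    """LOA: exact addition on the upper bits, bitwise OR on the low k bits."""
    mask = (1 << k) - 1
    low = (a & mask) | (b & mask)       # approximate low part: OR instead of add
    high = ((a >> k) + (b >> k)) << k   # exact upper part; carry from low bits dropped
    return (high | low) & ((1 << width) - 1)

def weighted_error(approx_add, dist, width: int = 8) -> float:
    """Mean absolute error, weighted by the probability of each input pair.

    dist maps (a, b) operand pairs to their observed frequency in the
    target application, so frequent operand patterns dominate the score.
    """
    total = 0.0
    for (a, b), p in dist.items():
        exact = (a + b) & ((1 << width) - 1)
        total += p * abs(approx_add(a, b) - exact)
    return total
```

A selection strategy in this spirit would profile operand pairs from the target SNN workload into `dist`, score each candidate adder with `weighted_error`, and pick the lowest-scoring one that meets the power and area budget; small operands (common in SNN accumulation) exercise mainly the approximated low bits, so a distribution-weighted score can rank adders very differently from a uniform-input error metric.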


Key words: approximate computing, spiking neural network, hardware accelerator, high energy efficiency, low power consumption