• Journal of the China Computer Federation
  • China Science and Technology Core Journal
  • Chinese Core Journal

Computer Engineering & Science


A convolutional neural network accelerator
 based on Winograd-Sparse algorithm
 

XU Rui,MA Sheng,GUO Yang,HUANG You,LI Yi-huang   

  1. (School of Computer,National University of Defense Technology,Changsha 410073,China)
  • Received:2018-12-11 Revised:2019-04-08 Online:2019-09-25 Published:2019-09-25

Abstract:

As convolutional neural networks become widely deployed, more researchers are turning to customized hardware accelerators to cope with their heavy computational load. However, most current hardware accelerators use the traditional convolution algorithm, lack support for sparse neural networks, and leave little room for improvement from an algorithmic perspective. We design a convolutional neural network accelerator based on the Winograd-Sparse algorithm and show that it effectively reduces the computational complexity of convolutional neural networks while also adapting well to sparse networks. Combining the hardware with the algorithm achieves considerable computational efficiency while reducing hardware resource usage. Experiments show that, compared with the traditional algorithm, our accelerator improves the computation speed by nearly 4.15 times. In terms of multiplier utilization, it increases the utilization rate by up to 9 times compared with other existing designs.
 

Key words: convolutional neural network, accelerator, Winograd algorithm, sparse network