• A journal of the China Computer Federation (CCF)
  • China Science and Technology Core Journal
  • Chinese Core Journal

Computer Engineering & Science

• High Performance Computing

  • Funding:

    National Natural Science Foundation of China (61672526); Research Project of the National University of Defense Technology (ZK17-03-06)

A convolutional neural network accelerator based on the Winograd sparse algorithm


XU Rui,MA Sheng,GUO Yang,HUANG You,LI Yi-huang   

  1. (School of Computer,National University of Defense Technology,Changsha 410073,China)
  • Received:2018-12-11 Revised:2019-04-08 Online:2019-09-25 Published:2019-09-25


Abstract:

As convolutional neural networks are widely used, customized hardware accelerators for their complex computations are receiving increasing attention. However, most current hardware accelerators use the traditional convolution algorithm and lack support for sparse neural networks, which leaves little room for algorithmic improvement. We design a convolutional neural network accelerator based on the Winograd sparse algorithm, which has been shown to effectively reduce the computational complexity of convolutional neural networks and to adapt well to sparse neural networks. By implementing the algorithm in hardware, our design achieves considerable computational efficiency while reducing hardware resources. Experiments show that, compared with the traditional algorithm, our accelerator improves the computation speed by nearly 4.15 times. In terms of multiplier utilization, our design improves the utilization rate by up to nearly 9 times compared with other existing designs.
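To make the source of the speedup concrete, the following is a minimal sketch (not taken from the paper) of the 1-D Winograd transform F(2,3), which produces two outputs of a 3-tap convolution with 4 multiplications instead of 6. The paper's accelerator uses the 2-D form of the algorithm and additionally exploits sparsity by skipping multiplications with zero-valued weights; this sketch only illustrates the basic multiplication-count reduction.

```python
def winograd_f23(d, g):
    """1-D Winograd F(2,3): 4 input samples d, 3 filter taps g -> 2 outputs.

    Uses 4 multiplications; direct convolution needs 6.
    """
    d0, d1, d2, d3 = d
    g0, g1, g2 = g
    # Filter transform (in practice precomputed once per filter).
    G0 = g0
    G1 = (g0 + g1 + g2) / 2
    G2 = (g0 - g1 + g2) / 2
    G3 = g2
    # Element-wise products: the only 4 multiplications.
    m0 = (d0 - d2) * G0
    m1 = (d1 + d2) * G1
    m2 = (d2 - d1) * G2
    m3 = (d1 - d3) * G3
    # Output transform: additions and subtractions only.
    return [m0 + m1 + m2, m1 - m2 - m3]

def direct_conv(d, g):
    """Reference: direct 3-tap convolution with 6 multiplications."""
    return [sum(d[i + j] * g[j] for j in range(3)) for i in range(2)]

# Both paths compute the same two outputs.
print(winograd_f23([1, 2, 3, 4], [1, 0, -1]))
print(direct_conv([1, 2, 3, 4], [1, 0, -1]))
```

Because the filter transform can be precomputed, the per-tile cost at run time is 4 multiplications plus a few additions, which is where the reported speedup over the traditional convolution algorithm originates.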
 

Key words: convolutional neural network, accelerator, Winograd algorithm, sparse network