• Journal of the China Computer Federation
  • China Science and Technology Core Journal
  • Chinese Core Journal

Computer Engineering & Science ›› 2021, Vol. 43 ›› Issue (11): 2043-2048.


A deep neural network model compression method based on Adams shortcut connection

DU Peng, LI Chao, SHI Jian-ping, JIANG Lin

  1. (Faculty of Science, Kunming University of Science and Technology, Kunming 650500, China)
  • Received: 2020-04-03  Revised: 2020-09-07  Accepted: 2021-11-25  Online: 2021-11-25  Published: 2021-11-23

Abstract: Deep neural networks have achieved major breakthroughs in a wide range of computer vision tasks, but network structure design still lacks guiding principles. Extensive theoretical and empirical evidence shows that network depth is key to this success, yet the trainability of deep networks remains an open problem. This paper applies a numerical method for differential equations, the Adams method, to the weight learning of deep neural networks and proposes a shortcut connection based on the Adams method. The proposed connection improves the learning accuracy of the later layers of the network, compresses the model size, and makes the model more efficient; the improvement in trainability is especially pronounced for deep networks with relatively few layers. Taking the classic ResNet as an example, this paper compares Adams-ResNet, which adopts the proposed Adams-based shortcut connection, with the source model on CIFAR-10: Adams-ResNet improves recognition accuracy while halving the parameters of the source model.
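As a rough illustration of the idea (a sketch, not the authors' code), the example below assumes the common reading of a residual block x_{n+1} = x_n + f(x_n) as a forward Euler step and replaces it with a two-step Adams-Bashforth update x_{n+1} = x_n + (3/2) f(x_n) - (1/2) f(x_{n-1}), so each block reuses the residual computed by the previous block. The class name AdamsResidualBlock, the fixed coefficients, and the choice of a PyTorch basic block are illustrative assumptions and are not taken from the paper.

    # Minimal sketch of an Adams-Bashforth-style shortcut connection (assumed
    # formulation, not the paper's exact construction).
    import torch
    import torch.nn as nn

    class AdamsResidualBlock(nn.Module):
        def __init__(self, channels):
            super().__init__()
            # Ordinary ResNet-style residual branch f(x).
            self.f = nn.Sequential(
                nn.Conv2d(channels, channels, 3, padding=1, bias=False),
                nn.BatchNorm2d(channels),
                nn.ReLU(inplace=True),
                nn.Conv2d(channels, channels, 3, padding=1, bias=False),
                nn.BatchNorm2d(channels),
            )
            self.relu = nn.ReLU(inplace=True)

        def forward(self, x, f_prev=None):
            f_cur = self.f(x)
            if f_prev is None:
                # First block: no history yet, fall back to forward Euler
                # (the standard residual connection x + f(x)).
                out = x + f_cur
            else:
                # Two-step Adams-Bashforth update reusing the previous
                # block's residual: x + 3/2 f(x_n) - 1/2 f(x_{n-1}).
                out = x + 1.5 * f_cur - 0.5 * f_prev
            return self.relu(out), f_cur

    # Usage: chain blocks and thread the previous residual through them.
    blocks = nn.ModuleList([AdamsResidualBlock(16) for _ in range(3)])
    x = torch.randn(2, 16, 32, 32)
    f_prev = None
    for blk in blocks:
        x, f_prev = blk(x, f_prev)

Because each update reuses the residual already computed by the preceding block, a multi-step scheme of this kind can reach a given accuracy with fewer blocks than the one-step (forward Euler) form, which is consistent with the parameter reduction reported in the abstract.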

Key words: deep neural network, numerical method for differential equations, Adams method, shortcut connection