• Journal of the China Computer Federation
  • Chinese Science and Technology Core Journal
  • Chinese Core Journal

Computer Engineering & Science ›› 2021, Vol. 43 ›› Issue (10): 1789-1795.


Unsupervised image style transfer based on generative adversarial networks

LAN Tian, XIN Yue-lan, YIN Xiao-fang, LIU Wei-ming, JIANG Xing-yu

  1. (College of Physics & Electronic Information Engineering,Qinghai Normal University,Xining 810001,China)
  • Received: 2020-05-11  Revised: 2020-09-08  Accepted: 2021-10-25  Online: 2021-10-25  Published: 2021-10-22

Abstract: Unsupervised image style transfer is an important and challenging problem in computer vision. It aims to map images of a given class to analogous images of another class. In general, paired training datasets are difficult to obtain, which greatly limits image style transfer models. To avoid this limitation, this paper improves an existing unsupervised image style transfer method and adopts an improved cycle-consistency adversarial network to perform unsupervised image style transfer. Firstly, to speed up network training and avoid vanishing gradients, a DenseNet is introduced into the generator of the traditional cycle-consistency network. To improve generator performance, an attention mechanism is introduced into the generator network so that it outputs better images. To reduce the structural risk of the network, spectral normalization is applied to every convolutional layer. To verify the effectiveness of the proposed method, experiments are carried out on the monet2photo, vangogh2photo, and facades datasets; the experimental results show that the average Inception Score and FID evaluation metrics are improved.
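The spectral normalization mentioned in the abstract constrains each layer's weight matrix to have spectral norm (largest singular value) close to 1, which bounds the Lipschitz constant of the network. A minimal NumPy sketch of the underlying idea, assuming the standard power-iteration estimate used in practice (the function name and shapes are illustrative, not the authors' implementation):

```python
import numpy as np

def spectral_normalize(W, n_iter=30):
    """Divide W by an estimate of its largest singular value,
    obtained by power iteration, so that ||W||_2 is ~1."""
    rng = np.random.RandomState(0)
    u = rng.randn(W.shape[0])          # left singular vector estimate
    for _ in range(n_iter):
        v = W.T @ u
        v /= np.linalg.norm(v)         # right singular vector estimate
        u = W @ v
        u /= np.linalg.norm(u)
    sigma = u @ W @ v                  # approximate largest singular value
    return W / sigma

# The largest singular value of the normalized matrix is ~1.
W = np.random.RandomState(1).randn(8, 16)
W_sn = spectral_normalize(W)
print(np.linalg.svd(W_sn, compute_uv=False)[0])
```

In a convolutional layer the 4-D kernel is first reshaped to a 2-D matrix before this normalization is applied; frameworks such as PyTorch expose the same operation as `torch.nn.utils.spectral_norm`.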


Key words: style transfer, generative adversarial network, attention mechanism, spectral normalization