Computer Engineering & Science ›› 2021, Vol. 43 ›› Issue (10): 1789-1795.
LAN Tian,XIN Yue-lan,YIN Xiao-fang,LIU Wei-ming,JIANG Xing-yu
Abstract: Unsupervised image style transfer is an important and challenging problem in computer vision. Its goal is to map images of a given class to visually similar images of another class. Paired training datasets are generally difficult to obtain, which greatly limits style transfer models. To avoid this limitation, this paper improves an existing unsupervised image style transfer method and adopts an improved cycle-consistency adversarial network for unsupervised image style transfer. First, to speed up training and mitigate vanishing gradients, DenseNet-style connections are introduced into the generator of the traditional cycle-consistency network. To further improve generator performance, an attention mechanism is added to the generator so that it outputs better images. To reduce the structural risk of the network, spectral normalization is applied to every convolutional layer. To verify the effectiveness of the proposed method, experiments are carried out on the monet2photo, vangogh2photo, and facades datasets; the results show improvements in the average Inception Score and FID evaluation metrics.
Key words: style transfer, generative adversarial network, attention mechanism, spectral normalization
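To make the three architectural changes named in the abstract concrete, the sketch below shows one way a single generator stage could combine densely connected convolutions, a self-attention layer, and spectral normalization, using standard PyTorch building blocks. This is an illustrative assumption and not the authors' released code; the module and parameter names (sn_conv, DenseBlock, SelfAttention, GeneratorStage, growth) are hypothetical.

```python
# Minimal sketch (assumed, not the paper's implementation): a generator stage
# combining DenseNet-style connectivity, SAGAN-style self-attention, and
# spectral normalization on every convolution.
import torch
import torch.nn as nn
from torch.nn.utils import spectral_norm


def sn_conv(in_ch, out_ch, kernel_size=3, padding=1):
    """Convolution wrapped with spectral normalization to bound its Lipschitz constant."""
    return spectral_norm(nn.Conv2d(in_ch, out_ch, kernel_size, padding=padding))


class DenseBlock(nn.Module):
    """DenseNet-style block: each layer sees the concatenation of all previous feature maps."""
    def __init__(self, in_ch, growth=32, n_layers=4):
        super().__init__()
        self.layers = nn.ModuleList()
        ch = in_ch
        for _ in range(n_layers):
            self.layers.append(nn.Sequential(
                nn.InstanceNorm2d(ch),
                nn.ReLU(inplace=True),
                sn_conv(ch, growth),
            ))
            ch += growth
        self.out_channels = ch

    def forward(self, x):
        features = [x]
        for layer in self.layers:
            features.append(layer(torch.cat(features, dim=1)))
        return torch.cat(features, dim=1)


class SelfAttention(nn.Module):
    """Self-attention over spatial positions, added residually with a learnable weight."""
    def __init__(self, ch):
        super().__init__()
        self.query = sn_conv(ch, ch // 8, kernel_size=1, padding=0)
        self.key = sn_conv(ch, ch // 8, kernel_size=1, padding=0)
        self.value = sn_conv(ch, ch, kernel_size=1, padding=0)
        self.gamma = nn.Parameter(torch.zeros(1))  # residual weight, starts at 0

    def forward(self, x):
        b, c, h, w = x.shape
        q = self.query(x).flatten(2).transpose(1, 2)      # (b, hw, c//8)
        k = self.key(x).flatten(2)                        # (b, c//8, hw)
        attn = torch.softmax(torch.bmm(q, k), dim=-1)     # (b, hw, hw)
        v = self.value(x).flatten(2)                      # (b, c, hw)
        out = torch.bmm(v, attn.transpose(1, 2)).view(b, c, h, w)
        return self.gamma * out + x


class GeneratorStage(nn.Module):
    """One generator stage: dense block, then attention, then a 1x1 projection."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.dense = DenseBlock(in_ch)
        self.attn = SelfAttention(self.dense.out_channels)
        self.project = sn_conv(self.dense.out_channels, out_ch, kernel_size=1, padding=0)

    def forward(self, x):
        return self.project(self.attn(self.dense(x)))


if __name__ == "__main__":
    x = torch.randn(1, 64, 64, 64)            # (batch, channels, H, W)
    print(GeneratorStage(64, 64)(x).shape)    # torch.Size([1, 64, 64, 64])
```

In a CycleGAN-style setup, stages like this would replace the plain residual blocks in each of the two generators, while the cycle-consistency and adversarial losses remain unchanged.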
LAN Tian, XIN Yue-lan, YIN Xiao-fang, LIU Wei-ming, JIANG Xing-yu. Unsupervised image style transfer based on generating adversarial network[J]. Computer Engineering & Science, 2021, 43(10): 1789-1795.
URL: http://joces.nudt.edu.cn/EN/
http://joces.nudt.edu.cn/EN/Y2021/V43/I10/1789