Computer Engineering & Science ›› 2022, Vol. 44 ›› Issue (01): 118-123.
WANG Yang1,2,YU Zhen-xin1,2,LU Jia1,2
Abstract: Performing style transfer on only part of an image usually causes style overflow, and stylization of smaller regions produces insignificant effects. To address this problem, a style transfer algorithm for image saliency regions is proposed. First, following the characteristics of the human visual attention mechanism, the saliency regions in the training dataset are labeled, and a fast semantic segmentation model is trained to produce a binary mask covering the saliency regions of the image. Then, the network layer structure of the fast neural style transfer model is simplified, and instance normalization layers are adopted in the generator network, yielding a more realistic overall style transfer result. Finally, the binary mask obtained by semantic segmentation is combined with the overall style-transferred image to output the final result. Comparative experiments on the Cityscapes and Microsoft COCO 2017 datasets show that local target regions in the image are stylized uniformly and delicately and blend well with the background; the method achieves a more realistic style transfer effect while remaining more efficient.
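The final step of the pipeline, compositing the fully stylized image with the original using the binary saliency mask, can be sketched as below. This is a minimal illustration of mask-based blending, not the paper's implementation; the function and variable names are assumptions.

```python
import numpy as np

def composite_salient_style(original, stylized, mask):
    """Blend stylized pixels into the original where the mask is 1.

    original, stylized: float arrays of shape (H, W, 3)
    mask: binary array of shape (H, W); 1 marks the saliency region
    """
    # Add a channel axis so the mask broadcasts over RGB channels.
    m = mask[..., None].astype(original.dtype)
    # Inside the saliency region take the stylized pixel,
    # elsewhere keep the original background pixel.
    return m * stylized + (1.0 - m) * original
```

Because the mask is binary, the blend reduces to a per-pixel selection; a soft (feathered) mask would instead produce a gradual transition at the region boundary.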
Key words: style transfer, saliency area detection, semantic segmentation, convolutional neural network
WANG Yang, YU Zhen-xin, LU Jia. An image saliency area style transfer method combining visual attention mechanism[J]. Computer Engineering & Science, 2022, 44(01): 118-123.
URL: http://joces.nudt.edu.cn/EN/Y2022/V44/I01/118