Biblio
Filters: Author is Xie, Huan
Deep Content Guidance Network for Arbitrary Style Transfer. 2021 International Joint Conference on Neural Networks (IJCNN). 1-8.
2021. Arbitrary style transfer refers to generating a new image from any pair of existing images, such that the generated image retains the content structure of one image and the style pattern of the other. Recent arbitrary style transfer algorithms typically perform well at either content preservation or style transfer, but struggle to strike a trade-off between the two. In this paper, we propose the Deep Content Guidance Network (DCGN), which is built by stacking content guidance (CG) layers. Each CG layer consists of a position self-attention (pSA) module, a channel self-attention (cSA) module, and a content guidance attention (cGA) module. Specifically, the pSA module extracts more effective content information from the spatial layout of content images, while the cSA module enriches the style representation of style images along the channel dimension. From a non-local view, the cGA module uses the content information to guide the distribution of style features, which yields a more detailed style expression. Moreover, we introduce a new permutation loss to generalize feature expression, so that rich feature expressions are obtained while the content structure is maintained. Qualitative and quantitative experiments verify that our approach produces better stylized images than state-of-the-art methods.
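The abstract describes pSA and cSA as self-attention over the spatial and channel dimensions of feature maps. The following is a minimal PyTorch sketch of generic position and channel self-attention blocks (in the style of non-local attention), included only to illustrate the kind of modules involved; the class names, projection sizes, and residual scaling are illustrative assumptions, not the paper's actual DCGN implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class PositionSelfAttention(nn.Module):
    """Spatial (position) self-attention over a feature map: every spatial
    location attends to every other location. Illustrative sketch only."""

    def __init__(self, channels: int):
        super().__init__()
        # 1x1 convolutions produce query, key, and value projections.
        self.query = nn.Conv2d(channels, channels // 8, kernel_size=1)
        self.key = nn.Conv2d(channels, channels // 8, kernel_size=1)
        self.value = nn.Conv2d(channels, channels, kernel_size=1)
        # Learnable scale on the attention output, initialized to zero.
        self.gamma = nn.Parameter(torch.zeros(1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        n = h * w
        q = self.query(x).view(b, -1, n).permute(0, 2, 1)      # B x N x C'
        k = self.key(x).view(b, -1, n)                          # B x C' x N
        attn = F.softmax(torch.bmm(q, k), dim=-1)               # B x N x N
        v = self.value(x).view(b, -1, n)                        # B x C x N
        out = torch.bmm(v, attn.permute(0, 2, 1)).view(b, c, h, w)
        return self.gamma * out + x                             # residual connection


class ChannelSelfAttention(nn.Module):
    """Channel self-attention: the attention map is C x C, so each channel
    is re-weighted by its affinity to every other channel."""

    def __init__(self):
        super().__init__()
        self.gamma = nn.Parameter(torch.zeros(1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        feat = x.view(b, c, -1)                                              # B x C x N
        attn = F.softmax(torch.bmm(feat, feat.permute(0, 2, 1)), dim=-1)     # B x C x C
        out = torch.bmm(attn, feat).view(b, c, h, w)
        return self.gamma * out + x


if __name__ == "__main__":
    # Dummy feature map standing in for an encoded content image.
    content = torch.randn(1, 64, 32, 32)
    print(PositionSelfAttention(64)(content).shape)  # torch.Size([1, 64, 32, 32])
    print(ChannelSelfAttention()(content).shape)     # torch.Size([1, 64, 32, 32])
```

Both blocks preserve the input shape, so stacking or combining them (as the CG layer does with its cGA module) leaves the feature-map dimensions unchanged.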