Title | Deep Content Guidance Network for Arbitrary Style Transfer |
Publication Type | Conference Paper |
Year of Publication | 2021 |
Authors | Shi, Di-Bo, Xie, Huan, Ji, Yi, Li, Ying, Liu, Chun-Ping |
Conference Name | 2021 International Joint Conference on Neural Networks (IJCNN) |
Date Published | July |
Keywords | arbitrary style transfer, Correlation, Deep Content Guidance Network, Layout, Metrics, Neural networks, neural style transfer, Permutation loss, Semantics, Stacking, Transforms |
Abstract | Arbitrary style transfer refers to generating a new image from any pair of existing images, where the generated image retains the content structure of one and the style pattern of the other. Recent arbitrary style transfer algorithms typically perform well on either content retention or style transfer, but struggle to strike a trade-off between the two. In this paper, we propose the Deep Content Guidance Network (DCGN), which is built by stacking content guidance (CG) layers; each CG layer comprises a position self-attention (pSA) module, a channel self-attention (cSA) module and a content guidance attention (cGA) module. Specifically, the pSA module extracts more effective content information from the spatial layout of content images, while the cSA module enriches the style representation of style images along the channel dimension. From a non-local view, the cGA module uses content information to guide the distribution of style features, yielding a more detailed style expression. Moreover, we introduce a new permutation loss that generalizes feature expression, obtaining rich feature expressions while maintaining content structure. Qualitative and quantitative experiments verify that our approach produces better stylized images than state-of-the-art methods. |
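The abstract names two attention mechanisms: position self-attention over spatial locations and channel self-attention over feature channels. The paper's exact formulation (learned query/key/value projections, normalization, residual connections) is not given in this record, so the sketch below is only a minimal NumPy illustration of the two affinity patterns in their simplest form; the function names and the bare dot-product affinities are assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def position_self_attention(feat):
    """Attend over the H*W spatial positions of a (C, H, W) feature map.

    Each output position is a weighted sum of all positions, weighted by
    a softmax over pairwise feature similarities (the non-local pattern).
    """
    C, H, W = feat.shape
    f = feat.reshape(C, H * W)             # (C, N) with N = H*W
    attn = softmax(f.T @ f, axis=-1)       # (N, N) position-to-position affinity
    out = f @ attn.T                       # aggregate features across positions
    return out.reshape(C, H, W)

def channel_self_attention(feat):
    """Attend over the C channels of a (C, H, W) feature map.

    Each output channel is a weighted sum of all channels, which lets the
    style representation mix correlated channel statistics.
    """
    C, H, W = feat.shape
    f = feat.reshape(C, H * W)             # (C, N)
    attn = softmax(f @ f.T, axis=-1)       # (C, C) channel-to-channel affinity
    out = attn @ f                         # aggregate features across channels
    return out.reshape(C, H, W)
```

Both modules preserve the input shape, so they can be stacked (as the CG layers are, per the abstract) or combined with a guidance module that mixes content-derived affinities with style features.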
DOI | 10.1109/IJCNN52387.2021.9533953 |
Citation Key | shi_deep_2021 |