Multi-Style Transfer Generative Adversarial Network for Text Images

Title: Multi-Style Transfer Generative Adversarial Network for Text Images
Publication Type: Conference Paper
Year of Publication: 2021
Authors: Yuan, Honghui; Yanai, Keiji
Conference Name: 2021 IEEE 4th International Conference on Multimedia Information Processing and Retrieval (MIPR)
Keywords: Conferences, Deep Learning, font translation, gan, generative adversarial networks, information processing, Metrics, multi-style, neural style transfer, pubcrawl, resilience, Resiliency, Scalability, style transfer, Task Analysis, text images
Abstract: In recent years, neural style transfer has shown impressive results in deep learning. In particular, recent work on text style transfer has successfully achieved translation from the text font domain to the text style domain. However, transferring multiple styles typically requires training many models, and generating text images in multiple styles with a single model remains an unsolved problem. In this paper, we propose a multi-style transfer network for text images that can generate multiple styles of text images with a single model and control the text style in a simple way. The main idea is to add conditions to the transfer network so that all styles can be trained effectively in one network, and to control the generation of each text style through these conditions. We also optimize the network so that the conditional information is propagated effectively through it. The advantage of the proposed network is that multiple styles of text can be generated with only one model and that the generated text style can be controlled. We have tested the proposed network on a large number of text images and demonstrated that it works well when generating multiple styles of text at the same time.
DOI: 10.1109/MIPR51284.2021.00017
Citation Key: yuan_multi-style_2021
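
The core idea described in the abstract, conditioning a single transfer network on a style label so that one model covers several text styles, can be illustrated with a minimal sketch. The framework (PyTorch), layer shapes, and the channel-concatenation style conditioning below are assumptions chosen for illustration only, not the authors' actual architecture.

# Illustrative sketch (not the paper's code): an image-to-image generator
# conditioned on a style label, so one model can produce several text styles.
import torch
import torch.nn as nn


class ConditionalTextStyleGenerator(nn.Module):
    def __init__(self, num_styles: int, base_channels: int = 64):
        super().__init__()
        self.num_styles = num_styles
        # The style condition is broadcast as extra input channels,
        # so every spatial location "sees" the requested style.
        in_channels = 3 + num_styles
        self.encoder = nn.Sequential(
            nn.Conv2d(in_channels, base_channels, 4, stride=2, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(base_channels, base_channels * 2, 4, stride=2, padding=1),
            nn.ReLU(inplace=True),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(base_channels * 2, base_channels, 4, stride=2, padding=1),
            nn.ReLU(inplace=True),
            nn.ConvTranspose2d(base_channels, 3, 4, stride=2, padding=1),
            nn.Tanh(),  # styled text image in [-1, 1]
        )

    def forward(self, text_image: torch.Tensor, style_id: torch.Tensor) -> torch.Tensor:
        # One-hot encode the requested style and tile it over the image plane.
        b, _, h, w = text_image.shape
        onehot = torch.nn.functional.one_hot(style_id, self.num_styles).float()
        cond = onehot.view(b, self.num_styles, 1, 1).expand(b, self.num_styles, h, w)
        x = torch.cat([text_image, cond], dim=1)
        return self.decoder(self.encoder(x))


# Usage: request style index 2 for a batch of plain text images.
generator = ConditionalTextStyleGenerator(num_styles=5)
plain_text = torch.randn(4, 3, 64, 64)  # placeholder batch
styled = generator(plain_text, torch.full((4,), 2, dtype=torch.long))
print(styled.shape)  # torch.Size([4, 3, 64, 64])

In such a conditional setup, a single set of generator weights is shared across all styles and the one-hot condition selects which style to produce at inference time, which is the practical benefit the abstract highlights over training one model per style.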