A Simple Way of Multimodal and Arbitrary Style Transfer

Title: A Simple Way of Multimodal and Arbitrary Style Transfer
Publication Type: Conference Paper
Year of Publication: 2019
Authors: Nguyen, A., Choi, S., Kim, W., Lee, S.
Conference Name: ICASSP 2019 - 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Date Published: May 2019
Publisher: IEEE
ISBN Number: 978-1-4799-8131-1
Keywords: arbitrary style transfer, convolutional neural network, Deep Learning, Image coding, Image quality, image style transfer, Metrics, multimodal style transfer, neural style transfer, pubcrawl, resilience, Resiliency, Scalability, style encoding subnetwork, unimodal style transfer methods
Abstract

We re-define multimodality and introduce a simple approach to multimodal and arbitrary style transfer. Conventionally, style transfer methods are limited to synthesizing a deterministic output from a single style, and no prior work can generate multiple images with varying details, i.e. multimodality, from a single style. In this work, we explore a way to achieve multimodal and arbitrary style transfer by injecting noise into a unimodal method. This approach requires no trainable parameters and can be readily applied to any unimodal style transfer method in the literature that has a separate style-encoding sub-network. Experimental results show that, while the method can transfer an image to multiple domains in various ways, its image quality remains highly competitive with contemporary style transfer models.
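The abstract describes the core idea only at a high level: perturb the output of a style-encoding sub-network with noise so that one content/style pair yields many stylizations. A minimal NumPy sketch of that idea is below, assuming an AdaIN-style unimodal backbone where the "style code" is the per-channel mean and standard deviation of a style feature map; the function names, the `noise_scale` parameter, and the choice of Gaussian perturbation are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def adain(content, style_mean, style_std, eps=1e-5):
    # Adaptive instance normalization: re-normalize the content
    # feature map (channels, H, W) to match per-channel style stats.
    c_mean = content.mean(axis=(1, 2), keepdims=True)
    c_std = content.std(axis=(1, 2), keepdims=True)
    normalized = (content - c_mean) / (c_std + eps)
    return normalized * style_std + style_mean

def multimodal_transfer(content_feat, style_feat, noise_scale=0.1, seed=None):
    # Hypothetical illustration of the noise-injection idea: perturb the
    # style code (here, channel-wise statistics) with Gaussian noise so
    # that repeated calls yield different stylizations of the same pair.
    rng = np.random.default_rng(seed)
    s_mean = style_feat.mean(axis=(1, 2), keepdims=True)
    s_std = style_feat.std(axis=(1, 2), keepdims=True)
    s_mean = s_mean + noise_scale * rng.standard_normal(s_mean.shape)
    s_std = s_std * (1.0 + noise_scale * rng.standard_normal(s_std.shape))
    return adain(content_feat, s_mean, s_std)

# Two seeds give two distinct outputs for the same content/style pair:
# the "multimodality" the abstract refers to. No trainable parameters
# are introduced by the noise injection itself.
content = np.random.rand(64, 32, 32)  # (channels, H, W) feature map
style = np.random.rand(64, 32, 32)
out_a = multimodal_transfer(content, style, seed=0)
out_b = multimodal_transfer(content, style, seed=1)
```

In a real pipeline these feature maps would come from a pretrained encoder (e.g. VGG) and the result would be passed through a decoder; the sketch only shows why noise on the style code produces diverse outputs without any extra training.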

URL: https://ieeexplore.ieee.org/document/8683493
DOI: 10.1109/ICASSP.2019.8683493
Citation Key: nguyen_simple_2019