Development of Real-Time Style Transfer for Video System
Title | Development of Real-Time Style Transfer for Video System
Publication Type | Conference Paper
Year of Publication | 2019
Authors | Cao, Y., Tang, Y.
Conference Name | 2019 3rd International Conference on Circuits, System and Simulation (ICCSS)
Date Published | June 2019
Publisher | IEEE
ISBN Number | 978-1-7281-3657-8
Keywords | artificial system, artistic images, Artistic Style, computer machine, convolutional neural nets, convolutional neural networks, deep neural networks, defined time loss function, end-to-end system, fast style transformation, feed-forward CNN approach, feed-forward network, feedforward neural nets, high perceptual quality, Image reconstruction, Image resolution, Metrics, Neural Network, Neural networks, neural style transfer, optical flow, optical losses, perceptual loss, pubcrawl, real-time style transfer, Real-time Systems, relatively time-consuming, resilience, Resiliency, Scalability, Streaming media, style reconstruction loss, style transfer, video sequences, video signal processing, video style transfer, video stylization, video system
Abstract | Redrawing an image in a given artistic style is a complicated task for a computer, whereas humans can easily recognize and describe the stylistic differences between images. Researchers working with deep neural networks have found effective representations of artistic style based on a perceptual loss and a style reconstruction loss. Gatys et al. proposed a system built on convolutional neural networks that creates artistic images of high perceptual quality, but it is computationally expensive and therefore unsuitable for video style transfer. More recently, feed-forward CNN approaches have shown the potential for fast style transformation: an end-to-end system that transfers style without hundreds of optimization iterations. We combine the benefits of both approaches, optimizing the feed-forward network and defining a temporal loss function to make real-time style transfer on video possible. Compared with previous methods, ours runs in real time at higher resolution while producing competitive, visually pleasing, and temporally consistent results.
URL | https://ieeexplore.ieee.org/document/8935613 |
DOI | 10.1109/CIRSYSSIM.2019.8935613 |
Citation Key | cao_development_2019 |
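The loss definitions are not reproduced in this record, so the following PyTorch sketch is an illustration rather than the authors' code. It shows a standard formulation of the perceptual (content) loss and Gram-matrix style reconstruction loss the abstract refers to, as commonly used in feed-forward style transfer; the VGG-16 backbone and the specific layer indices are assumptions, not values from the paper.

```python
# Hypothetical sketch of the perceptual + style reconstruction losses.
# Layer indices follow the common relu1_2/relu2_2/relu3_3/relu4_3
# convention; the paper's actual configuration may differ.
import torch
import torch.nn.functional as F
from torchvision.models import vgg16

def gram_matrix(feat):
    """Channel-wise Gram matrix of a feature map, normalized by its size."""
    b, c, h, w = feat.shape
    f = feat.view(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)

class PerceptualLosses(torch.nn.Module):
    def __init__(self, content_layer=8, style_layers=(3, 8, 15, 22)):
        super().__init__()
        self.vgg = vgg16(weights="IMAGENET1K_V1").features.eval()
        for p in self.vgg.parameters():
            p.requires_grad_(False)
        self.content_layer = content_layer
        self.style_layers = set(style_layers)

    def features(self, x):
        """Collect the content feature and style features in one pass."""
        content, styles = None, []
        for i, layer in enumerate(self.vgg):
            x = layer(x)
            if i == self.content_layer:
                content = x
            if i in self.style_layers:
                styles.append(x)
        return content, styles

    def forward(self, stylized, content_img, style_img):
        # Inputs are assumed to be ImageNet-normalized (B, 3, H, W) tensors.
        c_hat, s_hat = self.features(stylized)
        c_ref, _ = self.features(content_img)
        _, s_ref = self.features(style_img)
        content_loss = F.mse_loss(c_hat, c_ref)
        style_loss = sum(F.mse_loss(gram_matrix(a), gram_matrix(b))
                         for a, b in zip(s_hat, s_ref))
        return content_loss, style_loss
```

These two terms train the feed-forward network offline; at inference time only a single forward pass through that network is needed, which is what makes the real-time claim plausible.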
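The "time loss function" the abstract mentions is likewise not specified here. Below is a minimal sketch of a typical temporal consistency loss of that kind (in the spirit of Ruder et al., not necessarily the paper's definition): the previous stylized frame is backward-warped to the current frame with optical flow and penalized where the two disagree. The optical flow and occlusion mask are assumed to be precomputed.

```python
# Hypothetical temporal ("time") loss sketch; `flow` and `occlusion_mask`
# are assumed inputs, e.g. from an off-the-shelf optical-flow estimator.
import torch
import torch.nn.functional as F

def warp(frame, flow):
    """Backward-warp `frame` (B, C, H, W) with optical flow `flow` (B, 2, H, W)."""
    b, _, h, w = frame.shape
    ys, xs = torch.meshgrid(torch.arange(h, device=frame.device),
                            torch.arange(w, device=frame.device),
                            indexing="ij")
    grid = torch.stack((xs, ys), dim=0).float() + flow   # target pixel coords
    # grid_sample expects coordinates normalized to [-1, 1], ordered (x, y).
    gx = 2.0 * grid[:, 0] / (w - 1) - 1.0
    gy = 2.0 * grid[:, 1] / (h - 1) - 1.0
    return F.grid_sample(frame, torch.stack((gx, gy), dim=-1),
                         align_corners=True)

def temporal_loss(stylized_t, stylized_prev, flow, occlusion_mask):
    """MSE between the current stylized frame and the flow-warped previous
    one, masked to 1 where the flow is reliable and 0 at (dis)occlusions."""
    warped_prev = warp(stylized_prev, flow)
    return (occlusion_mask * (stylized_t - warped_prev) ** 2).mean()
```

Added to the perceptual and style terms during training, a loss of this shape discourages flicker between consecutive stylized frames while leaving occluded, genuinely changing regions (masked out) free to move.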