Neural Style Transfer Using VGG19 and Alexnet

Title: Neural Style Transfer Using VGG19 and Alexnet
Publication Type: Conference Paper
Year of Publication: 2021
Authors: Kavitha, S., Dhanapriya, B., Vignesh, G. Naveen, Baskaran, K.R.
Conference Name: 2021 International Conference on Advancements in Electrical, Electronics, Communication, Computing and Automation (ICAECA)
Keywords: Activation Function, AlexNet, ART, Automation, Computer architecture, Content Loss, convolution, Creativity, Deep Learning, input image, Metrics, neural style transfer, Output styled image, pubcrawl, Relu, resilience, Resiliency, Scalability, Sigmoid, style image, style loss, Tanh, telecommunication computing, Total Loss Convolution Neural Network, VGG19
Abstract: Art is the perfect way for people to express emotions that words cannot. Simply by looking at art, we can understand a person's creativity and thoughts. In former times, artists spent a great deal of time creating an image in a particular style. In the current deep learning era, we can create images in different styles of our choosing within a short period of time. Neural style transfer is the most popular and widely used deep learning application of this kind: it applies a desired style to a content image, generating an output image that combines the style with the content of the original image. In this paper we implement the neural style transfer model with two architectures, namely VGG19 and AlexNet. The paper compares the output styled image and the total loss obtained with the VGG19 and AlexNet architectures. In addition, three different activation functions are used to compare the quality and total loss of the output styled images within the AlexNet architecture.
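The abstract refers to content loss, style loss, and total loss computed from pretrained feature extractors. Below is a minimal sketch of the standard neural style transfer objective (in the style of Gatys et al.), assuming PyTorch and torchvision's pretrained VGG19; the layer choices, loss weights, and optimization settings are illustrative assumptions, not the authors' exact configuration.

```python
# Minimal sketch of standard neural style transfer losses with a pretrained VGG19.
# Layer indices and loss weights are assumed for illustration only.
import torch
import torch.nn.functional as F
from torchvision.models import vgg19, VGG19_Weights

device = "cuda" if torch.cuda.is_available() else "cpu"
features = vgg19(weights=VGG19_Weights.DEFAULT).features.to(device).eval()
for p in features.parameters():
    p.requires_grad_(False)  # the feature extractor stays fixed

STYLE_LAYERS = {0, 5, 10, 19, 28}   # conv layers used for style (assumed choice)
CONTENT_LAYER = 21                  # conv layer used for content (assumed choice)

def extract(x):
    """Run x through VGG19 and collect style and content activations."""
    style_feats, content_feat = [], None
    for i, layer in enumerate(features):
        x = layer(x)
        if i in STYLE_LAYERS:
            style_feats.append(x)
        if i == CONTENT_LAYER:
            content_feat = x
    return style_feats, content_feat

def gram(feat):
    """Gram matrix of a feature map, normalized by its size."""
    b, c, h, w = feat.shape
    f = feat.view(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)

def total_loss(generated, content_img, style_img, alpha=1.0, beta=1e3):
    """Total loss = alpha * content loss + beta * style loss."""
    g_style, g_content = extract(generated)
    _, c_content = extract(content_img)
    s_style, _ = extract(style_img)
    content_loss = F.mse_loss(g_content, c_content)
    style_loss = sum(F.mse_loss(gram(g), gram(s)) for g, s in zip(g_style, s_style))
    return alpha * content_loss + beta * style_loss

# Optimize the generated image directly, starting from the content image.
content = torch.rand(1, 3, 256, 256, device=device)  # placeholder content image
style = torch.rand(1, 3, 256, 256, device=device)    # placeholder style image
generated = content.clone().requires_grad_(True)
optimizer = torch.optim.Adam([generated], lr=0.01)
for step in range(200):
    optimizer.zero_grad()
    loss = total_loss(generated, content, style)
    loss.backward()
    optimizer.step()
```

The same loss structure applies when the feature extractor is swapped for AlexNet (e.g. torchvision's pretrained alexnet), with layer indices chosen from that network instead; the paper's activation-function comparison would further modify the AlexNet variant.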
DOI: 10.1109/ICAECA52838.2021.9675723
Citation Key: kavitha_neural_2021