Biblio
Filters: Keyword is style loss
Neural Style Transfer Using VGG19 and AlexNet. 2021 International Conference on Advancements in Electrical, Electronics, Communication, Computing and Automation (ICAECA). :1–6.
2021. Art is the perfect way for people to express emotions that words cannot. Simply by looking at art, we can understand a person's creativity and thoughts. In former times, artists spent a great deal of time creating images in varied styles; in the current deep learning era, we can create images in different styles of our choosing within a short period of time. Neural style transfer is a popular and widely used deep learning application that applies a desired style to a content image, generating an output image that combines the style with the content of the original image. In this paper we implement the neural style transfer model with two architectures, VGG19 and AlexNet, and compare the output styled image and the total loss obtained with each. In addition, three different activation functions are used to compare the quality and total loss of output styled images within the AlexNet architecture.
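As a minimal sketch of the total loss such comparisons rest on (a content term plus a Gram-matrix style term over VGG19 features, in the style of Gatys et al.), the PyTorch snippet below may help; the layer indices, loss weights alpha/beta, and pretrained torchvision weights are illustrative assumptions, not this paper's reported configuration.

```python
# Sketch of neural style transfer total loss with VGG19 features.
# Layer choices and alpha/beta weights are assumptions for illustration.
import torch
import torch.nn.functional as F
from torchvision.models import vgg19, VGG19_Weights

def gram_matrix(features):
    # features: (batch, channels, height, width)
    b, c, h, w = features.shape
    f = features.view(b, c, h * w)
    # Channel-by-channel correlations, normalized by the number of elements.
    return torch.bmm(f, f.transpose(1, 2)) / (c * h * w)

vgg = vgg19(weights=VGG19_Weights.DEFAULT).features.eval()
for p in vgg.parameters():
    p.requires_grad_(False)

style_layers = {0, 5, 10, 19, 28}  # conv1_1 .. conv5_1 (assumed choice)
content_layer = 21                 # conv4_2 (assumed choice)

def extract(img):
    styles, content = [], None
    x = img
    for i, layer in enumerate(vgg):
        x = layer(x)
        if i in style_layers:
            styles.append(x)
        if i == content_layer:
            content = x
    return styles, content

def total_loss(generated, content_img, style_img, alpha=1.0, beta=1e4):
    g_styles, g_content = extract(generated)
    _, c_content = extract(content_img)
    s_styles, _ = extract(style_img)
    content_loss = F.mse_loss(g_content, c_content)
    style_loss = sum(F.mse_loss(gram_matrix(g), gram_matrix(s))
                     for g, s in zip(g_styles, s_styles))
    return alpha * content_loss + beta * style_loss
```

In practice the generated image is initialized (e.g. from the content image) and optimized directly by gradient descent on this loss; swapping the VGG19 feature extractor for AlexNet's convolutional features gives the paper's second configuration.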
Neural Style Transfer for Picture with Gradient Gram Matrix Description. 2020 39th Chinese Control Conference (CCC). :7026–7030.
2020. Despite the high performance of neural style transfer on stylized pictures, we found that the algorithm of Gatys et al. [1] cannot perfectly reconstruct texture style: output stylized pictures can exhibit unsatisfying, unexpected textures such as muddiness in local areas and insufficient grain expression. Our method builds on the original algorithm, adding a Gradient Gram description to the style loss, aiming to strengthen texture expression and eliminate muddiness. To some extent our method lengthens the runtime; however, its output stylized pictures achieve higher performance on texture details, especially in the elimination of muddiness.
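The abstract does not spell out the Gradient Gram description; the PyTorch sketch below shows one plausible reading, in which finite-difference gradients of each feature map get their own Gram matrices, matched alongside the usual ones. The gradient operator, normalization, and weighting here are assumptions, not the authors' exact formulation.

```python
# Hypothetical Gradient Gram style term: match Gram matrices of the
# finite-difference gradients of feature maps, in addition to the plain
# Gram loss. Intended to sharpen grain and suppress local muddiness.
import torch
import torch.nn.functional as F

def gradient_gram(features):
    # Horizontal and vertical finite differences of the feature maps.
    dx = features[:, :, :, 1:] - features[:, :, :, :-1]
    dy = features[:, :, 1:, :] - features[:, :, :-1, :]
    def gram(f):
        b, c, h, w = f.shape
        v = f.view(b, c, h * w)
        return torch.bmm(v, v.transpose(1, 2)) / (c * h * w)
    return gram(dx), gram(dy)

def gradient_gram_loss(generated_feats, style_feats, weight=1.0):
    # generated_feats / style_feats: lists of feature maps from the
    # chosen style layers; weight is an assumed balancing factor.
    loss = 0.0
    for g, s in zip(generated_feats, style_feats):
        gx, gy = gradient_gram(g)
        sx, sy = gradient_gram(s)
        loss = loss + F.mse_loss(gx, sx) + F.mse_loss(gy, sy)
    return weight * loss
```

Under this reading, the extra term is simply added to the standard Gram-matrix style loss, which would account for the longer runtime the abstract reports.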