Monet-Style Images Generation Using Recurrent Neural Networks



Y. Zhao and D. Xu

1 School of Information, Yunnan University, Kunming 650091, Yunnan, China
[email protected], [email protected]
2 School of Computer and Information, Southwest Forestry University, Kunming 650224, Yunnan, China

Abstract. An automatic Monet-style image generation method using a long short term memory recurrent neural network is proposed in this paper. The method shows that a long short term memory recurrent neural network can properly learn the structure and characteristics of Monet's paintings by demonstrating its ability to generate impressionism-style images. With Monet's paintings as input, images of a similar style can be constructed iteratively using the proposed method. The experimental results indicate that the trained recurrent neural networks were able to generate Monet-style images with a small amount of training data.

Keywords: Recurrent neural network · Long short term memory · Non-photorealistic rendering · Style transfer

1 Introduction

Algorithmic painting-style image generation is a challenging task that has been actively explored in the field of non-photorealistic rendering. Many common methods for algorithmic painting-style generation consist of constructing carefully engineered painting-style features and rely on simple generation schemes, such as Markov models or graph-based energy minimization techniques. While these approaches are sometimes able to produce interesting compositions, the resulting pieces usually consist of repetitive sequences and lack the painting-style structures that are common in most artworks. With the increase in computational resources and recent advancements in recurrent neural network (RNN) architectures, novel generation methods may now be practical for large-scale painting-style generation. The most common recurrent neural network used for modeling long-term dependencies is the long short term memory (LSTM) network, introduced by Hochreiter and Schmidhuber [1]. LSTM is an RNN architecture designed to be better at storing and accessing information than a standard RNN, and it has recently given state-of-the-art results in a variety of sequence processing tasks, including text generation [2], speech recognition [3] and handwriting recognition [4]. The main goal of this paper is to demonstrate that an RNN can use its memory to generate complex, realistic sequences containing long-range structure. More specifically, an LSTM network is used for the task of automatic Monet-style image generation.
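This excerpt gives no implementation details, so the following is only a minimal sketch, in PyTorch, of the general idea of treating a painting as a sequence of discrete pixel intensities, training an LSTM to predict the next pixel, and sampling new pixels iteratively. The class and function names, layer sizes, and the choice of PyTorch are assumptions for illustration, not the authors' implementation.

import torch
import torch.nn as nn

class PixelLSTM(nn.Module):
    # Hypothetical next-pixel model (an assumption, not the authors' architecture):
    # predicts a distribution over the next pixel intensity given the pixels so far.
    def __init__(self, levels=256, embed_dim=64, hidden_dim=512, num_layers=2):
        super().__init__()
        self.embed = nn.Embedding(levels, embed_dim)                       # discrete intensity -> vector
        self.lstm = nn.LSTM(embed_dim, hidden_dim, num_layers, batch_first=True)
        self.head = nn.Linear(hidden_dim, levels)                          # logits over next intensity

    def forward(self, seq, state=None):
        out, state = self.lstm(self.embed(seq), state)
        return self.head(out), state

def generate(model, num_pixels, start_value=0):
    # Iteratively sample an image as a flat pixel sequence, one pixel at a time.
    model.eval()
    pixel = torch.tensor([[start_value]])
    state, samples = None, []
    with torch.no_grad():
        for _ in range(num_pixels):
            logits, state = model(pixel, state)
            probs = torch.softmax(logits[:, -1], dim=-1)
            pixel = torch.multinomial(probs, num_samples=1)                # stochastic next pixel
            samples.append(pixel.item())
    return samples                                                         # reshape to (height, width) downstream

# Training would minimize cross-entropy between predicted and actual next pixels
# over flattened Monet paintings; data loading is omitted here.

In this sketch a grayscale image is flattened row by row; handling color channels or richer two-dimensional context would require extending the sequence encoding.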


2 Related Work

In this paper, we try to render new images with a given style, a problem that is usually approached in the branch of computer vision called non-photorealistic rendering (NPR) [5]. Conceptually most closely related are methods that use texture transfer to achieve artistic style transfer [6, 7]. These previous approaches mainly rely on non-parametric techniques to directly manipulate the pixel