User-Controllable Multi-Texture Synthesis with Generative Adversarial Networks
Aibek Alanov, Max Kochurov, Denis Volkhonskiy, Daniil Yashkov, Evgeny Burnaev, Dmitry Vetrov
We propose a novel multi-texture synthesis model based on generative adversarial networks (GANs) with a user-controllable mechanism. This control allows the user to explicitly specify which texture the model should generate. The property follows from an encoder that learns a latent representation for each texture in the dataset. To ensure coverage of the dataset, we use an adversarial loss function that penalizes incorrect reproductions of a given texture. In experiments, we show that our model can learn descriptive texture manifolds for large datasets and from raw data such as a collection of high-resolution photos. Moreover, we apply our method to produce 3D textures and show that it outperforms existing baselines.
Apr-24-2019
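The abstract describes an encoder that maps each texture to a latent code and a generator that synthesizes the texture specified by that code. As a rough illustration only, the sketch below (PyTorch, not the authors' released code) shows one way such a conditional texture GAN could be wired: the module names, layer sizes, and the discriminator's joint (image, code) scoring are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch (assumed architecture, not the paper's implementation):
# an encoder E maps an example texture to a latent code, and a generator G
# synthesizes new samples of that texture from noise conditioned on the code.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, latent_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, latent_dim),
        )

    def forward(self, texture):          # texture: (B, 3, H, W)
        return self.net(texture)         # latent code: (B, latent_dim)

class Generator(nn.Module):
    def __init__(self, latent_dim=64, noise_dim=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.ConvTranspose2d(latent_dim + noise_dim, 128, 4), nn.ReLU(),
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1), nn.Tanh(),
        )

    def forward(self, z, code):
        # Concatenate noise with the texture code, so the user can choose
        # which texture to synthesize by supplying its latent code.
        h = torch.cat([z, code], dim=1)[:, :, None, None]
        return self.net(h)

class Discriminator(nn.Module):
    def __init__(self, latent_dim=64):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Scores realism jointly with the code, standing in for a loss that
        # penalizes reproducing a texture other than the requested one.
        self.score = nn.Linear(128 + latent_dim, 1)

    def forward(self, image, code):
        return self.score(torch.cat([self.features(image), code], dim=1))

if __name__ == "__main__":
    E, G, D = Encoder(), Generator(), Discriminator()
    texture = torch.randn(4, 3, 16, 16)   # example textures from a dataset
    code = E(texture)                     # user-controllable latent code
    z = torch.randn(4, 32)
    fake = G(z, code)                     # new samples of the chosen texture
    print(fake.shape, D(fake, code).shape)
```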