Arbitrary-Scale Texture Generation From Coarse-Grained Control

2022 
Existing deep-network-based texture synthesis approaches focus on fine-grained control of texture generation by synthesizing images from exemplars. Since the networks employed by most of these methods are tied to individual exemplar textures, a large number of separate networks must be trained to model a variety of textures. In this paper, we propose to generate textures directly from coarse-grained control or high-level guidance, such as texture categories, perceptual attributes, and semantic descriptions. We cast the generation process as a three-level Bayesian hierarchical model: a coarse-grained signal first determines a distribution over Markov random fields; a Markov random field sampled from this distribution then models the distribution of output textures; finally, an output texture is sampled from that Markov random field. At the bottom level of the hierarchy, the isotropic and ergodic characteristics of textures favor a fully convolutional network. The proposed method integrates texture creation and texture synthesis into a single pipeline for real-time texture generation, and enables users to readily obtain diverse textures at arbitrary scales from high-level guidance alone. Extensive experiments demonstrate that the proposed method generates plausible textures faithful to user-defined control, and achieves impressive texture metamorphosis by interpolation in the learned texture manifold.
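
The abstract outlines a three-level sampling pipeline. The minimal PyTorch sketch below illustrates one plausible reading of that hierarchy; it is not the authors' implementation, and all module names, layer sizes, and the Gaussian parameterization of the distribution over texture models are illustrative assumptions.

    # Sketch (not the authors' code) of the three-level hierarchy:
    # a coarse-grained control vector parameterizes a distribution over
    # texture models, one model is sampled from it, and a fully
    # convolutional generator turns noise of any spatial size into a
    # texture. All names and dimensions here are assumptions.

    import torch
    import torch.nn as nn

    class ControlToModelDistribution(nn.Module):
        """Level 1: map a coarse-grained control vector (e.g. a category
        embedding or perceptual-attribute vector) to the mean and
        log-variance of a Gaussian over latent texture-model parameters."""
        def __init__(self, control_dim=64, model_dim=128):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(control_dim, 256), nn.ReLU(),
                nn.Linear(256, 2 * model_dim),
            )

        def forward(self, control):
            mu, logvar = self.net(control).chunk(2, dim=-1)
            return mu, logvar

    class FullyConvGenerator(nn.Module):
        """Level 3: a fully convolutional network, so the output texture
        can have arbitrary spatial size; the sampled model parameters
        condition every location identically, matching the stationary
        (ergodic) character of textures."""
        def __init__(self, model_dim=128, noise_ch=8, hidden=64):
            super().__init__()
            self.noise_ch = noise_ch
            self.net = nn.Sequential(
                nn.Conv2d(noise_ch + model_dim, hidden, 3, padding=1), nn.ReLU(),
                nn.Conv2d(hidden, hidden, 3, padding=1), nn.ReLU(),
                nn.Conv2d(hidden, 3, 3, padding=1), nn.Tanh(),
            )

        def forward(self, z_model, height, width):
            b = z_model.shape[0]
            noise = torch.randn(b, self.noise_ch, height, width)  # per-pixel noise field
            cond = z_model[:, :, None, None].expand(-1, -1, height, width)
            return self.net(torch.cat([noise, cond], dim=1))

    # Usage: sample a texture model for a control signal, render at any scale.
    prior = ControlToModelDistribution()
    generator = FullyConvGenerator()
    control = torch.randn(1, 64)                                # stand-in coarse control
    mu, logvar = prior(control)
    z_model = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # Level 2: sample a model
    texture = generator(z_model, 256, 384)                      # arbitrary H x W output

Because the generator is fully convolutional and conditioned identically at every spatial location, the same sampled z_model can be rendered at any height and width; interpolating z_model between two sampled models gives a rough analogue of the texture metamorphosis described above.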