Generating Non-Stationary Textures using Self-Rectification

Zhou, Yang, Xiao, Rongjun, Lischinski, Dani, Cohen-Or, Daniel, Huang, Hui

arXiv.org Artificial Intelligence

This paper addresses the challenge of example-based non-stationary texture synthesis. We introduce a novel two-step approach in which users first modify a reference texture using standard image editing tools, yielding an initial rough target for the synthesis. Subsequently, our proposed method, termed "self-rectification", automatically refines this target into a coherent, seamless texture while faithfully preserving the distinct visual characteristics of the reference exemplar. Our method leverages a pre-trained diffusion network and its self-attention mechanisms to gradually align the synthesized texture with the reference, ensuring the retention of the structures in the provided target. Through experimental validation, our approach exhibits exceptional proficiency in handling non-stationary textures, demonstrating significant advancements in texture synthesis compared to existing state-of-the-art techniques. Code is available at https://github.com/xiaorongjun000/Self-Rectification
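The core idea of aligning a rough target with a reference via attention can be illustrated with a minimal sketch: queries are computed from the target's features while keys and values come from the reference exemplar, so each target location is rewritten as a blend of reference appearance. This is a conceptual illustration only, not the authors' implementation; the function name and toy feature shapes are assumptions, and the real method operates inside a pre-trained diffusion network's self-attention layers.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_image_attention(target_feats, ref_feats):
    """Attend from target queries to reference keys/values.

    target_feats: (n, d) features of the rough edited target
    ref_feats:    (m, d) features of the reference exemplar
    Returns (n, d): each target feature replaced by an attention-weighted
    blend of reference features, pulling the target's appearance toward
    the exemplar while the query side preserves the target's structure.
    (Toy sketch: real methods do this on diffusion U-Net features.)
    """
    d = target_feats.shape[-1]
    q, k, v = target_feats, ref_feats, ref_feats
    attn = softmax(q @ k.T / np.sqrt(d), axis=-1)  # (n, m) weights
    return attn @ v

rng = np.random.default_rng(0)
tgt = rng.standard_normal((4, 8))   # 4 target locations, 8-dim features
ref = rng.standard_normal((6, 8))   # 6 reference locations
out = cross_image_attention(tgt, ref)
print(out.shape)  # (4, 8)
```

Note the asymmetry of the design: only keys and values are swapped for reference features, which is why structure (encoded in the queries) survives while texture detail is transferred.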


Artificial Intelligence Helps Designers Create Virtual Textures

#artificialintelligence

Researchers have created a new tool that could aid designers of video games, virtual reality, and animation in making more realistic virtual textures. An international team of computer scientists is using an artificial intelligence technique called generative adversarial networks (GANs) to train a network to expand small textures into larger ones that still resemble the original sample. "Our approach successfully deals with non-stationary textures without any high level or semantic description of the large-scale structure," Yang Zhou, lead author of the work and an assistant professor at Shenzhen University and Huazhong University of Science & Technology, said in a statement. "It can cope with very challenging textures, which, to our knowledge, no other existing method can handle. The results are realistic designs produced in high-resolution, efficiently, and at a much larger scale."
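The expansion idea described above can be trained self-supervised: sample a large crop from the exemplar as the target and its central smaller crop as the generator's input, so the network learns how local patterns grow into larger-scale structure. The sketch below shows only this crop-pair sampling step; the function name is an assumption, and the GAN generator/discriminator themselves are omitted.

```python
import numpy as np

def sample_training_pair(texture, k, rng):
    """Sample a (source, target) crop pair from one texture exemplar.

    A k x k source crop is the generator input; the 2k x 2k target crop
    containing it is the ground truth the expanded output is compared
    against (and shown to the discriminator during GAN training).
    """
    h, w = texture.shape[:2]
    top = rng.integers(0, h - 2 * k + 1)
    left = rng.integers(0, w - 2 * k + 1)
    target = texture[top:top + 2 * k, left:left + 2 * k]
    # The source is the central k x k region of the target crop.
    source = target[k // 2:k // 2 + k, k // 2:k // 2 + k]
    return source, target

rng = np.random.default_rng(1)
tex = rng.random((64, 64))          # stand-in for a texture exemplar
src, tgt = sample_training_pair(tex, 16, rng)
print(src.shape, tgt.shape)  # (16, 16) (32, 32)
```

Because both crops come from the same exemplar, no labels or semantic annotations are needed, which matches the quoted claim that no high-level description of the large-scale structure is required.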