TextureGAN: Controlling Deep Image Synthesis with Texture Patches
Wenqi Xian; Patsorn Sangkloy; Varun Agrawal; Amit Raj; Jingwan Lu; Chen Fang; Fisher Yu; James Hays

Abstract
In this paper, we investigate deep image synthesis guided by sketch, color, and texture. Previous image synthesis methods can be controlled by sketch and color strokes, but we are the first to examine texture control. We allow a user to place a texture patch on a sketch at arbitrary locations and scales to control the desired output texture. Our generative network learns to synthesize objects consistent with these texture suggestions. To achieve this, we develop a local texture loss in addition to adversarial and content losses to train the generative network. We conduct experiments using sketches generated from real images and textures sampled from a separate texture database, and the results show that our proposed algorithm is able to generate plausible images that are faithful to user controls. Ablation studies show that our proposed pipeline generates more realistic images than directly adapting existing methods.
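To make the training objective concrete, the sketch below shows one plausible way to combine the three loss terms named in the abstract (adversarial, content, and local texture) in PyTorch. This is a hedged illustration, not the authors' implementation: the Gram-matrix texture statistic, the VGG-style feature extractor, the loss weights, and all function names here are assumptions introduced for exposition.

```python
# Hypothetical sketch of a combined objective with adversarial, content,
# and local texture terms. Not the paper's actual code; all names and
# weights are placeholder assumptions.
import torch
import torch.nn.functional as F

def gram_matrix(feat):
    # Channel-wise Gram matrix, a common texture descriptor.
    b, c, h, w = feat.shape
    f = feat.view(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)

def total_loss(fake, real, patch_fake, patch_real, d_fake_logits,
               features, w_adv=1.0, w_content=1.0, w_tex=1.0):
    # Adversarial term: push the generator to make the discriminator
    # score its outputs as real.
    adv = F.binary_cross_entropy_with_logits(
        d_fake_logits, torch.ones_like(d_fake_logits))
    # Content term: feature-space distance between output and ground truth
    # (features() stands in for a pretrained feature extractor).
    content = F.l1_loss(features(fake), features(real))
    # Local texture term: match texture statistics between the
    # user-supplied texture patch and the corresponding output region.
    tex = F.mse_loss(gram_matrix(features(patch_fake)),
                     gram_matrix(features(patch_real)))
    return w_adv * adv + w_content * content + w_tex * tex
```

The key design point the abstract implies is locality: the texture term is computed only on patch-sized crops around the user's texture suggestion, so the network is rewarded for honoring texture controls where they are placed rather than globally.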
Benchmarks
| Benchmark | Methodology | FID | LPIPS |
|---|---|---|---|
| image-reconstruction-on-edge-to-handbags | Xian et al. | 60.848 | 0.171 |
| image-reconstruction-on-edge-to-shoes | Xian et al. | 44.762 | 0.124 |