MSEmbGAN: Multi-Stitch Embroidery Synthesis using Region-Aware Texture Generation


Xinrong Hu, Chen Yang, Fei Fang∗, Jin Huang, Ping Li, Member, IEEE, Bin Sheng, Member, IEEE, and Tong-Yee Lee, Senior Member, IEEE



Abstract

Convolutional neural networks (CNNs) are widely used to synthesize embroidery features from images. However, existing methods cannot predict diverse stitch types, which prevents them from extracting stitch features effectively. In this paper, we propose a multi-stitch embroidery generative adversarial network (MSEmbGAN) that uses a region-aware texture generation sub-network to predict diverse embroidery features from images. To the best of our knowledge, ours is the first CNN-based generative adversarial network to succeed at this task. The region-aware texture generation sub-network detects multiple regions in the input image with a stitch classifier and generates a stitch texture for each region based on its shape features. We also propose a colorization network with a color feature extractor, which enforces full-image color consistency by requiring the color attributes of the output to closely match those of the input image. Because labeled embroidery image datasets are currently scarce, we provide a new multi-stitch embroidery dataset annotated with three single-stitch types and one multi-stitch type. The dataset comprises more than 30K high-quality multi-stitch embroidery images, more than 13K aligned content-embroidery image pairs, and more than 17K unaligned images, making it, as far as we know, the largest embroidery dataset currently available. Quantitative and qualitative experiments, including a user study, show that MSEmbGAN outperforms state-of-the-art embroidery synthesis and style-transfer methods on all evaluation metrics.
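The abstract outlines a two-stage pipeline: a region-aware texture generation sub-network that classifies stitch regions and textures each one, followed by a color-consistency constraint tying the output's color attributes to the input. The sketch below illustrates that structure in PyTorch. It is a minimal illustration only: every module name, layer size, the number of stitch classes, and the color-consistency loss are hypothetical stand-ins of ours, not the architecture from the paper.

import torch
import torch.nn as nn

NUM_STITCH_TYPES = 3  # assumption, matching the three annotated single-stitch types


class StitchClassifier(nn.Module):
    # Predicts soft per-pixel region masks, one channel per stitch type.
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, NUM_STITCH_TYPES, 1),
        )

    def forward(self, x):
        return self.net(x).softmax(dim=1)  # (B, K, H, W) region masks


class TextureGenerator(nn.Module):
    # Synthesizes a stitch texture for one region from the masked input.
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 3, 3, padding=1), nn.Tanh(),
        )

    def forward(self, masked_x):
        return self.net(masked_x)


class RegionAwareTextureNet(nn.Module):
    # Region-aware texture generation: classify regions, texture each, recompose.
    def __init__(self):
        super().__init__()
        self.classifier = StitchClassifier()
        self.generators = nn.ModuleList(
            [TextureGenerator() for _ in range(NUM_STITCH_TYPES)]
        )

    def forward(self, x):
        masks = self.classifier(x)
        out = torch.zeros_like(x)
        for k, gen in enumerate(self.generators):
            m = masks[:, k : k + 1]      # soft mask for stitch type k
            out = out + m * gen(x * m)   # texture the region, then recompose
        return out, masks


def color_consistency_loss(output, content):
    # Hypothetical stand-in for the color feature extractor's constraint:
    # match per-channel mean and standard deviation of output and input.
    mu_o, mu_c = output.mean(dim=(2, 3)), content.mean(dim=(2, 3))
    sd_o, sd_c = output.std(dim=(2, 3)), content.std(dim=(2, 3))
    return (mu_o - mu_c).abs().mean() + (sd_o - sd_c).abs().mean()


# Example usage with a random image in place of a real content image.
x = torch.rand(1, 3, 256, 256)
net = RegionAwareTextureNet()
y, masks = net(x)
loss = color_consistency_loss(y, x)

In this reading, the soft region masks from the stitch classifier let each per-stitch generator specialize on its own texture, while the weighted recomposition keeps the output aligned with the input's layout.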




Introduction video
Video download (.mp4, 121 MB)
Related works comparison
Quantitative evaluation
User study
We provide some input examples below: