DeepIron: Predicting Unwarped Garment Texture from a Single Image

Computer Graphics Forum (Proc. Eurographics) 2024
Hyun-Song Kwon and Sung-Hee Lee

Abstract

Realistic reconstruction of 3D clothing from an image has wide applications, such as avatar creation and virtual try-on. This paper presents a novel framework that reconstructs the texture map for a 3D garment from a single image of the garment worn in a pose. Since 3D garments are effectively modeled by stitching 2D garment sewing patterns, our specific goal is to generate a texture image for the sewing patterns. A key component of our framework, the Texture Unwarper, infers the original texture image from the input garment image, in which the garment is warped and occluded by the user's body shape and pose. This is achieved by translating the input image to the output image through a mapping between the latent spaces of the two images. By inferring the unwarped original texture of the input garment, our method helps reconstruct 3D garment models whose high-quality textures deform realistically for new poses. We validate the effectiveness of our approach through comparisons with other methods and ablation studies.
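To make the encode–map–decode idea in the abstract concrete, the sketch below shows the general shape of such a pipeline: encode the posed (warped) garment image into a latent code, translate that code into the latent space of flat textures, and decode the result. This is a minimal illustration only, not the authors' actual Texture Unwarper; all class names, layer sizes, and the use of untrained random linear layers are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def linear(in_dim, out_dim):
    """Random linear layer (weights, bias) standing in for a trained network."""
    return rng.standard_normal((in_dim, out_dim)) * 0.01, np.zeros(out_dim)

class TextureUnwarperSketch:
    """Hypothetical sketch: encode a posed-garment image, map its latent code
    into the latent space of unwarped textures, then decode the flat texture."""

    def __init__(self, img_dim=64 * 64 * 3, latent_dim=128):
        self.enc_w, self.enc_b = linear(img_dim, latent_dim)     # encoder for the warped image
        self.map_w, self.map_b = linear(latent_dim, latent_dim)  # mapping between the two latent spaces
        self.dec_w, self.dec_b = linear(latent_dim, img_dim)     # decoder into texture space

    def forward(self, garment_image):
        x = garment_image.reshape(-1)                            # flatten H x W x C
        z_warped = np.tanh(x @ self.enc_w + self.enc_b)          # latent code of the posed image
        z_flat = np.tanh(z_warped @ self.map_w + self.map_b)     # translated latent code
        texture = z_flat @ self.dec_w + self.dec_b               # unwarped texture estimate
        return texture.reshape(garment_image.shape)

model = TextureUnwarperSketch()
img = rng.random((64, 64, 3))   # stand-in for a single posed-garment image
tex = model.forward(img)
print(tex.shape)  # (64, 64, 3)
```

In a real system each stage would be a trained convolutional network and the output would be laid out over the sewing-pattern UV space rather than the input image plane; the sketch only conveys the latent-space translation structure described in the abstract.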

Download

[PAPER] – 27 MB

Fast Forward Video