Method for Generating Panoramic Textures for 3D Face Reconstruction Based on the 3D Morphable Model (Open Access)
- Authors
- Hao, Shujia; Wen, Mingyun; Cho, Kyungeun
- Issue Date
- Oct-2022
- Publisher
- MDPI
- Keywords
- panoramic texture generation; deep learning; adversarial learning; image translation
- Citation
- Applied Sciences, v.12, no.19, pp. 1-19
- Pages
- 19
- Indexed
- SCIE; SCOPUS
- Journal Title
- Applied Sciences
- Volume
- 12
- Number
- 19
- Start Page
- 1
- End Page
- 19
- URI
- https://scholarworks.dongguk.edu/handle/sw.dongguk/2502
- DOI
- 10.3390/app121910020
- ISSN
- 2076-3417
- Abstract
- Three-dimensional (3D) reconstruction techniques are playing an increasingly important role in education and entertainment. Realistic and recognizable avatars can enhance the immersion and interactivity of virtual systems. In 3D face modeling, the face texture carries vital face recognition information. Therefore, this study proposes a panoramic 3D face texture generation method for 3D face reconstruction from a single 2D face image, based on a 3D Morphable Model (3DMM). Realistic and comprehensive panoramic facial textures are obtained by using generative networks as texture converters. Furthermore, we propose a low-cost data-collection method for building face texture datasets. Experimental results show that the proposed method can generate panoramic face textures for 3D face meshes from a single input image, yielding textured 3D models that look realistic from different viewpoints.
- Files in This Item
- There are no files associated with this item.
- Appears in Collections
- College of Advanced Convergence Engineering > Department of Computer Science and Artificial Intelligence > 1. Journal Articles

Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.