Detailed Information

Cited 0 times in Web of Science · Cited 1 time in Scopus

Textured Mesh Generation Using Multi-View and Multi-Source Supervision and Generative Adversarial Networks

Full metadata record
dc.contributor.author: Wen, Mingyun
dc.contributor.author: Park, Jisun
dc.contributor.author: Cho, Kyungeun
dc.date.accessioned: 2023-04-27T15:40:31Z
dc.date.available: 2023-04-27T15:40:31Z
dc.date.issued: 2021-11
dc.identifier.issn: 2072-4292
dc.identifier.uri: https://scholarworks.dongguk.edu/handle/sw.dongguk/4254
dc.description.abstract: This study focuses on reconstructing accurate meshes with high-resolution textures from single images. The reconstruction process involves two networks: a mesh-reconstruction network and a texture-reconstruction network. The mesh-reconstruction network estimates a deformation map, which is used to deform a template mesh to the shape of the target object in the input image, and a low-resolution texture. We propose reconstructing a mesh with a high-resolution texture by enhancing the low-resolution texture with a super-resolution method. The texture-reconstruction network is structured as a generative adversarial network, comprising a generator and a discriminator. During training of the texture-reconstruction network, the discriminator must focus on learning high-quality texture prediction and ignore differences between the generated mesh and the actual mesh. To achieve this, we use meshes reconstructed by the mesh-reconstruction network, together with textures generated through inverse rendering, to produce pseudo-ground-truth images. Experiments on the 3D-Future dataset show that our approach generates better three-dimensional (3D) textured meshes than existing methods, both quantitatively and qualitatively. Additionally, our approach significantly improves the texture of the output image.
dc.language: English
dc.language.iso: ENG
dc.publisher: MDPI
dc.title: Textured Mesh Generation Using Multi-View and Multi-Source Supervision and Generative Adversarial Networks
dc.type: Article
dc.publisher.location: Switzerland
dc.identifier.doi: 10.3390/rs13214254
dc.identifier.scopusid: 2-s2.0-85118145666
dc.identifier.wosid: 000721147100001
dc.identifier.bibliographicCitation: REMOTE SENSING, v.13, no.21
dc.citation.title: REMOTE SENSING
dc.citation.volume: 13
dc.citation.number: 21
dc.type.docType: Article
dc.description.isOpenAccess: Y
dc.description.journalRegisteredClass: scie
dc.description.journalRegisteredClass: scopus
dc.relation.journalResearchArea: Environmental Sciences & Ecology
dc.relation.journalResearchArea: Geology
dc.relation.journalResearchArea: Remote Sensing
dc.relation.journalResearchArea: Imaging Science & Photographic Technology
dc.relation.journalWebOfScienceCategory: Environmental Sciences
dc.relation.journalWebOfScienceCategory: Geosciences, Multidisciplinary
dc.relation.journalWebOfScienceCategory: Remote Sensing
dc.relation.journalWebOfScienceCategory: Imaging Science & Photographic Technology
dc.subject.keywordAuthor: single image textured mesh reconstruction
dc.subject.keywordAuthor: convolutional neural networks
dc.subject.keywordAuthor: generative adversarial network
dc.subject.keywordAuthor: super-resolution
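The abstract describes a two-stage pipeline: a predicted deformation map displaces a template mesh toward the target shape, and a low-resolution texture is then enhanced by super-resolution. A minimal NumPy sketch of those two operations follows; the function names are illustrative assumptions, and nearest-neighbour upsampling stands in for the learned super-resolution network, so this is not the authors' implementation.

```python
import numpy as np

def deform_template(template_vertices, deformation_map):
    # Mesh-reconstruction stage (sketch): apply per-vertex offsets
    # (the "deformation map") to a template mesh. Both arrays are (V, 3).
    return template_vertices + deformation_map

def upsample_texture(low_res, factor=4):
    # Texture-reconstruction stage (stand-in): nearest-neighbour upsampling
    # of an (H, W, 3) low-resolution texture in place of the learned
    # super-resolution generator.
    return low_res.repeat(factor, axis=0).repeat(factor, axis=1)

# Toy example: a 4-vertex template deformed by constant offsets,
# and a 16x16 texture upsampled to 64x64.
template = np.zeros((4, 3))
offsets = np.full((4, 3), 0.1)
mesh = deform_template(template, offsets)
texture = upsample_texture(np.random.rand(16, 16, 3), factor=4)
```

In the paper both stages are learned networks; here only the data flow (deform, then upsample) is illustrated.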
Files in This Item
There are no files associated with this item.
Appears in Collections
College of Advanced Convergence Engineering > Department of Computer Science and Artificial Intelligence > 1. Journal Articles

Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Cho, Kyung Eun
College of Advanced Convergence Engineering (Department of Computer Science and Artificial Intelligence)
