Detailed Information

Cited 0 times in Web of Science · Cited 1 time in Scopus

Neural Rendering-Based 3D Scene Style Transfer Method via Semantic Understanding Using a Single Style Image (Open Access)

Authors
Park, Jisun; Cho, Kyungeun
Issue Date
Jul-2023
Publisher
MDPI
Keywords
3D style transfer; neural rendering; neural radiance fields; semantic feature matching
Citation
Mathematics, v.11, no.14, pp. 1-18
Pages
18
Indexed
SCIE
SCOPUS
Journal Title
Mathematics
Volume
11
Number
14
Start Page
1
End Page
18
URI
https://scholarworks.dongguk.edu/handle/sw.dongguk/19169
DOI
10.3390/math11143243
ISSN
2227-7390
Abstract
In the rapidly emerging era of "untact" (contact-free) technologies, demand for three-dimensional (3D) virtual environments used in virtual reality (VR), augmented reality (AR), and the metaverse has grown significantly, owing to their wide application across various domains. Current research focuses on automatically transferring the style of rendered images within a 3D virtual environment using artificial intelligence, aiming to minimize human intervention. However, prevalent studies on rendering-based 3D environment style transfer have certain inherent limitations. First, training a style transfer network dedicated to 3D virtual environments demands considerable style image data, and these data must be captured from viewpoints closely resembling those of the virtual environment. Second, the resulting 3D structures show noticeable inconsistency, because predominant studies neglect 3D scene geometry and rely solely on 2D input image features. Finally, style adaptation fails to accommodate the unique characteristics of each object. To address these issues, we propose a novel neural rendering-based 3D scene style conversion technique. The method employs semantic nearest-neighbor feature matching, enabling style transfer within a 3D scene that respects the distinctive characteristics of each object, even when only a single style image is available. A neural radiance field allows the network to capture the geometric information of a 3D scene with respect to its viewpoint; the network then transfers style features from the single style image via semantic nearest-neighbor feature matching. Empirically, the proposed semantic 3D scene style transfer method was applied to style transfer for both interior and exterior environments, using the Replica, 3D-FRONT, and Tanks and Temples datasets for testing.
The results show that the proposed method surpasses existing style transfer techniques in maintaining 3D viewpoint consistency, style uniformity, and semantic coherence.
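The abstract's core operation, nearest-neighbor feature matching between content and style features, can be illustrated in general terms. The sketch below is not the authors' implementation (their pipeline, semantic masks, and feature extractor are not detailed here); it is a minimal generic example, assuming content and style features have already been extracted as row vectors, that replaces each content feature with its cosine-nearest style feature.

```python
import numpy as np

def nearest_neighbor_feature_match(content_feats, style_feats):
    """Replace each content feature with its cosine-nearest style feature.

    content_feats: (N, D) array of content feature vectors
    style_feats:   (M, D) array of style feature vectors
    Returns an (N, D) array of matched style features.
    """
    # L2-normalize rows so a dot product equals cosine similarity
    c = content_feats / np.linalg.norm(content_feats, axis=1, keepdims=True)
    s = style_feats / np.linalg.norm(style_feats, axis=1, keepdims=True)
    sim = c @ s.T            # (N, M) cosine-similarity matrix
    nn = sim.argmax(axis=1)  # index of the best-matching style feature per row
    return style_feats[nn]

# Tiny example: each content direction picks the style vector pointing the same way.
content = np.array([[1.0, 0.0], [0.0, 1.0]])
style = np.array([[0.0, 2.0], [3.0, 0.0]])
matched = nearest_neighbor_feature_match(content, style)
# matched rows are the un-normalized style vectors nearest in direction
```

In a semantic variant, the matching would additionally be restricted to style features whose semantic label matches the content feature's label, which is the distinction the abstract draws.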
Files in This Item
There are no files associated with this item.
Appears in Collections
College of Advanced Convergence Engineering > Department of Computer Science and Artificial Intelligence > 1. Journal Articles
Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Cho, Kyung Eun
College of Advanced Convergence Engineering (Department of Computer Science and Artificial Intelligence)