Dual-domain Deep Convolutional Neural Networks for Image Demoireing
- Authors
- Vien, An Gia; Park, Hyunkook; Lee, Chul
- Issue Date
- Jun-2020
- Publisher
- IEEE COMPUTER SOC
- Citation
- 2020 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION WORKSHOPS (CVPRW 2020), v.2020-June, pp. 1934-1942
- Pages
- 9
- Indexed
- SCOPUS
- Journal Title
- 2020 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION WORKSHOPS (CVPRW 2020)
- Volume
- 2020-June
- Start Page
- 1934
- End Page
- 1942
- URI
- https://scholarworks.dongguk.edu/handle/sw.dongguk/7199
- DOI
- 10.1109/CVPRW50498.2020.00243
- ISSN
- 2160-7508
- Abstract
- We develop deep convolutional neural networks (CNNs) for moire artifact removal that exploit the complex properties of moire patterns in two complementary domains, i.e., the pixel and frequency domains. In the pixel domain, we employ multi-resolution feature maps to remove the moire artifacts associated with specific frequency bands. In the frequency domain, we design a network that processes discrete cosine transform (DCT) coefficients to remove moire artifacts. Next, we develop a dynamic filter generation network that learns dynamic blending filters. Finally, the outputs of the pixel- and frequency-domain networks are combined using the blending filters to yield moire-free images. In addition, we extend the proposed approach to burst image demoireing with an arbitrary number of input images. Specifically, we develop a new attention network that effectively extracts useful information from each image in the burst and aligns it with the reference image. We demonstrate the effectiveness of the proposed demoireing algorithm by evaluating it on the test sets of the NTIRE 2020 Demoireing Challenge: Track 1 (Single Image) and Track 2 (Burst).
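The final step of the pipeline described in the abstract, combining the pixel-domain and frequency-domain restorations with learned dynamic blending filters, can be illustrated with a minimal sketch. This is not the paper's implementation: the per-pixel softmax weighting below, the function name `blend_dual_domain`, and the toy inputs are all illustrative assumptions standing in for the outputs of the paper's networks and its learned filters.

```python
import numpy as np

def blend_dual_domain(pixel_out, freq_out, logits):
    """Blend pixel- and frequency-domain restorations using per-pixel
    softmax weights - a simplified stand-in for the learned dynamic
    blending filters described in the abstract.

    pixel_out, freq_out : (H, W) restored images from the two branches
    logits              : (H, W, 2) raw scores from a hypothetical
                          filter-generation network
    """
    # Numerically stable softmax over the last (branch) axis
    w = np.exp(logits - logits.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)
    # Convex per-pixel combination of the two branch outputs
    return w[..., 0] * pixel_out + w[..., 1] * freq_out

# Toy usage: constant images, equal logits -> an even 50/50 blend
H, W = 4, 4
pixel_out = np.zeros((H, W))
freq_out = np.ones((H, W))
logits = np.zeros((H, W, 2))
blended = blend_dual_domain(pixel_out, freq_out, logits)
print(blended[0, 0])  # -> 0.5
```

In the paper the blending weights are predicted per image by a dedicated network, so the combination adapts to the local structure of the moire pattern; the softmax here merely guarantees the same convex-combination property.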
- Files in This Item
- There are no files associated with this item.
- Appears in Collections
- College of Advanced Convergence Engineering > Department of Computer Science and Artificial Intelligence > 1. Journal Articles

Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.