Cited 13 times
Locally Adaptive Channel Attention-Based Spatial-Spectral Neural Network for Image Deblurring
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Lee, Ho Sub | - |
| dc.contributor.author | Cho, Sung In | - |
| dc.date.accessioned | 2024-08-08T08:30:52Z | - |
| dc.date.available | 2024-08-08T08:30:52Z | - |
| dc.date.issued | 2023-10 | - |
| dc.identifier.issn | 1051-8215 | - |
| dc.identifier.issn | 1558-2205 | - |
| dc.identifier.uri | https://scholarworks.dongguk.edu/handle/sw.dongguk/20439 | - |
| dc.description.abstract | Recently, due to the rapid development of deep neural networks in the field of computer vision, many studies have been conducted in the field of image deblurring. However, previous methods cannot attain satisfactory deblurring performance at object boundaries and in local regions containing image details in the restored results. This paper proposes a new method that uses a locally adaptive channel attention module for a spectral-spatial network to resolve the problem of single-image deblurring. Unlike existing methods, our proposed method consists of a spectral restorer and a spatial restorer that adopt a locally adaptive attention mechanism in the spectral-spatial domains. In addition, unlike a conventional spectral-spatial network that considers only the magnitude of the frequency coefficients, the proposed method uses both the magnitude and phase of the frequency coefficients in the training stage. Our locally adaptive channel attention spectral-spatial network can focus on informative channels that are closely related to blur artifacts. Specifically, the spatial restorer, which guides the intensity of the blurred image to fit the ground truth by exploring the interdependencies of feature channels, can efficiently restore scene characteristics, while the spectral restorer, which guides the magnitude and phase of the blurred image's frequency coefficients to fit those of the ground truth by exploring the interdependencies of feature channels, fine-tunes the details of structures. Experimental results show that the proposed method outperforms benchmark methods in terms of both qualitative evaluation and quantitative metrics. © 2023 IEEE. | - |
| dc.format.extent | 16 | - |
| dc.language | English | - |
| dc.language.iso | ENG | - |
| dc.publisher | IEEE | - |
| dc.title | Locally Adaptive Channel Attention-Based Spatial-Spectral Neural Network for Image Deblurring | - |
| dc.type | Article | - |
| dc.publisher.location | United States | - |
| dc.identifier.doi | 10.1109/TCSVT.2023.3250509 | - |
| dc.identifier.scopusid | 2-s2.0-85149873608 | - |
| dc.identifier.wosid | 001082387900004 | - |
| dc.identifier.bibliographicCitation | IEEE Transactions on Circuits and Systems for Video Technology, v.33, no.10, pp 5375 - 5390 | - |
| dc.citation.title | IEEE Transactions on Circuits and Systems for Video Technology | - |
| dc.citation.volume | 33 | - |
| dc.citation.number | 10 | - |
| dc.citation.startPage | 5375 | - |
| dc.citation.endPage | 5390 | - |
| dc.type.docType | Article | - |
| dc.description.isOpenAccess | Y | - |
| dc.description.journalRegisteredClass | scie | - |
| dc.description.journalRegisteredClass | scopus | - |
| dc.relation.journalResearchArea | Engineering | - |
| dc.relation.journalWebOfScienceCategory | Engineering, Electrical & Electronic | - |
| dc.subject.keywordPlus | RECONSTRUCTION | - |
| dc.subject.keywordPlus | REMOVAL | - |
| dc.subject.keywordPlus | BLUR | - |
| dc.subject.keywordAuthor | channel attention | - |
| dc.subject.keywordAuthor | Image deblurring | - |
| dc.subject.keywordAuthor | spatial restorer | - |
| dc.subject.keywordAuthor | spectral restorer | - |
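The abstract describes two mechanisms: channel attention that reweights feature channels by their interdependencies, and a spectral branch that operates on both the magnitude and phase of the image's frequency coefficients. A minimal NumPy sketch of these two ideas follows; this is not the authors' implementation, and all function names, weight shapes, and the use of random weights in place of learned ones are illustrative assumptions.

```python
import numpy as np

def channel_attention(features, reduction=2):
    """Squeeze-and-excitation-style channel attention (illustrative sketch).

    features: array of shape (C, H, W).
    Returns the features reweighted per channel by a gate in (0, 1).
    """
    c, h, w = features.shape
    squeezed = features.mean(axis=(1, 2))        # global average pool -> (C,)
    # A two-layer bottleneck with fixed random weights stands in for the
    # learned layers of a real attention module.
    rng = np.random.default_rng(0)
    w1 = rng.standard_normal((c // reduction, c)) * 0.1
    w2 = rng.standard_normal((c, c // reduction)) * 0.1
    hidden = np.maximum(w1 @ squeezed, 0)        # ReLU
    gate = 1.0 / (1.0 + np.exp(-(w2 @ hidden)))  # sigmoid -> per-channel weight
    return features * gate[:, None, None]

def spectral_terms(image):
    """Magnitude and phase of the 2-D FFT, the two quantities the
    spectral restorer is described as matching to the ground truth."""
    coeffs = np.fft.fft2(image)
    return np.abs(coeffs), np.angle(coeffs)

feat = np.ones((4, 8, 8))
out = channel_attention(feat)               # same shape, channels rescaled
mag, phase = spectral_terms(np.ones((8, 8)))
```

In this sketch the attention gate only rescales channels; in a trained network the bottleneck weights would be learned so that channels correlated with blur artifacts receive higher weight, and a training loss would compare `mag` and `phase` of the restored image against those of the sharp ground truth.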
Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.
Copyright(c) 2023 DONGGUK UNIVERSITY. ALL RIGHTS RESERVED.
Certain data included herein are derived from the © Web of Science of Clarivate Analytics. All rights reserved.
You may not copy or re-distribute this material in whole or in part without the prior written consent of Clarivate Analytics.
