Detailed Information


Analysis of Attention Modules in Unfolding Tensor Rank Minimization-Based Pansharpening

Authors
Phan, Dung Viet; Vo, Chuong Hoang; Lee, Chul
Issue Date
2025
Publisher
IEEE
Keywords
Attention; model-based deep learning; pansharpening; tensor rank minimization
Citation
2025 IEEE/IEIE International Conference on Consumer Electronics-Asia (ICCE-Asia)
Indexed
FOREIGN
Journal Title
2025 IEEE/IEIE International Conference on Consumer Electronics-Asia (ICCE-Asia)
URI
https://scholarworks.dongguk.edu/handle/sw.dongguk/63934
DOI
10.1109/ICCE-Asia67487.2025.11263658
Abstract
We examine the effect of various attention modules on a low-rank tensor minimization model for pansharpening. First, the pansharpening problem is formulated as a low-rank tensor minimization task, integrating a detail injection term and an attention module to guide the model to focus on salient regions of the feature map obtained by detail injection. Then, the problem is solved using a deep unfolding network, where each stage updates the variables and the regularizer via closed-form solutions and learned deep networks. Experimental results show that a simple and parameter-free attention module outperforms the baseline model. © 2025 IEEE.
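The abstract highlights that a simple, parameter-free attention module outperformed the baseline, without naming the specific module. One well-known parameter-free design fitting that description is SimAM-style energy-based attention, which reweights each activation by a closed-form saliency score with no learnable weights. A minimal NumPy sketch of that idea (purely illustrative; the function name, `lam` value, and module choice are assumptions, not taken from the paper):

```python
import numpy as np

def simam_attention(x, lam=1e-4):
    """SimAM-style parameter-free attention sketch.

    Reweights each activation of a (C, H, W) feature map by an
    energy-based saliency score computed per channel, then gates
    the input with a sigmoid of that score. No learnable parameters.
    """
    _, h, w = x.shape
    n = h * w - 1                                     # neighbors per channel
    mu = x.mean(axis=(1, 2), keepdims=True)           # per-channel mean
    d = (x - mu) ** 2                                 # squared deviations
    var = d.sum(axis=(1, 2), keepdims=True) / n       # per-channel variance
    energy = d / (4.0 * (var + lam)) + 0.5            # inverse-energy score
    return x * (1.0 / (1.0 + np.exp(-energy)))        # sigmoid gating

# Example: apply to a random 4-channel feature map.
feat = np.random.default_rng(0).standard_normal((4, 8, 8)).astype(np.float32)
out = simam_attention(feat)
print(out.shape)  # output keeps the input shape: (4, 8, 8)
```

In an unfolding network like the one described, such a module could be dropped into each stage after the detail-injection step, since it adds no parameters to the learned regularizer.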
Files in This Item
There are no files associated with this item.
Appears in Collections
ETC > 1. Journal Articles


Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher


Lee, Chul
College of Advanced Convergence Engineering (Department of Computer Science and Artificial Intelligence)
