Detailed Information

Cited 3 times in Web of Science · Cited 8 times in Scopus

CAM-CAN: Class activation map-based categorical adversarial network

Full metadata record
dc.contributor.author: Batchuluun, Ganbayar
dc.contributor.author: Choi, Jiho
dc.contributor.author: Park, Kang Ryoung
dc.date.accessioned: 2024-08-08T10:01:33Z
dc.date.available: 2024-08-08T10:01:33Z
dc.date.issued: 2023-07
dc.identifier.issn: 0957-4174
dc.identifier.issn: 1873-6793
dc.identifier.uri: https://scholarworks.dongguk.edu/handle/sw.dongguk/21256
dc.description.abstract: Numerous studies have investigated image classification. In particular, recent methods based on deep learning have exhibited high accuracy. However, existing state-of-the-art deep learning methods show different accuracies depending on the database and environment. Accordingly, different deep learning models need to be used in image classification studies according to the database, environment, and research field. This study investigated a technique to increase the accuracy of existing deep learning-based models, and the proposed method was applied to various existing state-of-the-art methods. In the proposed method, a convolutional neural network (CNN) is trained using the class activation map (CAM) to focus on specific areas of the input image; the CAM image is used as the ground-truth image. Furthermore, this paper proposes the concept of the CAM-based categorical adversarial network (CAM-CAN), in which the CNN is trained based on a generative adversarial network. An action recognition experiment was performed using the self-collected Dongguk thermal image database (DTh-DB) and an open database, and the results revealed that the accuracies of the existing state-of-the-art methods increased significantly after applying the proposed method. For instance, the TPR, PPV, ACC, and F1 obtained on the DTh-DB with the conventional DenseNet201 model were 80.14%, 75.28%, 96.0%, and 75.91%, respectively; after applying the proposed method, these increased to 86.53%, 89.90%, 97.64%, and 85.84%, respectively.
dc.format.extent: 24
dc.language: English
dc.language.iso: ENG
dc.publisher: Elsevier Ltd
dc.title: CAM-CAN: Class activation map-based categorical adversarial network
dc.type: Article
dc.publisher.location: Netherlands
dc.identifier.doi: 10.1016/j.eswa.2023.119809
dc.identifier.scopusid: 2-s2.0-85150041138
dc.identifier.wosid: 000956597400001
dc.identifier.bibliographicCitation: Expert Systems with Applications, v.222, pp 1 - 24
dc.citation.title: Expert Systems with Applications
dc.citation.volume: 222
dc.citation.startPage: 1
dc.citation.endPage: 24
dc.type.docType: Article
dc.description.isOpenAccess: Y
dc.description.journalRegisteredClass: scie
dc.description.journalRegisteredClass: scopus
dc.relation.journalResearchArea: Computer Science
dc.relation.journalResearchArea: Engineering
dc.relation.journalResearchArea: Operations Research & Management Science
dc.relation.journalWebOfScienceCategory: Computer Science, Artificial Intelligence
dc.relation.journalWebOfScienceCategory: Engineering, Electrical & Electronic
dc.relation.journalWebOfScienceCategory: Operations Research & Management Science
dc.subject.keywordPlus: RECOGNITION
dc.subject.keywordAuthor: Deep learning
dc.subject.keywordAuthor: CNN
dc.subject.keywordAuthor: CAM-CAN
dc.subject.keywordAuthor: GAN
dc.subject.keywordAuthor: Action recognition
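
The abstract reports results as TPR, PPV, ACC, and F1. As a reminder of how these four metrics are derived from confusion-matrix counts (the function name and the example counts below are illustrative, not taken from the paper, and the paper's exact per-class averaging is not specified here), a minimal sketch:

```python
def classification_metrics(tp, fp, fn, tn):
    """Compute TPR, PPV, ACC, and F1 from confusion-matrix counts."""
    tpr = tp / (tp + fn)                   # true positive rate (recall)
    ppv = tp / (tp + fp)                   # positive predictive value (precision)
    acc = (tp + tn) / (tp + fp + fn + tn)  # overall accuracy
    f1 = 2 * ppv * tpr / (ppv + tpr)       # harmonic mean of PPV and TPR
    return tpr, ppv, acc, f1

# Hypothetical counts for one class (not from the paper's data):
tpr, ppv, acc, f1 = classification_metrics(tp=80, fp=10, fn=20, tn=890)
```

Note that F1 is the harmonic mean of PPV and TPR only; ACC additionally depends on true negatives, which is why the abstract's ACC values sit well above the other three metrics.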
Files in This Item
There are no files associated with this item.
Appears in
Collections
College of Engineering > Department of Electronics and Electrical Engineering > 1. Journal Articles


Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Batchuluun, Ganbayar
College of Engineering (Department of Electronics and Electrical Engineering)