Detailed Information


Exploring Kolmogorov-Arnold Network Expansions in Vision Transformers for Mitigation of Catastrophic Forgetting in Continual Learning

Full metadata record
DC Field	Value
dc.contributor.author	Ullah, Zahid
dc.contributor.author	Kim, Jihie
dc.date.accessioned	2025-10-15T07:00:11Z
dc.date.available	2025-10-15T07:00:11Z
dc.date.issued	2025-09
dc.identifier.issn	2227-7390
dc.identifier.uri	https://scholarworks.dongguk.edu/handle/sw.dongguk/61782
dc.description.abstract	Continual Learning (CL), the ability of a model to learn new tasks without forgetting previously acquired knowledge, remains a critical challenge in artificial intelligence. This is particularly true for Vision Transformers (ViTs), which rely on Multilayer Perceptrons (MLPs) for global representation learning. Catastrophic forgetting, where new information overwrites prior knowledge, is especially problematic in these models. This research proposes replacing the MLPs in ViTs with Kolmogorov-Arnold Networks (KANs) to address this issue. KANs leverage local plasticity through spline-based activations, ensuring that only a subset of parameters is updated per sample and thereby preserving previously learned knowledge. This study investigates the efficacy of KAN-based ViTs in CL scenarios across several benchmark datasets (MNIST, CIFAR100, and TinyImageNet-200), focusing on the approach's ability to retain accuracy on earlier tasks while adapting to new ones. Our experimental results demonstrate that KAN-based ViTs significantly mitigate catastrophic forgetting, outperforming traditional MLP-based ViTs in both knowledge retention and task adaptation. This novel integration of KANs into ViTs represents a promising step toward more robust and adaptable models for dynamic environments.
dc.format.extent	29
dc.language	English
dc.language.iso	ENG
dc.publisher	MDPI
dc.title	Exploring Kolmogorov-Arnold Network Expansions in Vision Transformers for Mitigation of Catastrophic Forgetting in Continual Learning
dc.type	Article
dc.publisher.location	Switzerland
dc.identifier.doi	10.3390/math13182988
dc.identifier.scopusid	2-s2.0-105017254710
dc.identifier.wosid	001580514400001
dc.identifier.bibliographicCitation	Mathematics, v.13, no.18, pp. 1-29
dc.citation.title	Mathematics
dc.citation.volume	13
dc.citation.number	18
dc.citation.startPage	1
dc.citation.endPage	29
dc.type.docType	Article
dc.description.isOpenAccess	Y
dc.description.journalRegisteredClass	scie
dc.description.journalRegisteredClass	scopus
dc.relation.journalResearchArea	Mathematics
dc.relation.journalWebOfScienceCategory	Mathematics
dc.subject.keywordAuthor	Kolmogorov-Arnold network
dc.subject.keywordAuthor	continual learning
dc.subject.keywordAuthor	catastrophic forgetting
dc.subject.keywordAuthor	Vision Transformers
dc.subject.keywordAuthor	deep learning
Files in This Item
There are no files associated with this item.
Appears in
Collections
ETC > 1. Journal Articles

Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.

Related Researcher

Ullah, Zahid
College of Advanced Convergence Engineering (Department of Computer Science and Artificial Intelligence)
