Detailed Information

Cited 0 times in Web of Science; cited 1 time in Scopus

Online Hand Gesture Recognition Using Semantically Interpretable Attention Mechanism

Full metadata record
DC Field | Value | Language
dc.contributor.author | Chae, Moon Ju | -
dc.contributor.author | Han, Sang Hoon | -
dc.contributor.author | Nam, Hyeok | -
dc.contributor.author | Park, Jae Hyeon | -
dc.contributor.author | Cha, Min Hee | -
dc.contributor.author | Cho, Sung In | -
dc.date.accessioned | 2025-03-12T05:00:12Z | -
dc.date.available | 2025-03-12T05:00:12Z | -
dc.date.issued | 2025 | -
dc.identifier.issn | 2169-3536 | -
dc.identifier.uri | https://scholarworks.dongguk.edu/handle/sw.dongguk/57923 | -
dc.description.abstract | Hand gesture recognition (HGR) is a field of action recognition widely used in various domains such as robotics, virtual reality (VR), and augmented reality (AR). In this paper, we propose a semantically interpretable attention technique based on the compression and exchange of local and global information for real-time dynamic hand gesture recognition. In this research, we focus on data comprising hand landmark coordinates and on the online recognition of multiple gestures within a single sequence. Specifically, our approach has two paths that learn intraframe and interframe information separately. The learned information is compressed from the local and global perspectives, and the compressed information is exchanged through cross-attention. With this approach, the importance of each hand landmark and frame, which can be interpreted semantically, can be extracted, and this information is used in the attention process on the intraframe and interframe information. Finally, the intraframe and interframe information to which attention has been applied is integrated, which enables comprehensive feature extraction of both local and global information. Experimental results demonstrated that the proposed method enabled concise and rapid hand gesture recognition. It achieved 95% accuracy in real-time hand gesture recognition on the SHREC'22 dataset and accurately estimated the conclusion of a given gesture. Additionally, with a speed of approximately 294 frames per second (FPS), our model is well-suited for real-time systems, offering users an immersive experience. This demonstrates its potential for effective application in real-world environments. © 2025 The Authors. | -
dc.format.extent | 12 | -
dc.language | English | -
dc.language.iso | ENG | -
dc.publisher | IEEE | -
dc.title | Online Hand Gesture Recognition Using Semantically Interpretable Attention Mechanism | -
dc.type | Article | -
dc.publisher.location | United States | -
dc.identifier.doi | 10.1109/ACCESS.2025.3540721 | -
dc.identifier.scopusid | 2-s2.0-85218434740 | -
dc.identifier.wosid | 001492133300034 | -
dc.identifier.bibliographicCitation | IEEE Access, v.13, pp 32329 - 32340 | -
dc.citation.title | IEEE Access | -
dc.citation.volume | 13 | -
dc.citation.startPage | 32329 | -
dc.citation.endPage | 32340 | -
dc.type.docType | Article | -
dc.description.isOpenAccess | Y | -
dc.description.journalRegisteredClass | scie | -
dc.description.journalRegisteredClass | scopus | -
dc.relation.journalResearchArea | Computer Science | -
dc.relation.journalResearchArea | Engineering | -
dc.relation.journalResearchArea | Telecommunications | -
dc.relation.journalWebOfScienceCategory | Computer Science, Information Systems | -
dc.relation.journalWebOfScienceCategory | Engineering, Electrical & Electronic | -
dc.relation.journalWebOfScienceCategory | Telecommunications | -
dc.subject.keywordAuthor | cross-attention | -
dc.subject.keywordAuthor | Hand gesture recognition | -
dc.subject.keywordAuthor | intraframe and interframe information | -
dc.subject.keywordAuthor | online recognition | -
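
The abstract describes compressing intraframe (per-landmark) and interframe (per-frame) features and exchanging them through cross-attention, yielding interpretable importance weights. The paper's actual architecture is not reproduced in this record; the following is only a minimal NumPy sketch of a generic cross-attention exchange between two such feature streams, with all sizes, names, and the single-head formulation invented for illustration:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(query, key_value, d):
    """Queries from one stream attend over the other stream's features.

    Returns the attended features and the attention weights; each row of
    the weights sums to 1 and can be read as an importance distribution.
    """
    scores = query @ key_value.T / np.sqrt(d)
    weights = softmax(scores, axis=-1)
    return weights @ key_value, weights

rng = np.random.default_rng(0)
T, J, d = 8, 21, 16        # frames, hand landmarks, feature dim (illustrative)
local_feats  = rng.normal(size=(J, d))   # compressed intraframe (per-landmark) summary
global_feats = rng.normal(size=(T, d))   # compressed interframe (per-frame) summary

# Exchange: landmarks attend over frames, and frames attend over landmarks.
local_out,  w_landmark_to_frame = cross_attention(local_feats,  global_feats, d)
global_out, w_frame_to_landmark = cross_attention(global_feats, local_feats,  d)

print(local_out.shape, global_out.shape)   # (21, 16) (8, 16)
```

Here `w_landmark_to_frame` (shape 21×8) is the kind of quantity the abstract calls semantically interpretable: for each landmark, a normalized weighting over frames (and vice versa for `w_frame_to_landmark`), which can then gate the intraframe and interframe features before they are integrated.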
Files in This Item
There are no files associated with this item.
Appears in Collections
ETC > 1. Journal Articles


Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.
