sEMG-Based Gesture Recognition Using Temporal History
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Hong, Chaerin | - |
| dc.contributor.author | Park, Seongsik | - |
| dc.contributor.author | Kim, Keehoon | - |
| dc.date.accessioned | 2024-08-08T08:30:40Z | - |
| dc.date.available | 2024-08-08T08:30:40Z | - |
| dc.date.issued | 2023-09 | - |
| dc.identifier.issn | 0018-9294 | - |
| dc.identifier.issn | 1558-2531 | - |
| dc.identifier.uri | https://scholarworks.dongguk.edu/handle/sw.dongguk/20366 | - |
| dc.description.abstract | Surface electromyography (sEMG) patterns have been decoded using learning-based methods that determine complicated nonlinear decision boundaries. However, overlapping classes in sEMG pattern recognition still degrade classification accuracy because they cannot be separated by the decision boundaries. We hypothesized that certain overlapping classes can be separated by tracing the temporal history of sEMG patterns. Therefore, a novel post-processing method is proposed that adjusts classification errors using the separated patterns from the temporal history of overlapping classes. The proposed method assesses the confidence of each prediction result by calculating the instantaneous pattern separability of the sequential sEMG input. A prediction result with a high-separability pattern is considered highly likely to be correct (reliable) and is stored for adjusting subsequent sEMG inputs. When a subsequent prediction is identified as having low confidence (unreliable), it is adjusted using the stored reliable prediction results. The proposed method adds an adjustment step to an existing classifier (maximum likelihood classifier (MLC), k-nearest neighbor (KNN), or support vector machine (SVM)), so it can be attached to the back-end regardless of the type of classifier. Ten subjects performed 13 types of hand gestures, including overlapping patterns. The overall classification accuracy was enhanced to 88.93% (+8.12%p, MLC), 91.31% (+7.68%p, KNN), and 99.65% (+11.63%p, SVM) after applying the proposed post-processing. Additionally, faster and more accurate gesture classification was achieved, with accuracy before gesture completion enhanced to 85.62% (+4.23%p, MLC), 89.77% (+4.23%p, KNN), and 97.62% (+11.12%p, SVM). © 1964-2012 IEEE. | - |
| dc.format.extent | 12 | - |
| dc.language | English | - |
| dc.language.iso | ENG | - |
| dc.publisher | IEEE | - |
| dc.title | sEMG-Based Gesture Recognition Using Temporal History | - |
| dc.type | Article | - |
| dc.publisher.location | United States | - |
| dc.identifier.doi | 10.1109/TBME.2023.3261336 | - |
| dc.identifier.scopusid | 2-s2.0-85151516138 | - |
| dc.identifier.wosid | 001061797000015 | - |
| dc.identifier.bibliographicCitation | IEEE Transactions on Biomedical Engineering, v.70, no.9, pp 2655 - 2666 | - |
| dc.citation.title | IEEE Transactions on Biomedical Engineering | - |
| dc.citation.volume | 70 | - |
| dc.citation.number | 9 | - |
| dc.citation.startPage | 2655 | - |
| dc.citation.endPage | 2666 | - |
| dc.type.docType | Article | - |
| dc.description.isOpenAccess | Y | - |
| dc.description.journalRegisteredClass | scie | - |
| dc.description.journalRegisteredClass | scopus | - |
| dc.relation.journalResearchArea | Engineering | - |
| dc.relation.journalWebOfScienceCategory | Engineering, Biomedical | - |
| dc.subject.keywordPlus | PATTERN-RECOGNITION | - |
| dc.subject.keywordPlus | STRATEGY | - |
| dc.subject.keywordPlus | SIGNALS | - |
| dc.subject.keywordPlus | SCHEME | - |
| dc.subject.keywordPlus | MOTION | - |
| dc.subject.keywordPlus | SYSTEM | - |
| dc.subject.keywordAuthor | gesture recognition | - |
| dc.subject.keywordAuthor | pattern recognition | - |
| dc.subject.keywordAuthor | post-processing | - |
| dc.subject.keywordAuthor | sequence | - |
| dc.subject.keywordAuthor | Surface electromyography (sEMG) | - |
| dc.subject.keywordAuthor | temporal history | - |
| dc.subject.keywordAuthor | transient analysis | - |
