Cited 8 times
Estimation of Fine-Grained Foot Strike Patterns with Wearable Smartwatch Devices
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Joo, Hyeyeoun | - |
| dc.contributor.author | Kim, Hyejoo | - |
| dc.contributor.author | Ryu, Jeh-Kwang | - |
| dc.contributor.author | Ryu, Semin | - |
| dc.contributor.author | Lee, Kyoung-Min | - |
| dc.contributor.author | Kim, Seung-Chan | - |
| dc.date.accessioned | 2023-04-27T13:40:36Z | - |
| dc.date.available | 2023-04-27T13:40:36Z | - |
| dc.date.issued | 2022-02 | - |
| dc.identifier.issn | 1661-7827 | - |
| dc.identifier.issn | 1660-4601 | - |
| dc.identifier.uri | https://scholarworks.dongguk.edu/handle/sw.dongguk/3655 | - |
| dc.description.abstract | People who exercise may benefit or be injured depending on their foot strike (FS) style. In this study, we propose an intelligent system that recognizes subtle differences in FS patterns during walking and running using measurements from a wearable smartwatch device. Although such patterns could be measured directly from the pressure distribution of the feet striking the ground, we instead focused on analyzing hand movements, assuming that striking patterns affect the temporal movements of the whole body. The advantage of the proposed approach is that FS patterns can be estimated in a portable and less invasive manner. To this end, we first developed a wearable system for measuring the inertial movements of the hands and conducted an experiment in which participants were asked to walk and run while wearing a smartwatch. Second, we trained and tested the captured multivariate time-series signals in supervised learning settings. The experimental results demonstrated high and robust classification performance (weighted-average F1 score > 90%) when recent deep neural network models, such as 1D-CNNs and GRUs, were employed. We conclude this study with a discussion of potential future work and applications that help users walk and run properly using the proposed approach. | - |
| dc.format.extent | 18 | - |
| dc.language | English | - |
| dc.language.iso | ENG | - |
| dc.publisher | MDPI | - |
| dc.title | Estimation of Fine-Grained Foot Strike Patterns with Wearable Smartwatch Devices | - |
| dc.type | Article | - |
| dc.publisher.location | Switzerland | - |
| dc.identifier.doi | 10.3390/ijerph19031279 | - |
| dc.identifier.scopusid | 2-s2.0-85123728157 | - |
| dc.identifier.wosid | 000754912400001 | - |
| dc.identifier.bibliographicCitation | International Journal of Environmental Research and Public Health, v.19, no.3, pp 1 - 18 | - |
| dc.citation.title | International Journal of Environmental Research and Public Health | - |
| dc.citation.volume | 19 | - |
| dc.citation.number | 3 | - |
| dc.citation.startPage | 1 | - |
| dc.citation.endPage | 18 | - |
| dc.type.docType | Article | - |
| dc.description.isOpenAccess | Y | - |
| dc.description.journalRegisteredClass | scie | - |
| dc.description.journalRegisteredClass | ssci | - |
| dc.description.journalRegisteredClass | scopus | - |
| dc.relation.journalResearchArea | Environmental Sciences & Ecology | - |
| dc.relation.journalResearchArea | Public, Environmental & Occupational Health | - |
| dc.relation.journalWebOfScienceCategory | Environmental Sciences | - |
| dc.relation.journalWebOfScienceCategory | Public, Environmental & Occupational Health | - |
| dc.subject.keywordPlus | ACTIVITY RECOGNITION | - |
| dc.subject.keywordPlus | FOOTSTRIKE PATTERN | - |
| dc.subject.keywordPlus | TOE WALKING | - |
| dc.subject.keywordPlus | GAIT | - |
| dc.subject.keywordPlus | SENSORS | - |
| dc.subject.keywordAuthor | healthcare wearables | - |
| dc.subject.keywordAuthor | deep sequence learning | - |
| dc.subject.keywordAuthor | fine-grained motion classification | - |
| dc.subject.keywordAuthor | activity monitoring | - |
| dc.subject.keywordAuthor | human activity recognition | - |
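The pipeline described in the abstract — multivariate IMU time series from a wrist-worn device fed to a sequence classifier such as a GRU — can be sketched as a minimal forward pass. This is an illustrative numpy sketch, not the authors' implementation: the dimensions (6 input channels for a 3-axis accelerometer plus 3-axis gyroscope, 16 hidden units, 3 FS classes) and the random, untrained weights are assumptions for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def gru_step(x, h, Wz, Uz, Wr, Ur, Wh, Uh):
    """One GRU cell update (biases omitted for brevity)."""
    z = sigmoid(x @ Wz + h @ Uz)              # update gate
    r = sigmoid(x @ Wr + h @ Ur)              # reset gate
    h_tilde = np.tanh(x @ Wh + (r * h) @ Uh)  # candidate state
    return (1 - z) * h + z * h_tilde

D, H, C = 6, 16, 3  # assumed: input channels, hidden units, FS classes
params = [rng.normal(scale=0.1, size=s) for s in [(D, H), (H, H)] * 3]
W_out = rng.normal(scale=0.1, size=(H, C))

def classify(sequence):
    """Run the GRU over a (T, D) sequence; return class probabilities."""
    h = np.zeros(H)
    for x in sequence:                 # iterate over time steps
        h = gru_step(x, h, *params)
    logits = h @ W_out                 # read out from the final state
    e = np.exp(logits - logits.max())  # numerically stable softmax
    return e / e.sum()

seq = rng.normal(size=(120, D))  # stand-in for a short IMU window
probs = classify(seq)
```

A trained model would learn these weights from labeled walking/running windows; the point here is only the data flow — a (time × channel) window collapses to a single hidden state, which a linear read-out maps to per-class probabilities.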
