Motion Estimation Approach for UAV Controls using Bidirectional Two-Layer LSTMs
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Guo, Haitao | - |
| dc.contributor.author | Sung, Yunsick | - |
| dc.contributor.author | Kang, Jungho | - |
| dc.date.accessioned | 2023-04-28T05:42:41Z | - |
| dc.date.available | 2023-04-28T05:42:41Z | - |
| dc.date.issued | 2019-07 | - |
| dc.identifier.uri | https://scholarworks.dongguk.edu/handle/sw.dongguk/8650 | - |
| dc.description.abstract | With the widespread use of unmanned aerial vehicles (UAVs), there is an increasing demand for the development of their control technology. Key human-UAV interaction technology should focus on human body language, which carries rich interactive information and is the most natural, intuitive, and easy-to-master means of interpersonal communication. Research on human motion estimation for UAV control is therefore of considerable practical significance. Recently, deep learning has made breakthroughs in speech recognition, image recognition, and other fields, surpassing traditional methods in many of them. In the field of human motion estimation, however, progress with deep learning has been slow. To overcome the limitations of traditional methods and explore the application of deep learning to motion estimation, this study proposes a method for estimating human arm motion using deep learning networks. We propose a bidirectional two-layer LSTM fusion network that estimates forearm motion from the hand position measured by an HTC Vive. The performance was verified on a real data set; the average Euclidean distance similarity reached up to 56%. Compared with traditional methods, the proposed method demonstrated wider applicability and better performance. | - |
| dc.format.extent | 4 | - |
| dc.language | English | - |
| dc.language.iso | ENG | - |
| dc.publisher | IEEE | - |
| dc.title | Motion Estimation Approach for UAV Controls using Bidirectional Two-Layer LSTMs | - |
| dc.type | Article | - |
| dc.publisher.location | United States | - |
| dc.identifier.doi | 10.1109/iThings/GreenCom/CPSCom/SmartData.2019.00083 | - |
| dc.identifier.scopusid | 2-s2.0-85074831517 | - |
| dc.identifier.wosid | 000579857700060 | - |
| dc.identifier.bibliographicCitation | 2019 INTERNATIONAL CONFERENCE ON INTERNET OF THINGS (ITHINGS) AND IEEE GREEN COMPUTING AND COMMUNICATIONS (GREENCOM) AND IEEE CYBER, PHYSICAL AND SOCIAL COMPUTING (CPSCOM) AND IEEE SMART DATA (SMARTDATA), pp 381 - 384 | - |
| dc.citation.title | 2019 INTERNATIONAL CONFERENCE ON INTERNET OF THINGS (ITHINGS) AND IEEE GREEN COMPUTING AND COMMUNICATIONS (GREENCOM) AND IEEE CYBER, PHYSICAL AND SOCIAL COMPUTING (CPSCOM) AND IEEE SMART DATA (SMARTDATA) | - |
| dc.citation.startPage | 381 | - |
| dc.citation.endPage | 384 | - |
| dc.type.docType | Proceedings Paper | - |
| dc.description.isOpenAccess | N | - |
| dc.relation.journalResearchArea | Computer Science | - |
| dc.relation.journalResearchArea | Science & Technology - Other Topics | - |
| dc.relation.journalResearchArea | Telecommunications | - |
| dc.relation.journalWebOfScienceCategory | Computer Science, Theory & Methods | - |
| dc.relation.journalWebOfScienceCategory | Green & Sustainable Science & Technology | - |
| dc.relation.journalWebOfScienceCategory | Telecommunications | - |
| dc.subject.keywordAuthor | HTC Vive | - |
| dc.subject.keywordAuthor | deep learning | - |
| dc.subject.keywordAuthor | UAV control | - |
| dc.subject.keywordAuthor | motion estimation | - |
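The abstract describes a bidirectional two-layer LSTM that maps measured hand positions to forearm motion estimates. The following is a minimal NumPy sketch of that architecture's forward pass only; the layer sizes, random weights, and linear output head are illustrative assumptions, not the authors' trained network or training procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_params(input_dim, hidden_dim):
    # One weight/bias set producing the 4 stacked gates (i, f, g, o).
    return {
        "W": rng.normal(0, 0.1, (4 * hidden_dim, input_dim)),
        "U": rng.normal(0, 0.1, (4 * hidden_dim, hidden_dim)),
        "b": np.zeros(4 * hidden_dim),
    }

def lstm_forward(seq, p, hidden_dim):
    # Standard LSTM recurrence over a (T, input_dim) sequence.
    h = np.zeros(hidden_dim)
    c = np.zeros(hidden_dim)
    outs = []
    for x in seq:
        z = p["W"] @ x + p["U"] @ h + p["b"]
        i = sigmoid(z[:hidden_dim])                      # input gate
        f = sigmoid(z[hidden_dim:2 * hidden_dim])        # forget gate
        g = np.tanh(z[2 * hidden_dim:3 * hidden_dim])    # candidate cell
        o = sigmoid(z[3 * hidden_dim:])                  # output gate
        c = f * c + i * g
        h = o * np.tanh(c)
        outs.append(h)
    return np.stack(outs)                                # (T, hidden_dim)

def bi_lstm_layer(seq, p_fwd, p_bwd, hidden_dim):
    # Run one LSTM forward in time and one backward, then concatenate.
    fwd = lstm_forward(seq, p_fwd, hidden_dim)
    bwd = lstm_forward(seq[::-1], p_bwd, hidden_dim)[::-1]
    return np.concatenate([fwd, bwd], axis=1)            # (T, 2*hidden_dim)

T, input_dim, hidden_dim, out_dim = 20, 3, 16, 3
hand_xyz = rng.normal(size=(T, input_dim))  # hypothetical Vive hand positions

# Layer 1 reads the raw 3-D hand trajectory; layer 2 reads layer 1's features.
layer1 = bi_lstm_layer(hand_xyz,
                       lstm_params(input_dim, hidden_dim),
                       lstm_params(input_dim, hidden_dim), hidden_dim)
layer2 = bi_lstm_layer(layer1,
                       lstm_params(2 * hidden_dim, hidden_dim),
                       lstm_params(2 * hidden_dim, hidden_dim), hidden_dim)

# Linear head mapping the fused features to a per-step forearm estimate.
W_out = rng.normal(0, 0.1, (out_dim, 2 * hidden_dim))
forearm_est = layer2 @ W_out.T
print(forearm_est.shape)  # (20, 3)
```

Because each layer sees the sequence in both directions, every per-step estimate can draw on past and future hand positions, which is the motivation for a bidirectional design in offline motion estimation.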
