Cited 24 times
SlimDeblurGAN-Based Motion Deblurring and Marker Detection for Autonomous Drone Landing
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Truong, Noi Quang | - |
| dc.contributor.author | Lee, Young Won | - |
| dc.contributor.author | Owais, Muhammad | - |
| dc.contributor.author | Nguyen, Dat Tien | - |
| dc.contributor.author | Batchuluun, Ganbayar | - |
| dc.contributor.author | Pham, Tuyen Danh | - |
| dc.contributor.author | Park, Kang Ryoung | - |
| dc.date.accessioned | 2024-08-08T06:01:12Z | - |
| dc.date.available | 2024-08-08T06:01:12Z | - |
| dc.date.issued | 2020-07 | - |
| dc.identifier.issn | 1424-8220 | - |
| dc.identifier.issn | 1424-3210 | - |
| dc.identifier.uri | https://scholarworks.dongguk.edu/handle/sw.dongguk/18727 | - |
| dc.description.abstract | Deep learning-based marker detection for autonomous drone landing is widely studied, due to its superior detection performance. However, no previous study has addressed non-uniform motion-blurred input images, and most of the earlier handcrafted and deep learning-based methods fail to operate on these challenging inputs. To solve this problem, we propose a deep learning-based marker detection method for autonomous drone landing, by (1) introducing a two-phase framework of deblurring and object detection, adopting a slimmed version of the deblur generative adversarial network (DeblurGAN) model and a You Only Look Once version 2 (YOLOv2) detector, respectively, and (2) considering the balance between the processing time and accuracy of the system. To this end, we propose a channel-pruning framework for slimming the DeblurGAN model, called SlimDeblurGAN, without significant accuracy degradation. The experimental results on the two datasets showed that our proposed method exhibited higher performance and greater robustness than the previous methods, in both deblurring and marker detection. | - |
| dc.format.extent | 33 | - |
| dc.language | English | - |
| dc.language.iso | ENG | - |
| dc.publisher | MDPI | - |
| dc.title | SlimDeblurGAN-Based Motion Deblurring and Marker Detection for Autonomous Drone Landing | - |
| dc.type | Article | - |
| dc.publisher.location | Switzerland | - |
| dc.identifier.doi | 10.3390/s20143918 | - |
| dc.identifier.scopusid | 2-s2.0-85087868620 | - |
| dc.identifier.wosid | 000554124700001 | - |
| dc.identifier.bibliographicCitation | SENSORS, v.20, no.14, pp 1 - 33 | - |
| dc.citation.title | SENSORS | - |
| dc.citation.volume | 20 | - |
| dc.citation.number | 14 | - |
| dc.citation.startPage | 1 | - |
| dc.citation.endPage | 33 | - |
| dc.type.docType | Article | - |
| dc.description.isOpenAccess | Y | - |
| dc.description.journalRegisteredClass | scie | - |
| dc.description.journalRegisteredClass | scopus | - |
| dc.relation.journalResearchArea | Chemistry | - |
| dc.relation.journalResearchArea | Engineering | - |
| dc.relation.journalResearchArea | Instruments & Instrumentation | - |
| dc.relation.journalWebOfScienceCategory | Chemistry, Analytical | - |
| dc.relation.journalWebOfScienceCategory | Engineering, Electrical & Electronic | - |
| dc.relation.journalWebOfScienceCategory | Instruments & Instrumentation | - |
| dc.subject.keywordPlus | DEBLURGAN | - |
| dc.subject.keywordPlus | OBJECT | - |
| dc.subject.keywordPlus | UAV | - |
| dc.subject.keywordAuthor | unmanned aerial vehicle | - |
| dc.subject.keywordAuthor | autonomous landing | - |
| dc.subject.keywordAuthor | deep-learning-based motion deblurring and marker detection | - |
| dc.subject.keywordAuthor | network slimming | - |
| dc.subject.keywordAuthor | pruning model | - |
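The abstract describes slimming DeblurGAN through channel pruning. A common channel-pruning criterion (network slimming) ranks channels by the magnitude of their batch-normalization scaling factors and prunes those below a global percentile threshold. The sketch below illustrates only that selection step, with hypothetical per-layer values; it is not the paper's actual pipeline, whose details are not given in this record.

```python
import numpy as np

def select_channels(gammas, prune_ratio):
    """Return a boolean keep-mask per layer: channels whose BN scaling
    factor magnitude falls below the global percentile are pruned."""
    flat = np.concatenate([np.abs(g) for g in gammas])
    threshold = np.percentile(flat, prune_ratio * 100)
    return [np.abs(g) >= threshold for g in gammas]

# Toy example: two conv layers with four BN scaling factors each
# (values are illustrative, not from the paper).
gammas = [np.array([0.9, 0.01, 0.5, 0.02]),
          np.array([0.03, 0.7, 0.8, 0.04])]
masks = select_channels(gammas, prune_ratio=0.5)
# Channels with near-zero scaling factors are marked for removal,
# shrinking the layer widths of the pruned model.
```

Because the threshold is computed globally across all layers, the pruning budget is distributed unevenly: layers with many low-magnitude scaling factors lose more channels, which is what lets the slimmed model trade accuracy for speed.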
