Detailed Information

Cited 18 times in Web of Science · Cited 24 times in Scopus

SlimDeblurGAN-Based Motion Deblurring and Marker Detection for Autonomous Drone Landing

Full metadata record
DC Field  Value
dc.contributor.author  Noi Quang Truong
dc.contributor.author  Lee, Young Won
dc.contributor.author  Owais, Muhammad
dc.contributor.author  Dat Tien Nguyen
dc.contributor.author  Batchuluun, Ganbayar
dc.contributor.author  Tuyen Danh Pham
dc.contributor.author  Park, Kang Ryoung
dc.date.accessioned  2024-08-08T06:01:12Z
dc.date.available  2024-08-08T06:01:12Z
dc.date.issued  2020-07
dc.identifier.issn  1424-8220
dc.identifier.issn  1424-3210
dc.identifier.uri  https://scholarworks.dongguk.edu/handle/sw.dongguk/18727
dc.description.abstract  Deep learning-based marker detection for autonomous drone landing is widely studied owing to its superior detection performance. However, no previous study has addressed non-uniform motion-blurred input images, and most earlier handcrafted and deep learning-based methods fail on these challenging inputs. To solve this problem, we propose a deep learning-based marker detection method for autonomous drone landing that (1) introduces a two-phase framework of deblurring and object detection, adopting a slimmed version of the deblur generative adversarial network (DeblurGAN) model and a You Only Look Once version 2 (YOLOv2) detector, respectively, and (2) balances the processing time and accuracy of the system. To this end, we propose a channel-pruning framework, called SlimDeblurGAN, that slims the DeblurGAN model without significant accuracy degradation. Experimental results on two datasets showed that the proposed method exhibits higher performance and greater robustness than previous methods in both deblurring and marker detection.
dc.format.extent  33
dc.language  English
dc.language.iso  ENG
dc.publisher  MDPI
dc.title  SlimDeblurGAN-Based Motion Deblurring and Marker Detection for Autonomous Drone Landing
dc.type  Article
dc.publisher.location  Switzerland
dc.identifier.doi  10.3390/s20143918
dc.identifier.scopusid  2-s2.0-85087868620
dc.identifier.wosid  000554124700001
dc.identifier.bibliographicCitation  SENSORS, v.20, no.14, pp. 1-33
dc.citation.title  SENSORS
dc.citation.volume  20
dc.citation.number  14
dc.citation.startPage  1
dc.citation.endPage  33
dc.type.docType  Article
dc.description.isOpenAccess  Y
dc.description.journalRegisteredClass  scie
dc.description.journalRegisteredClass  scopus
dc.relation.journalResearchArea  Chemistry
dc.relation.journalResearchArea  Engineering
dc.relation.journalResearchArea  Instruments & Instrumentation
dc.relation.journalWebOfScienceCategory  Chemistry, Analytical
dc.relation.journalWebOfScienceCategory  Engineering, Electrical & Electronic
dc.relation.journalWebOfScienceCategory  Instruments & Instrumentation
dc.subject.keywordPlus  DEBLURGAN
dc.subject.keywordPlus  OBJECT
dc.subject.keywordPlus  UAV
dc.subject.keywordAuthor  unmanned aerial vehicle
dc.subject.keywordAuthor  autonomous landing
dc.subject.keywordAuthor  deep-learning-based motion deblurring and marker detection
dc.subject.keywordAuthor  network slimming
dc.subject.keywordAuthor  pruning model
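
The abstract and the author keywords ("network slimming", "pruning model") describe slimming DeblurGAN by channel pruning. As an illustration only, here is a minimal pure-Python sketch of the channel-selection step typical of network slimming, assuming channels are ranked by the magnitude of their batch-normalization scaling factors against a single network-wide threshold; the function name, interface, and example values are hypothetical and not taken from the paper.

```python
def select_channels(bn_scales, prune_ratio):
    """Pick channels to keep per layer: those whose |gamma| meets a
    global threshold computed from the network-wide pruning ratio.

    bn_scales:  list of per-layer lists of batch-norm scaling factors.
    prune_ratio: fraction of all channels (network-wide) to remove.
    Returns a list (one entry per layer) of kept-channel indices.
    """
    # Pool every scaling factor across layers and sort by magnitude.
    all_scales = sorted(abs(g) for layer in bn_scales for g in layer)
    cut = int(len(all_scales) * prune_ratio)
    threshold = all_scales[cut] if cut < len(all_scales) else float("inf")
    # Keep only channels at or above the global threshold.
    return [[i for i, g in enumerate(layer) if abs(g) >= threshold]
            for layer in bn_scales]

# Hypothetical example: two layers, prune the half of all channels
# with the smallest scaling factors.
kept = select_channels([[0.9, 0.05, 0.7], [0.01, 0.8, 0.02]], 0.5)
# kept == [[0, 2], [1]]
```

In practice a slimmed model would then be rebuilt with only the kept channels and fine-tuned to recover accuracy, which is the trade-off between processing time and accuracy the abstract refers to.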
Files in This Item
There are no files associated with this item.
Appears in Collections
College of Engineering > Department of Electronics and Electrical Engineering > 1. Journal Articles



Related Researcher

Batchuluun, Ganbayar
College of Engineering (Department of Electronics and Electrical Engineering)
