Cited 16 times in Web of Science
Finger-Vein Recognition Using Heterogeneous Databases by Domain Adaption Based on a Cycle-Consistent Adversarial Network
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Noh, Kyoung Jun | - |
| dc.contributor.author | Choi, Jiho | - |
| dc.contributor.author | Hong, Jin Seong | - |
| dc.contributor.author | Park, Kang Ryoung | - |
| dc.date.accessioned | 2024-08-08T05:31:29Z | - |
| dc.date.available | 2024-08-08T05:31:29Z | - |
| dc.date.issued | 2021-01 | - |
| dc.identifier.issn | 1424-8220 | - |
| dc.identifier.issn | 1424-3210 | - |
| dc.identifier.uri | https://scholarworks.dongguk.edu/handle/sw.dongguk/18709 | - |
| dc.description.abstract | The conventional finger-vein recognition system is trained on one type of database and suffers serious performance degradation when tested on different types of databases. This degradation is caused by changes in image characteristics due to variable factors such as the positions of the camera and finger, and the lighting. Therefore, each database has different characteristics despite sharing the same finger-vein modality. However, previous research on improving the recognition accuracy for unobserved or heterogeneous databases is lacking. To overcome this problem, we propose a method that improves finger-vein recognition accuracy through domain adaptation between heterogeneous databases using cycle-consistent adversarial networks (CycleGAN), which enhances the recognition accuracy on unobserved data. The experiments were performed with two open databases: the Shandong University homologous multi-modal traits finger-vein database (SDUMLA-HMT-DB) and the Hong Kong Polytechnic University finger-image database (HKPolyU-DB). They showed that the equal error rate (EER) of finger-vein recognition was 0.85% when training with SDUMLA-HMT-DB and testing with HKPolyU-DB, an improvement of 33.1% over the second-best method. The EER was 3.4% when training with HKPolyU-DB and testing with SDUMLA-HMT-DB, an improvement of 4.8% over the second-best method. | - |
| dc.format.extent | 28 | - |
| dc.language | English | - |
| dc.language.iso | ENG | - |
| dc.publisher | MDPI | - |
| dc.title | Finger-Vein Recognition Using Heterogeneous Databases by Domain Adaption Based on a Cycle-Consistent Adversarial Network | - |
| dc.type | Article | - |
| dc.publisher.location | Switzerland | - |
| dc.identifier.doi | 10.3390/s21020524 | - |
| dc.identifier.scopusid | 2-s2.0-85099354389 | - |
| dc.identifier.wosid | 000612063400001 | - |
| dc.identifier.bibliographicCitation | SENSORS, v.21, no.2, pp 1 - 28 | - |
| dc.citation.title | SENSORS | - |
| dc.citation.volume | 21 | - |
| dc.citation.number | 2 | - |
| dc.citation.startPage | 1 | - |
| dc.citation.endPage | 28 | - |
| dc.type.docType | Article | - |
| dc.description.isOpenAccess | Y | - |
| dc.description.journalRegisteredClass | scie | - |
| dc.description.journalRegisteredClass | scopus | - |
| dc.relation.journalResearchArea | Chemistry | - |
| dc.relation.journalResearchArea | Engineering | - |
| dc.relation.journalResearchArea | Instruments & Instrumentation | - |
| dc.relation.journalWebOfScienceCategory | Chemistry, Analytical | - |
| dc.relation.journalWebOfScienceCategory | Engineering, Electrical & Electronic | - |
| dc.relation.journalWebOfScienceCategory | Instruments & Instrumentation | - |
| dc.subject.keywordPlus | MATCHING METHOD | - |
| dc.subject.keywordPlus | EXTRACTION | - |
| dc.subject.keywordPlus | SENSOR | - |
| dc.subject.keywordAuthor | finger-vein recognition | - |
| dc.subject.keywordAuthor | camera position | - |
| dc.subject.keywordAuthor | finger position | - |
| dc.subject.keywordAuthor | lighting | - |
| dc.subject.keywordAuthor | unobserved database | - |
| dc.subject.keywordAuthor | heterogeneous database | - |
| dc.subject.keywordAuthor | domain adaptation | - |
| dc.subject.keywordAuthor | cycle-consistent adversarial networks | - |
| dc.subject.keywordAuthor | SDUMLA-HMT-DB | - |
| dc.subject.keywordAuthor | HKPolyU-DB | - |
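The abstract above rests on CycleGAN's cycle-consistency objective: a generator G maps images from one database's domain to the other's, a second generator F maps back, and both are penalized when the round trip fails to reconstruct the input. The toy sketch below illustrates only that objective; `G`, `F`, the linear mappings, and the weight `lam` are hypothetical stand-ins for illustration, not the networks or hyperparameters used in the paper.

```python
import numpy as np

# Toy stand-ins for the CycleGAN mapping networks:
# G: source-DB domain -> target-DB domain, F: target-DB domain -> source-DB domain.
# Here F is chosen as the exact inverse of G, so the cycle loss is ~0.
def G(x):
    return 0.9 * x + 0.1

def F(y):
    return (y - 0.1) / 0.9

def cycle_consistency_loss(x, y, lam=10.0):
    """L1 cycle loss: F(G(x)) should reconstruct x, and G(F(y)) should reconstruct y."""
    forward = np.mean(np.abs(F(G(x)) - x))   # x -> G(x) -> F(G(x)) ~ x
    backward = np.mean(np.abs(G(F(y)) - y))  # y -> F(y) -> G(F(y)) ~ y
    return lam * (forward + backward)

x = np.linspace(0.0, 1.0, 5)  # stand-in "source-domain" vein image values
y = np.linspace(0.2, 0.8, 5)  # stand-in "target-domain" vein image values
print(cycle_consistency_loss(x, y))  # near zero, since F exactly inverts G here
```

In the full method, this term is trained jointly with adversarial losses on both domains; at test time, images from the unobserved database can be translated toward the training database's characteristics before matching.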
Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.
