Knowledge Distillation based Online Learning Methodology using Unlabeled Data Stream
| DC Field | Value | Language |
|---|---|---|
| dc.contributor.author | Seo, Sanghyun | - |
| dc.contributor.author | Park, Seongchul | - |
| dc.contributor.author | Jeong, Changhoon | - |
| dc.contributor.author | Kim, Juntae | - |
| dc.date.accessioned | 2023-04-28T10:41:22Z | - |
| dc.date.available | 2023-04-28T10:41:22Z | - |
| dc.date.issued | 2018-09-28 | - |
| dc.identifier.uri | https://scholarworks.dongguk.edu/handle/sw.dongguk/10016 | - |
| dc.description.abstract | In supervised learning, the performance of a learning model degrades over time due to concept drift, as the model overfits its original training data. Online learning, which continuously trains the model on an incoming data stream, has been proposed as a methodology to mitigate such concept drift. In this paper, we propose an online learning methodology in which a teacher model continuously trains a student model based on knowledge distillation theory. The teacher model generates an output distribution, called a soft label, to label the unlabeled data stream, and the student model is trained on the unlabeled data stream using the soft labels from the teacher model. Experimental results show that the proposed method achieves better performance, such as classification accuracy, than a batch learning model trained only on the labeled data. | - |
| dc.format.extent | 4 | - |
| dc.language | English | - |
| dc.language.iso | ENG | - |
| dc.publisher | ASSOC COMPUTING MACHINERY | - |
| dc.title | Knowledge Distillation based Online Learning Methodology using Unlabeled Data Stream | - |
| dc.type | Article | - |
| dc.publisher.location | United States | - |
| dc.identifier.doi | 10.1145/3278312.3278319 | - |
| dc.identifier.scopusid | 2-s2.0-85059877694 | - |
| dc.identifier.wosid | 000455364500013 | - |
| dc.identifier.bibliographicCitation | PROCEEDINGS OF THE 2018 INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND MACHINE INTELLIGENCE (MLMI 2018), pp 68 - 71 | - |
| dc.citation.title | PROCEEDINGS OF THE 2018 INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND MACHINE INTELLIGENCE (MLMI 2018) | - |
| dc.citation.startPage | 68 | - |
| dc.citation.endPage | 71 | - |
| dc.type.docType | Proceedings Paper | - |
| dc.description.isOpenAccess | N | - |
| dc.description.journalRegisteredClass | scopus | - |
| dc.relation.journalResearchArea | Computer Science | - |
| dc.relation.journalWebOfScienceCategory | Computer Science, Artificial Intelligence | - |
| dc.relation.journalWebOfScienceCategory | Computer Science, Theory & Methods | - |
| dc.subject.keywordAuthor | Online Learning | - |
| dc.subject.keywordAuthor | Knowledge Distillation | - |
| dc.subject.keywordAuthor | Knowledge Transfer | - |
| dc.subject.keywordAuthor | Concept Drift | - |
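The abstract describes a teacher model that produces temperature-softened output distributions (soft labels) for an unlabeled stream, which then supervise an online student. The following is a minimal illustrative sketch of that idea, not the authors' implementation: the linear teacher and student, the temperature, and the learning rate are all invented here for demonstration.

```python
import math
import random

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T yields softer distributions.
    exps = [math.exp(z / T) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def teacher_logits(x):
    # Hypothetical pre-trained "teacher": a fixed linear scorer over 2 classes.
    return [2.0 * x, -2.0 * x]

class Student:
    """Linear student updated online with cross-entropy against soft labels."""
    def __init__(self):
        self.w = [0.0, 0.0]

    def logits(self, x):
        return [self.w[0] * x, self.w[1] * x]

    def update(self, x, soft_label, lr=0.1):
        p = softmax(self.logits(x))
        # Gradient of cross-entropy between the teacher's soft label
        # and the student's predicted distribution.
        for k in range(2):
            self.w[k] -= lr * (p[k] - soft_label[k]) * x

random.seed(0)
student = Student()
for _ in range(2000):                          # unlabeled data stream
    x = random.uniform(-1.0, 1.0)              # no ground-truth label needed
    soft = softmax(teacher_logits(x), T=2.0)   # teacher's soft label
    student.update(x, soft)
```

After consuming the stream, the student's weights approximate the teacher's temperature-scaled decision function even though no ground-truth labels were ever observed, which is the core mechanism the paper builds on.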
