Detailed Information

Cited 11 times in Web of Science · Cited 17 times in Scopus

Facial Action Units for Training Convolutional Neural Networks

Full metadata record
DC Field                                   Value
dc.contributor.author                      Trinh Thi Doan Pham
dc.contributor.author                      Won, Chee Sun
dc.date.accessioned                        2023-04-28T05:42:22Z
dc.date.available                          2023-04-28T05:42:22Z
dc.date.issued                             2019
dc.identifier.issn                         2169-3536
dc.identifier.uri                          https://scholarworks.dongguk.edu/handle/sw.dongguk/8611
dc.description.abstract                    This paper deals with the problem of training convolutional neural networks (CNNs) with facial action units (AUs). In particular, we focus on the imbalance problem of the training datasets for facial emotion classification. Since training a CNN with an imbalanced dataset tends to yield a learning bias toward the major classes and eventually leads to deterioration in the classification accuracy, it is required to increase the number of training images for the minority classes to have evenly distributed training images over all classes. However, it is difficult to find the images with a similar facial emotion for the oversampling. In this paper, we propose to use the AU features to retrieve an image with a similar emotion. The query selection from the minority class and the AU-based retrieval processes repeat until the numbers of training data over all classes are balanced. Also, to improve the classification accuracy, the AU features are fused with the CNN features to train a support vector machine (SVM) for final classification. The experiments have been conducted on three imbalanced facial image datasets, RAF-DB, FER2013, and ExpW. The results demonstrate that the CNNs trained with the AU features improve the classification accuracy by 3%-4%.
dc.format.extent                           9
dc.language                                English
dc.language.iso                            ENG
dc.publisher                               IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
dc.title                                   Facial Action Units for Training Convolutional Neural Networks
dc.type                                    Article
dc.publisher.location                      United States
dc.identifier.doi                          10.1109/ACCESS.2019.2921241
dc.identifier.scopusid                     2-s2.0-85068350994
dc.identifier.wosid                        000473769900001
dc.identifier.bibliographicCitation        IEEE ACCESS, v.7, pp 77816 - 77824
dc.citation.title                          IEEE ACCESS
dc.citation.volume                         7
dc.citation.startPage                      77816
dc.citation.endPage                        77824
dc.type.docType                            Article
dc.description.isOpenAccess                Y
dc.description.journalRegisteredClass      scie
dc.description.journalRegisteredClass      scopus
dc.relation.journalResearchArea            Computer Science
dc.relation.journalResearchArea            Engineering
dc.relation.journalResearchArea            Telecommunications
dc.relation.journalWebOfScienceCategory    Computer Science, Information Systems
dc.relation.journalWebOfScienceCategory    Engineering, Electrical & Electronic
dc.relation.journalWebOfScienceCategory    Telecommunications
dc.subject.keywordPlus                     CLASS IMBALANCE
dc.subject.keywordPlus                     EXPRESSIONS
dc.subject.keywordAuthor                   Convolutional neural network
dc.subject.keywordAuthor                   facial emotion recognition
dc.subject.keywordAuthor                   data oversampling
dc.subject.keywordAuthor                   facial action units
dc.subject.keywordAuthor                   data imbalance
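The abstract describes two steps: balancing an imbalanced emotion dataset by repeatedly picking a query from a minority class and retrieving a similar-emotion image via its AU features, then fusing AU and CNN features to train an SVM for the final decision. The following is a minimal sketch of that idea, not the authors' implementation: the random feature vectors, dimensions, and the use of the dataset itself as the retrieval pool are all hypothetical stand-ins.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical stand-ins: 17-dim AU activation vectors and 64-dim CNN
# features for a 3-class imbalanced dataset (class sizes 100/30/10).
n_per_class = [100, 30, 10]
au_dim, cnn_dim = 17, 64
X_au, X_cnn, y = [], [], []
for label, n in enumerate(n_per_class):
    X_au.append(rng.normal(loc=label, size=(n, au_dim)))
    X_cnn.append(rng.normal(loc=label, size=(n, cnn_dim)))
    y.append(np.full(n, label))
X_au, X_cnn, y = np.vstack(X_au), np.vstack(X_cnn), np.concatenate(y)

# AU-based oversampling: for each minority class, pick a query sample
# and retrieve its nearest neighbour in AU space (here the pool is the
# dataset itself), repeating until every class matches the majority size.
target = max(n_per_class)
nn = NearestNeighbors(n_neighbors=2).fit(X_au)
au_bal, cnn_bal, y_bal = [X_au], [X_cnn], [y]
for label in range(len(n_per_class)):
    idx = np.flatnonzero(y == label)
    deficit = target - idx.size
    for i in range(deficit):
        q = idx[i % idx.size]                  # query from the minority class
        _, nbrs = nn.kneighbors(X_au[q:q + 1]) # retrieve a similar-AU sample
        r = nbrs[0, 1]                         # skip the query itself
        au_bal.append(X_au[r:r + 1])
        cnn_bal.append(X_cnn[r:r + 1])
        y_bal.append([label])                  # retrieved image assumed same emotion
X_au_b, X_cnn_b = np.vstack(au_bal), np.vstack(cnn_bal)
y_b = np.concatenate(y_bal)

# Feature fusion: concatenate AU and CNN features, then train an SVM.
X_fused = np.hstack([X_au_b, X_cnn_b])
clf = SVC(kernel="linear").fit(X_fused, y_b)
counts = np.bincount(y_b)
print(counts)  # every class now has `target` samples
```

The retrieval step is what distinguishes this from plain duplication-based oversampling: new minority-class samples come from images that are close in AU space, under the assumption that similar AU activations imply a similar expressed emotion.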
Files in This Item
There are no files associated with this item.
Appears in Collections
College of Engineering > Department of Electronics and Electrical Engineering > 1. Journal Articles


Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.
