Vision-Based People Counter Using CNN-Based Event Classification
- Authors
- Cho, Sung In
- Issue Date
- Aug-2020
- Publisher
- IEEE - Institute of Electrical and Electronics Engineers Inc.
- Keywords
- Training; Cameras; Motion segmentation; Stacking; Feature extraction; Image segmentation; Training data; Convolutional neural network (CNN); data augmentation (DA); event classification; people counting
- Citation
- IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, v.69, no.8, pp. 5308-5315
- Pages
- 8
- Indexed
- SCIE; SCOPUS
- Journal Title
- IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT
- Volume
- 69
- Number
- 8
- Start Page
- 5308
- End Page
- 5315
- URI
- https://scholarworks.dongguk.edu/handle/sw.dongguk/6386
- DOI
- 10.1109/TIM.2019.2959853
- ISSN
- 0018-9456; 1557-9662
- Abstract
- This article proposes a convolutional neural network (CNN)-based people counter that classifies a given frame cube into a specific event indicating people entering or exiting a target area, thereby measuring the instantaneous people count. To train the proposed CNN, training input frame cubes and their corresponding class labels, each representing a specific event, are generated using the proposed counting rules. To mitigate the overfitting that may occur during training, data augmentation and post-class correction using the foreground distribution with event probabilities are applied. The experimental results indicate that the proposed method improved the F1 score and accuracy of the cumulative people counting results by up to 9.0% and 14.8%, respectively, compared with the benchmark methods, even though it calculates the cumulative count by summing instantaneous people counts, whereas the benchmark methods are optimized for calculating the cumulative count directly.
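The abstract describes the core pipeline: classify each stacked frame cube into an entering/exiting event, then accumulate the instantaneous counts over time. The following is a minimal Python (PyTorch) sketch of that idea only; the network architecture, the three-class event set, and all names below are assumptions for illustration and are not the paper's actual design, counting rules, or post-class correction.

```python
# Hedged sketch of frame-cube event classification for people counting.
# Assumptions (not from the paper): 3 event classes, a toy 3D CNN, and
# cumulative counting as the running sum of +1 (enter) / -1 (exit) events.
import torch
import torch.nn as nn

NUM_EVENTS = 3  # hypothetical classes: 0 = no event, 1 = enter, 2 = exit


class FrameCubeClassifier(nn.Module):
    """Toy 3D CNN mapping a frame cube (T stacked grayscale frames) to an
    event class. Layer sizes are placeholders, not the paper's network."""

    def __init__(self) -> None:
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool3d(2),
            nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),  # global pooling over (T, H, W)
        )
        self.classifier = nn.Linear(32, NUM_EVENTS)

    def forward(self, cube: torch.Tensor) -> torch.Tensor:
        # cube shape: (batch, 1, T, H, W) -> event logits (batch, NUM_EVENTS)
        return self.classifier(self.features(cube).flatten(1))


def cumulative_count(event_probs: torch.Tensor) -> int:
    """Sum instantaneous counts over a sequence of classified frame cubes:
    +1 per predicted 'enter' event, -1 per predicted 'exit' event.
    event_probs: (num_cubes, NUM_EVENTS) per-cube class probabilities."""
    events = event_probs.argmax(dim=1)
    return int((events == 1).sum() - (events == 2).sum())


if __name__ == "__main__":
    model = FrameCubeClassifier()
    cubes = torch.randn(5, 1, 8, 64, 64)  # 5 cubes of 8 frames, 64x64 each
    probs = torch.softmax(model(cubes), dim=1)
    print("net change in people count:", cumulative_count(probs))
```

This mirrors the abstract's key design choice: the cumulative count is derived purely by summing per-cube instantaneous event decisions, rather than by a model optimized for the cumulative count itself. The paper's data augmentation and foreground-distribution-based post-class correction are not sketched here.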
- Appears in Collections
- College of Advanced Convergence Engineering > Department of Computer Science and Artificial Intelligence > 1. Journal Articles
