Flow analysis-based fast-moving flow calibration for a people-counting system
- Authors
- Park, Jae Hyeon; Cho, Sung In
- Issue Date
- Sep-2021
- Publisher
- SPRINGER
- Keywords
- Vision-based people-counting; Flow analysis; Foreground extraction; LOI-based people-counting
- Citation
- MULTIMEDIA TOOLS AND APPLICATIONS, v.80, no.21-23, pp 31671 - 31685
- Pages
- 15
- Indexed
- SCIE; SCOPUS
- Journal Title
- MULTIMEDIA TOOLS AND APPLICATIONS
- Volume
- 80
- Number
- 21-23
- Start Page
- 31671
- End Page
- 31685
- URI
- https://scholarworks.dongguk.edu/handle/sw.dongguk/4554
- DOI
- 10.1007/s11042-021-11231-1
- ISSN
- 1380-7501; 1573-7721
- Abstract
- We propose a new vision-based people-counting method that uses flow analysis together with the movement speed of a person to improve counting accuracy. The proposed method consists of two procedures: simple estimation of foreground movement speed and multiple-people detection based on flow analysis. First, we extract the flow generated by foreground movement and compute its volume by accumulating the foreground pixels on a line of interest (LOI) while people enter and exit the target region. Second, the number of frames containing foreground in the LOI for each entry and exit event is counted to estimate the speed of the flow cluster. Finally, the number of people is estimated from the flow volume (FV) and the number of frames. In the experiments, the proposed method improved the average F1 score and accuracy by up to 25% and 9%, respectively, compared with existing people-counting methods. The results confirm that the proposed method achieves substantial accuracy improvements over existing methods when people pass the target region with various speed patterns.
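The abstract's pipeline (accumulate foreground pixels on the LOI into a flow volume, count the frames the LOI stays active to gauge speed, then combine both into a head count) can be sketched roughly as below. This is a minimal illustration, not the paper's implementation: the function name, the per-person pixel and frame calibration constants, and the binary foreground masks as input are all assumptions.

```python
import numpy as np

def estimate_count(masks, loi_row, pixels_per_person=400.0, frames_per_person=10.0):
    """Sketch of LOI-based counting for one entry/exit event.

    masks: sequence of binary foreground maps (frames x H x W).
    loi_row: image row used as the line of interest (hypothetical choice).
    pixels_per_person, frames_per_person: calibration constants a real
    system would estimate, not values from the paper.
    """
    flow_volume = 0    # foreground pixels accumulated on the LOI (the FV)
    active_frames = 0  # frames in which the LOI contains any foreground
    for mask in masks:
        loi_pixels = int(mask[loi_row, :].sum())
        if loi_pixels > 0:
            flow_volume += loi_pixels
            active_frames += 1
    if active_frames == 0:
        return 0
    # A slow crossing keeps the LOI active longer and so accumulates more
    # pixels per person; dividing by a speed factor normalizes the flow
    # volume before converting pixels to a people count.
    speed_factor = active_frames / frames_per_person
    return int(round(flow_volume / (pixels_per_person * speed_factor)))
```

For example, with the default constants, one blob 40 pixels wide crossing the LOI for 10 frames yields FV = 400 and a speed factor of 1, giving a count of 1; two such blobs side by side double the FV and give 2.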
- Appears in Collections
- College of Advanced Convergence Engineering > Department of Computer Science and Artificial Intelligence > 1. Journal Articles
