Hybrid Traffic Accident Classification Models (open access)
- Authors
- Zhang, Yihang; Sung, Yunsick
- Issue Date
- Feb-2023
- Publisher
- MDPI
- Keywords
- traffic accident classification; trajectory tracking; YOLO; Deep SORT; convolutional neural network; vision transformer
- Citation
- Mathematics, v.11, no.4, pp. 1-16
- Pages
- 16
- Indexed
- SCIE; SCOPUS
- Journal Title
- Mathematics
- Volume
- 11
- Number
- 4
- Start Page
- 1
- End Page
- 16
- URI
- https://scholarworks.dongguk.edu/handle/sw.dongguk/19208
- DOI
- 10.3390/math11041050
- ISSN
- 2227-7390
- Abstract
- Traffic closed-circuit television (CCTV) devices can be used to detect and track objects on roads by designing and applying artificial intelligence and deep learning models. However, extracting useful information from the detected objects and determining the occurrence of traffic accidents are usually difficult. This paper proposes a CCTV frame-based hybrid traffic accident classification model that enables the identification of whether a frame includes accidents by generating object trajectories. The proposed model utilizes a Vision Transformer (ViT) and a Convolutional Neural Network (CNN) to extract latent representations from each frame and corresponding trajectories. The fusion of frame and trajectory features was performed to improve the traffic accident classification ability of the proposed hybrid method. In the experiments, the Car Accident Detection and Prediction (CADP) dataset was used to train the hybrid model, and the accuracy of the model was approximately 97%. The experimental results indicate that the proposed hybrid method demonstrates an improved classification performance compared to traditional models.
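The fusion step described in the abstract — combining a per-frame latent vector (from the ViT branch) with a trajectory latent vector (from the CNN branch) before classification — can be illustrated with a minimal stdlib-Python sketch. This is not the paper's implementation; the concatenation-based fusion, the logistic head, and all function names here are illustrative assumptions.

```python
import math

def fuse_features(frame_feat, traj_feat):
    # Illustrative fusion: concatenate the frame latent (e.g., from a ViT)
    # with the trajectory latent (e.g., from a CNN) into one vector.
    return frame_feat + traj_feat  # list concatenation

def classify(fused, weights, bias=0.0):
    # Hypothetical classification head: a single logistic unit scoring
    # the fused representation as "accident" vs. "no accident".
    logit = sum(f * w for f, w in zip(fused, weights)) + bias
    return 1.0 / (1.0 + math.exp(-logit))

# Toy stand-ins for learned embeddings (dimensions chosen arbitrarily).
frame_feat = [0.2, -0.1, 0.5, 0.3]   # stand-in for a ViT frame latent
traj_feat = [0.7, 0.0, -0.4]         # stand-in for a CNN trajectory latent

fused = fuse_features(frame_feat, traj_feat)   # 7-dimensional fused vector
weights = [0.5, -0.2, 0.1, 0.4, -0.3, 0.2, 0.6]
p_accident = classify(fused, weights)          # probability in (0, 1)
```

In the actual model, the fused representation would feed a trained classification layer; the sketch only shows the data flow of combining two modality-specific latents into a single accident/no-accident score.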
- Files in This Item
- There are no files associated with this item.
- Appears in
Collections - College of Advanced Convergence Engineering > Department of Computer Science and Artificial Intelligence > 1. Journal Articles

Items in ScholarWorks are protected by copyright, with all rights reserved, unless otherwise indicated.