Real-time Traffic Monitoring System Based on Deep Learning and YOLOv8
Saif B. Neamah, Abdulamir A. Karim
ARO-The Scientific Journal of Koya University (JCR Q3, Multidisciplinary Sciences), published 2023-11-16
DOI: 10.14500/aro.11327
Citations: 0
Abstract
Computer vision applications matter because they provide cost-effective solutions to critical traffic problems, helping reduce accidents and preserve lives. This paper proposes a real-time traffic-monitoring system based on cutting-edge deep learning through the state-of-the-art You Only Look Once v8 (YOLOv8) algorithm, exploiting its vehicle detection, classification, and segmentation capabilities. The proposed work provides important traffic information, including vehicle counting, classification, speed estimation, and size estimation; this information helps enforce traffic laws. The proposed system consists of five stages: the preprocessing stage, which includes camera calibration, region-of-interest (ROI) calculation, and preparation of the source video input; the vehicle detection stage, which uses a convolutional neural network model to localize vehicles in the video frames; the tracking stage, which uses the ByteTrack algorithm to track the detected vehicles; the speed estimation stage, which estimates the speed of the tracked vehicles; and the size estimation stage, which estimates vehicle size. Results for the proposed system running on an Nvidia GTX 1070 GPU show that the detection and tracking stages achieve an average accuracy of 96.58% (3.42% average error), the vehicle counting stage 97.54% (2.46% average error), the speed estimation stage 96.75% (3.25% average error), and the size estimation stage 87.28% (12.72% average error).
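The speed-estimation stage described above can be illustrated with a minimal sketch: given a tracked vehicle's centroid positions in two frames, the video frame rate, and a pixels-per-meter calibration factor (assumed to come from the camera-calibration step in preprocessing), convert pixel displacement to km/h. The function name, arguments, and calibration value below are illustrative assumptions, not the paper's actual implementation.

```python
import math

def estimate_speed_kmh(p1, p2, frames_elapsed, fps, pixels_per_meter):
    """Estimate vehicle speed in km/h from two centroid positions.

    p1, p2: (x, y) centroid positions in pixels at two points in time.
    frames_elapsed: number of frames between the two observations.
    fps: video frame rate.
    pixels_per_meter: calibration factor mapping pixels to meters
        (assumed known from camera calibration / ROI setup).
    """
    # Pixel displacement between the two tracked positions.
    pixel_distance = math.hypot(p2[0] - p1[0], p2[1] - p1[1])
    # Convert to meters via the calibration factor.
    distance_m = pixel_distance / pixels_per_meter
    # Elapsed time in seconds from the frame count and frame rate.
    time_s = frames_elapsed / fps
    # m/s -> km/h.
    return (distance_m / time_s) * 3.6

# Example: a vehicle moves 250 px in 30 frames at 30 fps with 10 px/m:
# 25 m in 1 s = 25 m/s = 90 km/h.
speed = estimate_speed_kmh((100, 400), (100, 650), 30, 30, 10)
```

In a real deployment the pixels-per-meter factor varies with perspective across the frame, which is one reason the paper's preprocessing stage performs camera calibration before speeds are computed.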