A YOLOv5-based Deep Learning Model for In-Situ Detection and Maturity Grading of Mango
Jonathan S. Ignacio, Katreen Nicole A. Eisma, M. V. Caya
2022 6th International Conference on Communication and Information Systems (ICCIS), published 2022-10-14. DOI: 10.1109/ICCIS56375.2022.9998163
Citations: 1
Abstract
In this paper, the researchers identify mangoes using YOLOv5 and classify their degree of ripeness using the CIELAB color space. CIELAB was chosen because it closely resembles how the human eye perceives color. Because OpenCV reads frames in RGB, the researchers convert those RGB pixels into the L*a*b* color space. In the CIELAB system, colors are defined along three dimensions: L* (lightness), a* (red-green), and b* (blue-yellow). The researchers used the a* and b* channels to determine the ranges corresponding to ripe and unripe mangoes. The system runs on a Raspberry Pi 4 connected to a camera that captures a real-time video feed. Results showed that the device could recognize mangoes even when mixed with other fruits of comparable size, shape, and color. However, the device sometimes mistakes a lemon for a mango, possibly due to the camera's poor color perception, darkened regions in the acquired image, or insufficient illumination. The researchers suggest selecting a response time between three and five seconds so the camera can accurately detect the fruit. Overall, the accuracy obtained under artificial lighting and actual in-store lighting was 86.26% and 83.08%, respectively.
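To make the pipeline described above concrete, the sketch below shows one plausible way to combine YOLOv5 detection with a CIELAB-based ripeness check on each detected region. This is not the authors' code: the weights file name, camera index, and the a*/b* thresholds are hypothetical placeholders, and the YOLOv5 model is loaded through the public ultralytics/yolov5 torch.hub interface rather than whatever deployment the paper used on the Raspberry Pi.

```python
# Illustrative sketch only: YOLOv5 mango detection + CIELAB ripeness grading.
# Weights path, thresholds, and camera index are assumptions, not values from the paper.
import cv2
import torch
import numpy as np

# Load a custom YOLOv5 model via torch.hub; "mango_best.pt" is an assumed weights file.
model = torch.hub.load("ultralytics/yolov5", "custom", path="mango_best.pt")

def grade_ripeness(crop_bgr: np.ndarray) -> str:
    """Classify a detected mango crop as ripe/unripe from its mean a* and b* values.
    Threshold values below are illustrative, not those reported in the paper."""
    lab = cv2.cvtColor(crop_bgr, cv2.COLOR_BGR2LAB)   # OpenCV frames are BGR-ordered
    _, a, b = cv2.split(lab)
    mean_a, mean_b = float(a.mean()), float(b.mean())
    # Ripening shifts the skin toward red (higher a*) and yellow (higher b*).
    if mean_a > 140 and mean_b > 150:                 # assumed 8-bit LAB thresholds
        return "ripe"
    return "unripe"

cap = cv2.VideoCapture(0)                             # camera index 0 assumed
while True:
    ok, frame = cap.read()
    if not ok:
        break
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)      # YOLOv5 expects RGB input
    results = model(rgb)
    # results.xyxy[0] holds rows of [x1, y1, x2, y2, confidence, class].
    for *xyxy, conf, cls in results.xyxy[0].tolist():
        x1, y1, x2, y2 = map(int, xyxy)
        label = grade_ripeness(frame[y1:y2, x1:x2])
        cv2.rectangle(frame, (x1, y1), (x2, y2), (0, 255, 0), 2)
        cv2.putText(frame, f"mango ({label})", (x1, y1 - 5),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)
    cv2.imshow("mango", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```

In a sketch like this, the per-crop mean of the a* and b* channels stands in for whatever ripeness ranges the authors derived; in practice those ranges would be calibrated from labeled ripe and unripe samples under the intended lighting conditions.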