{"title":"Unsupervised Image Segmentation using Convolutional Neural Networks for Automated Crop Monitoring","authors":"Prakruti V. Bhatt, Sanat Sarangi, S. Pappula","doi":"10.5220/0007687508870893","DOIUrl":null,"url":null,"abstract":"Among endeavors towards automation in agriculture, localization and segmentation of various events during the growth cycle of a crop is critical and can be challenging in a dense foliage. Convolutional Neural Network based methods have been used to achieve state-of-the-art results in supervised image segmentation. In this paper, we investigate the unsupervised method of segmentation for monitoring crop growth and health conditions. Individual segments are then evaluated for their size, color, and texture in order to measure the possible change in the crop like emergence of a flower, fruit, deficiency, disease or pest. Supervised methods require ground truth labels of the segments in a large number of the images for training a neural network which can be used for similar kind of images on which the network is trained. Instead, we use information of spatial continuity in pixels and boundaries in a given image to update the feature representation and label assignment to every pixel using a fully convolutional network. Given that manual labeling of crop images is time consuming but quantifying an event occurrence in the farm is of utmost importance, our proposed approach achieves promising results on images of crops captured in different conditions. We obtained 94% accuracy in segmenting Cabbage with Black Moth pest, 81% in getting segments affected by Helopeltis pest on Tea leaves and 92% in spotting fruits on a Citrus tree where accuracy is defined in terms of intersection over union of the resulting segments with the ground truth. The resulting segments have been used for temporal crop monitoring and severity measurement in case of disease or pest manifestations.","PeriodicalId":410036,"journal":{"name":"International Conference on Pattern Recognition Applications and Methods","volume":"68 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-02-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Conference on Pattern Recognition Applications and Methods","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.5220/0007687508870893","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Among endeavors towards automation in agriculture, localization and segmentation of various events during the growth cycle of a crop is critical and can be challenging in dense foliage. Convolutional Neural Network (CNN) based methods have been used to achieve state-of-the-art results in supervised image segmentation. In this paper, we investigate an unsupervised segmentation method for monitoring crop growth and health conditions. Individual segments are then evaluated for their size, color, and texture in order to detect possible changes in the crop, such as the emergence of a flower or fruit, or the onset of a deficiency, disease, or pest. Supervised methods require ground-truth labels of the segments in a large number of images to train a neural network, which can then be applied only to images similar to those it was trained on. Instead, we use the spatial continuity of pixels and the boundaries in a given image to update the feature representation and the label assignment of every pixel using a fully convolutional network. Given that manual labeling of crop images is time-consuming while quantifying the occurrence of an event on the farm is of utmost importance, our proposed approach achieves promising results on images of crops captured under different conditions. We obtained 94% accuracy in segmenting cabbage affected by black moth pest, 81% in extracting segments affected by Helopeltis pest on tea leaves, and 92% in spotting fruits on a citrus tree, where accuracy is defined as the intersection over union (IoU) of the resulting segments with the ground truth. The resulting segments have been used for temporal crop monitoring and for measuring severity in cases of disease or pest manifestation.
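The abstract describes jointly updating a fully convolutional network's per-pixel feature representation and label assignment under a spatial-continuity constraint, but gives no implementation details. The sketch below is a minimal illustration of that general idea, not the authors' method: the network architecture, the self-training loss, the smoothness penalty, the IoU helper, and all hyperparameters are assumptions introduced here for clarity (PyTorch assumed).

```python
# Minimal sketch of single-image unsupervised segmentation with a small
# fully convolutional network. Assumption: PyTorch; not the authors' code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyFCN(nn.Module):
    """Small fully convolutional network producing per-pixel class scores."""
    def __init__(self, in_ch=3, feat_ch=64, n_classes=32):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(in_ch, feat_ch, 3, padding=1), nn.ReLU(),
            nn.BatchNorm2d(feat_ch),
            nn.Conv2d(feat_ch, feat_ch, 3, padding=1), nn.ReLU(),
            nn.BatchNorm2d(feat_ch),
        )
        self.head = nn.Conv2d(feat_ch, n_classes, 1)  # 1x1 conv: per-pixel scores

    def forward(self, x):
        return self.head(self.body(x))

def segment(image, n_iters=100, lr=0.1, smooth_w=1.0):
    """image: float tensor of shape (1, 3, H, W) scaled to [0, 1]."""
    net = TinyFCN()
    opt = torch.optim.SGD(net.parameters(), lr=lr, momentum=0.9)
    for _ in range(n_iters):
        scores = net(image)                      # (1, C, H, W)
        labels = scores.argmax(1)                # current label of every pixel
        # Self-training term: fit the network to its own argmax labels.
        loss = F.cross_entropy(scores, labels)
        # Spatial-continuity term (assumption): penalise score differences
        # between horizontally and vertically adjacent pixels.
        loss = loss + smooth_w * (
            (scores[..., 1:, :] - scores[..., :-1, :]).abs().mean()
            + (scores[..., :, 1:] - scores[..., :, :-1]).abs().mean()
        )
        opt.zero_grad()
        loss.backward()
        opt.step()
        if labels.unique().numel() <= 2:         # stop when few segments remain
            break
    return labels.squeeze(0)                     # (H, W) segment map

def iou(pred_mask, gt_mask):
    """Intersection over union of two boolean masks (evaluation only)."""
    inter = (pred_mask & gt_mask).sum()
    union = (pred_mask | gt_mask).sum()
    return inter.float() / union.clamp(min=1).float()
```

In this style of approach, the network is optimized on a single image: its own argmax labels serve as self-training targets while the continuity term discourages fragmented segments. A hand-labelled mask and the IoU helper would only be used for evaluation, in the sense of the accuracies reported in the abstract.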