2024 Index IEEE Transactions on AgriFood Electronics Vol. 2
Pub Date: 2024-10-24 | DOI: 10.1109/TAFE.2024.3483630
IEEE Transactions on AgriFood Electronics, vol. 2, no. 2, pp. 638–652.
IEEE Circuits and Systems Society Information
Pub Date: 2024-10-10 | DOI: 10.1109/TAFE.2024.3472304
IEEE Transactions on AgriFood Electronics, vol. 2, no. 2, pp. C2–C2.
IEEE Circuits and Systems Society Information
Pub Date: 2024-10-10 | DOI: 10.1109/TAFE.2024.3472308
IEEE Transactions on AgriFood Electronics, vol. 2, no. 2, pp. C3–C3.
Guest Editorial Special Issue on IEEE Conference on AgriFood Electronics (CAFE 2023)
Pub Date: 2024-10-10 | DOI: 10.1109/TAFE.2024.3468408
Francois Rivet;Matías Miguez
The global food and agriculture industry is rapidly evolving, driven by advances in electronic technologies and data-driven methodologies. These innovations are critical to addressing the pressing challenges of food security, sustainable farming, and precision agriculture. The first edition of the IEEE Conference on AgriFood Electronics (CAFE 2023) was held in Torino, Italy. It highlighted the groundbreaking research in these areas, bringing together experts from academia and industry to discuss the latest technological advancements in agrifood electronics.
{"title":"Guest Editorial Special Issue on IEEE Conference on AgriFood Electronics (CAFE 2023)","authors":"Francois Rivet;Matías Miguez","doi":"10.1109/TAFE.2024.3468408","DOIUrl":"https://doi.org/10.1109/TAFE.2024.3468408","url":null,"abstract":"The global food and agriculture industry is rapidly evolving, driven by advances in electronic technologies and data-driven methodologies. These innovations are critical to addressing the pressing challenges of food security, sustainable farming, and precision agriculture. The first edition of the IEEE Conference on AgriFood Electronics (CAFE 2023) was held in Torino, Italy. It highlighted the groundbreaking research in these areas, bringing together experts from academia and industry to discuss the latest technological advancements in agrifood electronics.","PeriodicalId":100637,"journal":{"name":"IEEE Transactions on AgriFood Electronics","volume":"2 2","pages":"168-169"},"PeriodicalIF":0.0,"publicationDate":"2024-10-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10713412","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142430882","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
WAPPFRUIT—An Automatic System for Drip Irrigation in Orchards Based on Real-Time Soil Matric Potential Data
Pub Date: 2024-10-02 | DOI: 10.1109/TAFE.2024.3455171
Mattia Barezzi;Alessandro Sanginario;Davide Canone;Davide Gisolo;Alessio Gentile;Luca Nari;Francesca Pettiti;Umberto Garlando
Water is a not-so-renewable resource, and agriculture accounts for more than 70% of freshwater use worldwide. Given population growth, it is essential to reduce water usage. The WAPPFRUIT project aims to design an automatic irrigation system based on soil water-availability data gathered directly in the orchards. Matric potential data are used to determine the exact water demand of the trees, using thresholds adapted to the actual soil and crop type. Furthermore, an electronic system built from simple, small, ultra-low-power devices works together with an automatic algorithm to manage the watering events. We tested this approach in three orchards in northwest Italy, comparing it with the practice used by the farmers. The results show an average water saving of nearly 50% while keeping fruit production comparable to the reference solution. This approach is a clear example of how electronics and technology can have a real impact on agriculture and food production.
{"title":"WAPPFRUIT—An Automatic System for Drip Irrigation in Orchards Based on Real-Time Soil Matric Potential Data","authors":"Mattia Barezzi;Alessandro Sanginario;Davide Canone;Davide Gisolo;Alessio Gentile;Luca Nari;Francesca Pettiti;Umberto Garlando","doi":"10.1109/TAFE.2024.3455171","DOIUrl":"https://doi.org/10.1109/TAFE.2024.3455171","url":null,"abstract":"Water is a not-so-renewable resource. Agriculture is impacting for more than 70% of fresh water use worldwide. Considering the increase of population it is fundamental to act in order to reduce water usage. The WAPPFRUIT project aims to design an automatic irrigation system, based on data of water availability in the soil gathered directly in the orchards. Matric potential data are used to determine the exact water demand of the trees, thanks to specific thresholds adapted to the actual soil and crop type. Furthermore, an electronic system based on simple, small, and ultra-low-power devices works together an automatic algorithm to manage the watering events. We tested this approach in three orchards in north-west Italy, comparing our approach to the one used by the farmers. The results show an average water saving of nearly 50% keeping the fruit production comparable to the reference solution. This approach is a clear example of how electronics and technology can really impact agriculture and food production.","PeriodicalId":100637,"journal":{"name":"IEEE Transactions on AgriFood Electronics","volume":"2 2","pages":"293-305"},"PeriodicalIF":0.0,"publicationDate":"2024-10-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142430861","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Deep Learning-Based Maize Crop Disease Classification Model in Telangana Region of South India
Pub Date: 2024-10-01 | DOI: 10.1109/TAFE.2024.3433348
M. Nagaraju;Priyanka Chawla
Maize, one of India's main crops, accounts for 2–3% of global production. Disease detection in maize fields has become increasingly difficult due to a lack of knowledge about disease symptoms, and manual detection methods are slow and ineffective. Recent developments in convolutional neural networks (CNNs) have shown remarkable performance in disease recognition and classification: a CNN extracts features from an image and classifies the disease effectively. Hyperparameter optimization is a tedious problem that strongly affects model performance, and the main purpose of this research is to help future work configure suitable hyperparameters. In the present work, a deep CNN is proposed for classifying seven different diseases of the maize crop. Several hyperparameters, such as image size, batch size, number of epochs, optimizer, learning rate, kernel size, and number of hidden layers, were tested with various values. Training for 200 epochs improved the classification accuracy to 87.44%, and input image sizes of 168 × 168 and 224 × 224 yielded accuracies of 84.66% and 85.23%, respectively. The proposed deep CNN attained 85.83% classification accuracy with the Adam optimizer and a learning rate of 0.001, whereas root-mean-square propagation (81.95%) and stochastic gradient descent (79.66%) performed worse. These results provide practical guidance for selecting appropriate hyperparameters for plant disease classification.
{"title":"Deep Learning-Based Maize Crop Disease Classification Model in Telangana Region of South India","authors":"M. Nagaraju;Priyanka Chawla","doi":"10.1109/TAFE.2024.3433348","DOIUrl":"https://doi.org/10.1109/TAFE.2024.3433348","url":null,"abstract":"One of India's main crops, maize, accounts for 2–3% of global production. Disease detection in maize fields has become increasingly difficult due to a lack of knowledge about disease symptoms. Furthermore, manual disease detection methods take a lot of time and are not effective. Recent developments in convolutional neural networks (CNNs) have exhibited remarkable performance in disease recognition and classification. A CNN is a deep learning technique that extracts the features from an image and performs the disease classification effectively. The optimization of hyperparameters is a tedious problem that impacts the performance of a model. The main purpose of the present research is to support future research to configure suitable hyperparameters to a model. In the present work, a deep CNN is proposed for the classification of seven different diseases of maize crop. Several hyperparameters, such as image size, batch size, number of epochs, optimizers, learning rate, kernel size, and number of hidden layers, were tested with various values in the experimental approach. The obtained results show that running the model for 200 epochs improved the classification accuracy with 87.44%. It also states that choosing input image sizes of 168 × 168 and 224 × 224 resulted in a good classification accuracy of 84.66% and 85.23%, respectively. The proposed deep CNN model has attained 85.83% classification accuracy with the Adam optimizer and a learning rate of 0.001. However, the results achieved by other optimizers, such as root-mean-square propagation (81.95%) and stochastic gradient descent (79.66%), are not better when compared with the Adam optimizer. Finally, the results have provided a better knowledge in selecting appropriate hyperparameters to the application of plant disease classification.","PeriodicalId":100637,"journal":{"name":"IEEE Transactions on AgriFood Electronics","volume":"2 2","pages":"627-637"},"PeriodicalIF":0.0,"publicationDate":"2024-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142430799","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Planning the Greenhouse Climatic Mapping Using an Agricultural Robot and Recurrent-Neural-Network-Based Virtual Sensors
Pub Date: 2024-10-01 | DOI: 10.1109/TAFE.2024.3460970
Claudio Tomazzoli;Davide Quaglia;Sara Migliorini
Assuming climatic homogeneity is no longer acceptable in greenhouse farming since it can result in less-than-ideal agronomic decisions. Indeed, several approaches have been proposed based on installing sensors in predefined points of interest (PoIs) to obtain a better mapping of climatic conditions. However, these approaches suffer from two main problems, i.e., identifying the most significant PoIs inside the greenhouse and placing a sensor at each PoI, which may be costly and incompatible with field operations. As regards the first problem, we propose a genetic algorithm to identify the best sensing places based on the agronomic definition of zones of interest. As regards the second problem, we exploit agricultural robots to collect climatic information to train a set of virtual sensors based on recurrent neural networks. The proposed solution has been tested on a real-world dataset regarding a greenhouse in Verona (Italy).
{"title":"Planning the Greenhouse Climatic Mapping Using an Agricultural Robot and Recurrent-Neural- Network-Based Virtual Sensors","authors":"Claudio Tomazzoli;Davide Quaglia;Sara Migliorini","doi":"10.1109/TAFE.2024.3460970","DOIUrl":"https://doi.org/10.1109/TAFE.2024.3460970","url":null,"abstract":"Assuming climatic homogeneity is no longer acceptable in greenhouse farming since it can result in less-than-ideal agronomic decisions. Indeed, several approaches have been proposed based on installing sensors in predefined points of interest (PoIs) to obtain a better mapping of climatic conditions. However, these approaches suffer from two main problems, i.e., identifying the most significant PoIs inside the greenhouse and placing a sensor at each PoI, which may be costly and incompatible with field operations. As regards the first problem, we propose a genetic algorithm to identify the best sensing places based on the agronomic definition of zones of interest. As regards the second problem, we exploit agricultural robots to collect climatic information to train a set of virtual sensors based on recurrent neural networks. The proposed solution has been tested on a real-world dataset regarding a greenhouse in Verona (Italy).","PeriodicalId":100637,"journal":{"name":"IEEE Transactions on AgriFood Electronics","volume":"2 2","pages":"617-626"},"PeriodicalIF":0.0,"publicationDate":"2024-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10701545","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"142409064","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
The codling moth pest poses a significant threat to global crop production, with potential losses of up to 80% in apple orchards. Camera-based sensor nodes are deployed in the field to record and transmit images of trapped insects in order to monitor the presence of the pest. This article investigates embedding computer vision algorithms in the sensor node using a novel state-of-the-art microcontroller unit (MCU), GreenWaves Technologies' GAP9 system-on-chip, which combines 10 RISC-V general-purpose cores with a convolution hardware accelerator. We compare the performance of a lightweight Viola–Jones detector with a convolutional neural network (CNN), MobileNetV3-SSDLite, trained for the pest detection task. On two datasets that differ in the distance between the camera sensor and the pest targets, the CNN generalizes better than the other method and achieves a detection accuracy between 72% and 83%. Thanks to the GAP9's CNN accelerator, the CNN inference task takes only 147 ms
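The paper deploys the detector on the GAP9 MCU through its own toolchain; purely as a desktop-side illustration of the detection model family named in the abstract, the torchvision implementation of MobileNetV3-SSDLite can be instantiated for a small class set as below. The two-class setup (background plus codling moth) and the confidence threshold are assumptions, and this is not the authors' training or deployment pipeline.

```python
# Desktop-side sketch: instantiate SSDLite with a MobileNetV3 backbone for a
# two-class pest-detection task and run one forward pass on a dummy trap image.
import torch
from torchvision.models.detection import ssdlite320_mobilenet_v3_large

# num_classes includes the background class; weights are left untrained here.
model = ssdlite320_mobilenet_v3_large(weights=None, weights_backbone=None, num_classes=2)
model.eval()

with torch.no_grad():
    image = torch.rand(3, 320, 320)        # stand-in for a trap image
    detections = model([image])[0]         # dict with 'boxes', 'scores', 'labels'
    keep = detections["scores"] > 0.5      # simple confidence threshold
    print(detections["boxes"][keep])
```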