Title: Towards Automating Precision Irrigation: Deep Learning to Infer Local Soil Moisture Conditions from Synthetic Aerial Agricultural Images
Authors: David Tseng, David Wang, Carolyn L. Chen, Lauren Miller, W. Song, J. Viers, S. Vougioukas, Stefano Carpin, J. A. Ojea, Ken Goldberg
Venue: 2018 IEEE 14th International Conference on Automation Science and Engineering (CASE), pages 284-291
DOI: 10.1109/COASE.2018.8560431 (https://doi.org/10.1109/COASE.2018.8560431)
Publication date: 2018-08-01
Citations: 25
Abstract
Recent advances in unmanned aerial vehicles suggest that collecting aerial agricultural images can be cost-efficient, which can subsequently support automated precision irrigation. To study the potential for machine learning to learn local soil moisture conditions directly from such images, we developed a very fast, linear discrete-time simulation of plant growth based on the Richards equation. We use the simulator to generate large datasets of synthetic aerial images of a vineyard with known moisture conditions and then compare seven methods for inferring moisture conditions from images, in which the “uncorrelated plant” methods look at individual plants and the “correlated field” methods look at the entire vineyard: 1) constant prediction baseline, 2) linear Support Vector Machines (SVM), 3) Random Forests Uncorrelated Plant (RFUP), 4) Random Forests Correlated Field (RFCF), 5) two-layer Neural Networks (NN), 6) Deep Convolutional Neural Networks Uncorrelated Plant (CNNUP), and 7) Deep Convolutional Neural Networks Correlated Field (CNNCF). Experiments on held-out test images show that a globally-connected CNN performs best with normalized mean absolute error of 3.4%. Sensitivity experiments suggest that learned global CNNs are robust to injected noise in both the simulator and generated images as well as in the size of the training sets. In simulation, we compare the agricultural standard of flood irrigation to a proportional precision irrigation controller using the output of the global CNN and find that the latter can reduce water consumption by up to 52% and is also robust to errors in irrigation level, location, and timing. The first-order plant simulator and datasets are available at https://github.com/BerkeleyAutomation/RAPID.
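The abstract contrasts constant-dose flood irrigation with a proportional controller that irrigates in proportion to the estimated moisture deficit. The following is a minimal illustrative sketch of that comparison, assuming a toy linear discrete-time moisture update; the update coefficients, controller gain, and target here are invented for illustration and are not the paper's actual model (the authors' first-order plant simulator is in the RAPID repository linked above).

```python
import numpy as np

# Hypothetical linear discrete-time soil-moisture update (illustrative
# coefficients, not the paper's calibrated Richards-equation model).
def moisture_step(m, irrigation, drainage_rate=0.1, uptake=0.02):
    """One time step: add irrigation, lose a fixed fraction to
    drainage plus a constant plant-uptake term; clamp to [0, 1]."""
    return float(np.clip(m + irrigation - drainage_rate * m - uptake, 0.0, 1.0))

def proportional_controller(m, target=0.5, gain=0.5):
    """Irrigate in proportion to the moisture deficit below target."""
    return gain * max(target - m, 0.0)

# Compare a constant flood dose against the proportional controller.
m_flood = m_prop = 0.3
water_flood = water_prop = 0.0
for _ in range(100):
    dose_flood = 0.08  # fixed flood-irrigation dose per step
    dose_prop = proportional_controller(m_prop)
    m_flood = moisture_step(m_flood, dose_flood)
    m_prop = moisture_step(m_prop, dose_prop)
    water_flood += dose_flood
    water_prop += dose_prop

print(f"water used  flood: {water_flood:.2f}  proportional: {water_prop:.2f}")
```

Even in this toy setting, the proportional controller settles near the target and stops over-watering, so it uses less total water than the fixed dose, mirroring the qualitative direction of the paper's reported savings (up to 52% in their simulations, not reproduced by this sketch).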