Proposal and Evaluation of A Method for Automatically Classifying Images of Agricultural Work and Animals Acquired with Motion Sensor Cameras

Masanori Takagi, Ryu Hirano
2019 IEEE 2nd International Conference on Information and Computer Technologies (ICICT)
DOI: 10.1109/INFOCT.2019.8710963
Published: 2019-03-01
Citations: 2

Abstract

We have developed a field monitoring system (http://kansatu.net) that uses network-connected cameras and sensors installed in apple trees to support the hands-on agriculture curriculum of an elementary school. The system collects and stores a variety of images taken by network-connected cameras equipped with infrared motion sensors; these images capture agricultural work and animals. The system has been in operation since 2011, a period of seven years, and in some years more than 50,000 images were collected. However, these data have not been used effectively for learning about agricultural work. In addition, damage to agricultural crops by wild animals has been increasing in recent years. Against this background, our ultimate goal is to utilize the images captured by the motion sensor-equipped network cameras both to develop countermeasures that prevent wild animals from damaging agricultural crops and to educate and cultivate future farmers. In this study, we proposed a method that employs image analysis technology together with the date and time of image capture to automatically classify the images acquired by the motion sensor network cameras by type of agricultural work and animal, and we developed a method for automatically classifying the type of agricultural work. We evaluated the accuracy of the developed method by comparing the results of automated classification with those of manual classification. The recall of the proposed method exceeded 90% for all three types of agricultural work tested, which was equal to or greater than the accuracy achieved with manual classification.
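The abstract gives no implementation details, but the pipeline it describes (an image-analysis signal combined with the capture timestamp to assign each image a class, evaluated by per-class recall against manual labels) could be sketched roughly as follows. The class labels, the score threshold, and the daytime heuristic below are illustrative assumptions for the sketch, not the authors' actual method.

```python
from datetime import datetime

def classify(image_score: float, captured_at: datetime) -> str:
    """Toy classifier combining an image-analysis score with capture time.

    `image_score` stands in for the output of an image-analysis step
    (assumed here to be a 0-1 activity score); the daytime rule reflects
    the paper's use of the capture date and time as a feature.
    """
    daytime = 6 <= captured_at.hour < 18
    if image_score > 0.5 and daytime:
        return "work"    # agricultural work happens mostly in daylight
    if image_score > 0.5:
        return "animal"  # night-time motion triggers are more likely animals
    return "other"

def recall(predicted, actual, cls):
    """Recall for one class: correctly found instances / all actual instances."""
    tp = sum(1 for p, a in zip(predicted, actual) if p == a == cls)
    actual_n = sum(1 for a in actual if a == cls)
    return tp / actual_n if actual_n else 0.0

# Tiny worked example of the recall-based evaluation against manual labels:
actual    = ["work", "work", "animal", "work", "other"]
predicted = ["work", "work", "animal", "animal", "other"]
print(recall(predicted, actual, "work"))  # 2 of the 3 actual "work" images found
```

The evaluation reported in the abstract is exactly this kind of comparison, computed over each of the three agricultural work types with manually assigned labels as ground truth.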