CNN-Based Terrain Classification with Moisture Content Using RGB-IR Images

Journal of Robotics and Mechatronics (IF 0.9, Q4 Robotics) · Pub Date: 2021-12-20 · DOI: 10.20965/jrm.2021.p1294
Tomoya Goto, G. Ishigami
{"title":"CNN-Based Terrain Classification with Moisture Content Using RGB-IR Images","authors":"Tomoya Goto, G. Ishigami","doi":"10.20965/jrm.2021.p1294","DOIUrl":null,"url":null,"abstract":"Unmanned mobile robots in rough terrains are a key technology for achieving smart agriculture and smart construction. The mobility performance of robots highly depends on the moisture content of soil, and past few studies have focused on terrain classification using moisture content. In this study, we demonstrate a convolutional neural network-based terrain classification method using RGB-infrared (IR) images. The method first classifies soil types and then categorizes the moisture content of the terrain. A three-step image preprocessing for RGB-IR images is also integrated into the method that is applicable to an actual environment. An experimental study of the terrain classification confirmed that the proposed method achieved an accuracy of more than 99% in classifying the soil type. Furthermore, the classification accuracy of the moisture content was approximately 69% for pumice and 100% for dark soil. The proposed method can be useful for different scenarios, such as small-scale agriculture with mobile robots, smart agriculture for monitoring the moisture content, and earthworks in small areas.","PeriodicalId":51661,"journal":{"name":"Journal of Robotics and Mechatronics","volume":"33 1","pages":"1294-1302"},"PeriodicalIF":0.9000,"publicationDate":"2021-12-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Robotics and Mechatronics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.20965/jrm.2021.p1294","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"ROBOTICS","Score":null,"Total":0}
Citations: 1

Abstract

Unmanned mobile robots operating on rough terrain are a key technology for achieving smart agriculture and smart construction. The mobility performance of such robots strongly depends on the moisture content of the soil, yet few studies have focused on terrain classification that accounts for moisture content. In this study, we demonstrate a convolutional neural network-based terrain classification method using RGB-infrared (IR) images. The method first classifies the soil type and then categorizes the moisture content of the terrain. A three-step preprocessing of the RGB-IR images is also integrated into the method, making it applicable to an actual environment. An experimental study confirmed that the proposed method achieved an accuracy of more than 99% in classifying the soil type. Furthermore, the classification accuracy of the moisture content was approximately 69% for pumice and 100% for dark soil. The proposed method can be useful in different scenarios, such as small-scale agriculture with mobile robots, smart agriculture that monitors moisture content, and earthworks in small areas.
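The abstract describes a two-stage pipeline: a CNN first predicts the soil type from a four-channel RGB-IR image, and a second classifier then categorizes the moisture content for that soil type. The sketch below illustrates only that overall structure in PyTorch; the SmallCNN architecture, layer sizes, class counts, and the classify helper are illustrative assumptions rather than the authors' network, and the paper's three-step image preprocessing is omitted because the abstract does not specify its steps.

```python
# Illustrative sketch only: architecture, class counts, and moisture levels are
# assumptions; the paper's actual network and preprocessing are not given in the abstract.
import torch
import torch.nn as nn


class SmallCNN(nn.Module):
    """Minimal 4-channel (RGB + IR) image classifier used for both stages."""

    def __init__(self, num_classes: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(4, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),          # global pooling -> (N, 32, 1, 1)
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))


# Stage 1: soil type (e.g., pumice vs. dark soil).
# Stage 2: one moisture classifier per soil type (three hypothetical moisture levels each).
soil_type_net = SmallCNN(num_classes=2)
moisture_nets = {0: SmallCNN(num_classes=3), 1: SmallCNN(num_classes=3)}


def classify(rgb_ir: torch.Tensor) -> tuple[int, int]:
    """Two-stage inference on a single preprocessed (4, H, W) RGB-IR image."""
    x = rgb_ir.unsqueeze(0)                                  # add batch dimension
    soil = soil_type_net(x).argmax(dim=1).item()             # stage 1: soil type
    moisture = moisture_nets[soil](x).argmax(dim=1).item()   # stage 2: moisture class
    return soil, moisture


# Usage example with a random stand-in for a preprocessed 64x64 RGB-IR image.
soil, moisture = classify(torch.rand(4, 64, 64))
print(f"soil type index: {soil}, moisture class index: {moisture}")
```

In practice, the two stages would be trained separately on labeled RGB-IR patches; the random tensor in the usage example merely stands in for a preprocessed image.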
Source Journal
CiteScore: 2.20
Self-citation rate: 36.40%
Articles published: 134
About the Journal:
First published in 1989, the Journal of Robotics and Mechatronics (JRM) has the longest publication history in the world in this field, having published over 2,000 works exclusively on robotics and mechatronics since its first issue. The Journal publishes academic papers, development reports, reviews, letters, notes, and discussions. The JRM is a peer-reviewed journal in fields such as robotics, mechatronics, automation, and system integration. Its editorial board includes well-established researchers and engineers in the field from around the world. The scope of the journal includes any and all topics on robotics and mechatronics. Key technologies within this scope include actuator design, motion control, sensor design, sensor fusion, sensor networks, robot vision, audition, mechanism design, robot kinematics and dynamics, mobile robots, path planning, navigation, SLAM, robot hands, manipulators, nano/micro robots, humanoids, service and home robots, universal design, middleware, human-robot interaction, human interfaces, networked robotics, telerobotics, ubiquitous robots, learning, and intelligence. The scope also includes applications of robotics and automation, and system integration in the fields of manufacturing, construction, underwater, space, agriculture, sustainability, energy conservation, ecology, rescue, hazardous environments, safety and security, dependability, medical, and welfare.
Latest Articles in This Journal
Simplified System Integration of Robust Mobile Robot for Initial Pose Estimation for the Nakanoshima Robot Challenge
Robust Cooperative Transport System with Model Error Compensator Using Multiple Robots with Suction Cups
Learning Variable Admittance Control for Human-Robot Collaborative Manipulation
High-Resolution Point Cloud Registration Method for Three-Dimensional Piping Measurements
An Inchworm Robot with Self-Healing Ability Using SMA Actuators