AGRI-SLAM: a real-time stereo visual SLAM for agricultural environment

IF 3.7 · JCR Q2, Computer Science, Artificial Intelligence · Autonomous Robots · Published: 2023-07-04 · DOI: 10.1007/s10514-023-10110-y
Rafiqul Islam, Habibullah Habibullah, Tagor Hossain
Cited by: 1

Abstract

In this research, we propose a stereo visual simultaneous localisation and mapping (SLAM) system that works efficiently in agricultural scenarios without compromising performance or accuracy relative to other state-of-the-art methods. The proposed system is equipped with an image enhancement technique for ORB point and LSD line feature recovery, which enables it to work in broader scenarios and to extract extensive spatial information from low-light and hazy agricultural environments. Firstly, the method is tested on standard datasets, i.e., KITTI and EuRoC, to validate its localisation accuracy against other state-of-the-art methods, namely VINS-SLAM, PL-SLAM, and ORB-SLAM2. The experimental results show that the proposed method achieves superior localisation and mapping accuracy compared with the other visual SLAM methods. Secondly, the proposed method is tested on the ROSARIO dataset, our low-light agricultural dataset, and the O-HAZE dataset to validate its performance in agricultural environments. Where the other methods fail to operate in such complex agricultural environments, our method operates successfully with high localisation and mapping accuracy.
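The abstract attributes the system's robustness to an image-enhancement step applied before ORB point and LSD line feature extraction, but states no implementation details here. As an illustrative stand-in only (not the authors' method), a minimal global histogram equalization shows the kind of contrast stretch such a low-light front end performs before feature detection:

```python
def equalize_histogram(gray, levels=256):
    """Global histogram equalization for an 8-bit grayscale image,
    given as a list of pixel rows. Spreads a compressed low-light
    intensity range across the full [0, levels-1] scale."""
    flat = [p for row in gray for p in row]
    n = len(flat)
    # Build the histogram and its cumulative distribution function (CDF).
    hist = [0] * levels
    for p in flat:
        hist[p] += 1
    cdf, total = [], 0
    for count in hist:
        total += count
        cdf.append(total)
    cdf_min = next(c for c in cdf if c > 0)
    if n == cdf_min:
        # Constant image: nothing to stretch.
        return [row[:] for row in gray]
    # Map each intensity through the normalized CDF.
    lut = [round((cdf[v] - cdf_min) / (n - cdf_min) * (levels - 1))
           for v in range(levels)]
    return [[lut[p] for p in row] for row in gray]

# A dim 2x3 "image" whose intensities occupy only [10, 14]:
dark = [[10, 11, 12], [12, 13, 14]]
bright = equalize_histogram(dark)  # now spans the full 0..255 range
```

In a full pipeline of the kind the abstract describes, the enhanced frame would then be passed to the point and line detectors (e.g., OpenCV's `cv2.ORB_create` and, in builds that include it, `cv2.createLineSegmentDetector`); the paper's actual enhancement technique for hazy scenes is likely more sophisticated than this global equalization.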


Source journal

Autonomous Robots (Engineering & Technology – Robotics)
CiteScore: 7.90
Self-citation rate: 5.70%
Articles per year: 46
Review time: 3 months
Journal description: Autonomous Robots reports on the theory and applications of robotic systems capable of some degree of self-sufficiency. It features papers that include performance data on actual robots in the real world. Coverage includes: control of autonomous robots · real-time vision · autonomous wheeled and tracked vehicles · legged vehicles · computational architectures for autonomous systems · distributed architectures for learning, control and adaptation · studies of autonomous robot systems · sensor fusion · theory of autonomous systems · terrain mapping and recognition · self-calibration and self-repair for robots · self-reproducing intelligent structures · genetic algorithms as models for robot development. The focus is on the ability to move and be self-sufficient, not on whether the system is an imitation of biology. Of course, biological models for robotic systems are of major interest to the journal since living systems are prototypes for autonomous behavior.