Visual route following for tiny autonomous robots

Science Robotics · IF 26.1 · Q1 (Robotics) · CAS Tier 1 (Computer Science) · Published: 2024-07-17 · DOI: 10.1126/scirobotics.adk0310
Tom van Dijk, Christophe De Wagter, Guido C. H. E. de Croon
{"title":"Visual route following for tiny autonomous robots","authors":"Tom van Dijk,&nbsp;Christophe De Wagter,&nbsp;Guido C. H. E. de Croon","doi":"10.1126/scirobotics.adk0310","DOIUrl":null,"url":null,"abstract":"<div >Navigation is an essential capability for autonomous robots. In particular, visual navigation has been a major research topic in robotics because cameras are lightweight, power-efficient sensors that provide rich information on the environment. However, the main challenge of visual navigation is that it requires substantial computational power and memory for visual processing and storage of the results. As of yet, this has precluded its use on small, extremely resource-constrained robots such as lightweight drones. Inspired by the parsimony of natural intelligence, we propose an insect-inspired approach toward visual navigation that is specifically aimed at extremely resource-restricted robots. It is a route-following approach in which a robot’s outbound trajectory is stored as a collection of highly compressed panoramic images together with their spatial relationships as measured with odometry. During the inbound journey, the robot uses a combination of odometry and visual homing to return to the stored locations, with visual homing preventing the buildup of odometric drift. A main advancement of the proposed strategy is that the number of stored compressed images is minimized by spacing them apart as far as the accuracy of odometry allows. To demonstrate the suitability for small systems, we implemented the strategy on a tiny 56-gram drone. The drone could successfully follow routes up to 100 meters with a trajectory representation that consumed less than 20 bytes per meter. The presented method forms a substantial step toward the autonomous visual navigation of tiny robots, facilitating their more widespread application.</div>","PeriodicalId":56029,"journal":{"name":"Science Robotics","volume":"9 92","pages":""},"PeriodicalIF":26.1000,"publicationDate":"2024-07-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.science.org/doi/reader/10.1126/scirobotics.adk0310","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Science Robotics","FirstCategoryId":"94","ListUrlMain":"https://www.science.org/doi/10.1126/scirobotics.adk0310","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ROBOTICS","Score":null,"Total":0}
Citations: 0

Abstract

Navigation is an essential capability for autonomous robots. In particular, visual navigation has been a major research topic in robotics because cameras are lightweight, power-efficient sensors that provide rich information on the environment. However, the main challenge of visual navigation is that it requires substantial computational power and memory for visual processing and storage of the results. As of yet, this has precluded its use on small, extremely resource-constrained robots such as lightweight drones. Inspired by the parsimony of natural intelligence, we propose an insect-inspired approach toward visual navigation that is specifically aimed at extremely resource-restricted robots. It is a route-following approach in which a robot’s outbound trajectory is stored as a collection of highly compressed panoramic images together with their spatial relationships as measured with odometry. During the inbound journey, the robot uses a combination of odometry and visual homing to return to the stored locations, with visual homing preventing the buildup of odometric drift. A main advancement of the proposed strategy is that the number of stored compressed images is minimized by spacing them apart as far as the accuracy of odometry allows. To demonstrate the suitability for small systems, we implemented the strategy on a tiny 56-gram drone. The drone could successfully follow routes up to 100 meters with a trajectory representation that consumed less than 20 bytes per meter. The presented method forms a substantial step toward the autonomous visual navigation of tiny robots, facilitating their more widespread application.
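To make the snapshot-spacing logic concrete, below is a minimal Python simulation of the strategy outlined in the abstract. It is a sketch under stated assumptions: the helper names (compress, record_outbound, follow_inbound), the proportional drift model, and the numeric constants are all illustrative, not taken from the paper, and the "snapshot" here is just a remembered viewpoint rather than a real compressed panoramic image.

```python
import math
import random

# Minimal simulation of the route-following strategy the abstract describes.
# All names, the drift model, and the constants are illustrative assumptions,
# not the paper's implementation: the real system stores compressed panoramic
# images and performs homing visually.

DRIFT_PER_M = 0.05   # assumed odometric drift: meters of error per meter traveled
CATCHMENT_M = 1.0    # assumed radius (m) within which visual homing converges
SPACING_M = CATCHMENT_M / DRIFT_PER_M  # farthest snapshot spacing odometry allows

def compress(position):
    """Stand-in for a highly compressed panoramic snapshot; here it just
    remembers the viewpoint so homing can be simulated geometrically."""
    return tuple(position)

def record_outbound(path):
    """Outbound journey: store (snapshot, odometric pose) pairs, spaced as far
    apart as odometry accuracy allows to minimize the number of stored images."""
    route = [(compress(path[0]), path[0])]
    dist_since_snapshot = 0.0
    for prev, cur in zip(path, path[1:]):
        dist_since_snapshot += math.dist(prev, cur)
        if dist_since_snapshot >= SPACING_M:
            route.append((compress(cur), cur))
            dist_since_snapshot = 0.0
    return route

def follow_inbound(route, start):
    """Inbound journey: dead-reckon toward each stored pose (drift accrues with
    distance), then visual homing locks onto the snapshot, canceling the drift."""
    pos = list(start)
    for snapshot, pose in reversed(route):
        # Odometry leg: arrive near the stored pose with distance-proportional error.
        err = math.dist(pos, pose) * DRIFT_PER_M
        angle, magnitude = random.uniform(0, 2 * math.pi), random.uniform(0, err)
        pos = [pose[0] + magnitude * math.cos(angle),
               pose[1] + magnitude * math.sin(angle)]
        # Visual homing leg: only converges inside the snapshot's catchment area.
        if math.dist(pos, snapshot) > CATCHMENT_M:
            raise RuntimeError("outside catchment area; homing would fail")
        pos = list(snapshot)  # drift is reset at every visited snapshot
    return pos

if __name__ == "__main__":
    outbound = [(0.5 * i, 0.0) for i in range(201)]  # a straight 100 m route
    route = record_outbound(outbound)
    print(f"{len(route)} snapshots stored for a 100 m route")
    print("returned to", follow_inbound(route, outbound[-1]))
```

With these assumed constants, snapshots land roughly 20 meters apart, so the reported budget of under 20 bytes per meter would leave on the order of 400 bytes for each compressed panorama and its pose. The key design choice mirrors the abstract: spacing snapshots as far apart as odometric accuracy allows minimizes storage, while visual homing at each snapshot resets the accumulated drift.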


Source journal

Science Robotics (Mathematics – Control and Optimization)
CiteScore: 30.60
Self-citation rate: 2.80%
Articles per year: 83
About the journal: Science Robotics publishes original, peer-reviewed, science- or engineering-based research articles that advance the field of robotics. The journal also features editor-commissioned Reviews. An international team of academic editors holds Science Robotics articles to the same high-quality standard that is the hallmark of the Science family of journals. Subtopics include: actuators, advanced materials, artificial intelligence, autonomous vehicles, bio-inspired design, exoskeletons, fabrication, field robotics, human-robot interaction, humanoids, industrial robotics, kinematics, machine learning, materials science, medical technology, motion planning and control, micro- and nano-robotics, multi-robot control, sensors, service robotics, social and ethical issues, soft robotics, and space, planetary, and undersea exploration.
Latest articles in this journal

A twist of the tail in turning maneuvers of bird-inspired drones
Bird-inspired reflexive morphing enables rudderless flight
Cybernetic avatars: Teleoperation technologies from in-body monitoring to social interaction
Robots and animals teaming up in the wild to tackle ecosystem challenges
NeuralFeels with neural fields: Visuotactile perception for in-hand manipulation