Development and Mobile Deployment of a Stair Recognition System for Human–Robot Locomotion

IEEE Transactions on Medical Robotics and Bionics (IF 3.4, Q2, Engineering, Biomedical). Publication date: 2024-01-04. DOI: 10.1109/TMRB.2024.3349602
Andrew Garrett Kurbis, Alex Mihailidis, Brokoslaw Laschowski
Full text: https://ieeexplore.ieee.org/document/10380751/

Abstract

Environment sensing and recognition can improve the safety and autonomy of human-robot locomotion, especially during transitions between environmental states such as walking to and from stairs. However, accurate and real-time perception on edge devices with limited computational resources is an open problem. Here we present the development and mobile deployment of StairNet, a vision-based stair recognition system powered by deep learning. Building on ExoNet, the largest open-source dataset of egocentric images of real-world walking environments, we designed a new dataset for stair recognition with over 515,000 images. We trained a lightweight and efficient convolutional neural network for image classification, which predicted complex stair environments with 98.4% accuracy. We also studied different model compression and optimization methods and deployed our system on several mobile devices running a custom-designed iOS application with onboard accelerators using CPU, GPU, and/or NPU backend computing. Of the designs that we studied, our highest performing system showed negligible reductions in classification accuracy due to model conversion for mobile deployment and achieved an inference time of 2.75 ms. The high speed and accuracy of StairNet on edge devices open new opportunities for environment-adaptive control of robotic prosthetic legs, exoskeletons, and other assistive technologies for human locomotion.
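The model compression step mentioned in the abstract can be illustrated with post-training 8-bit quantization, a standard technique for shrinking networks before edge deployment. This is a minimal, generic NumPy sketch of affine (asymmetric) weight quantization, not the authors' actual pipeline; the function names and the example kernel shape are illustrative assumptions.

```python
import numpy as np

def quantize_uint8(w):
    # Affine post-training quantization: map float weights onto 256 levels
    # via a scale and an (integer) zero point.
    lo, hi = float(w.min()), float(w.max())
    scale = (hi - lo) / 255.0 if hi > lo else 1.0
    zero_point = float(np.round(-lo / scale))
    q = np.clip(np.round(w / scale + zero_point), 0, 255).astype(np.uint8)
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    # Recover approximate float weights from the quantized representation.
    return (q.astype(np.float32) - zero_point) * scale

# Example: quantize a synthetic 3x3 conv kernel (16 in, 32 out channels).
rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.05, size=(3, 3, 16, 32)).astype(np.float32)
q, s, z = quantize_uint8(w)
w_hat = dequantize(q, s, z)
print(f"max abs reconstruction error: {np.abs(w - w_hat).max():.6f}")
```

The per-weight reconstruction error is bounded by roughly half the scale, which is why such quantization typically costs little classification accuracy while cutting model size by about 4x relative to float32, consistent with the "negligible reductions in classification accuracy" the abstract reports for its (unspecified) conversion method.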