Development and Mobile Deployment of a Stair Recognition System for Human–Robot Locomotion

Andrew Garrett Kurbis; Alex Mihailidis; Brokoslaw Laschowski

IEEE Transactions on Medical Robotics and Bionics
DOI: 10.1109/TMRB.2024.3349602
Published: 2024-01-04
URL: https://ieeexplore.ieee.org/document/10380751/
Citations: 0
Abstract
Environment sensing and recognition can improve the safety and autonomy of human–robot locomotion, especially during transitions between environmental states such as walking to and from stairs. However, accurate and real-time perception on edge devices with limited computational resources is an open problem. Here we present the development and mobile deployment of StairNet, a vision-based stair recognition system powered by deep learning. Building on ExoNet, the largest open-source dataset of egocentric images of real-world walking environments, we designed a new dataset for stair recognition with over 515,000 images. We trained a lightweight and efficient convolutional neural network for image classification, which predicted complex stair environments with 98.4% accuracy. We also studied different model compression and optimization methods and deployed our system on several mobile devices running a custom-designed iOS application with onboard accelerators using CPU, GPU, and/or NPU backend computing. Of the designs that we studied, our highest-performing system showed negligible reductions in classification accuracy due to model conversion for mobile deployment and achieved an inference time of 2.75 ms. The high speed and accuracy of StairNet on edge devices open new opportunities for environment-adaptive control of robotic prosthetic legs, exoskeletons, and other assistive technologies for human locomotion.
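To illustrate the kind of model compression the abstract refers to, the sketch below shows affine post-training quantization of float32 weights to int8, a standard technique for shrinking networks before edge deployment. This is a minimal, self-contained example with synthetic weights, not the authors' actual StairNet pipeline; the function names and parameters are hypothetical.

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Affine post-training quantization of float32 weights to int8.

    Maps the observed [min, max] range onto the 256 int8 levels and
    returns the quantized values plus the (scale, zero_point) pair
    needed to dequantize. Hypothetical helper for illustration only.
    """
    w_min, w_max = float(weights.min()), float(weights.max())
    scale = (w_max - w_min) / 255.0 or 1.0  # guard against constant weights
    zero_point = np.round(-w_min / scale) - 128
    q = np.clip(np.round(weights / scale) + zero_point, -128, 127)
    return q.astype(np.int8), scale, zero_point

def dequantize(q: np.ndarray, scale: float, zero_point: float) -> np.ndarray:
    """Recover approximate float weights from the int8 representation."""
    return (q.astype(np.float32) - zero_point) * scale

# Synthetic "layer" weights standing in for a trained CNN layer.
rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.1, size=(64, 64)).astype(np.float32)

q, s, z = quantize_int8(w)
w_hat = dequantize(q, s, z)

size_ratio = w.nbytes / q.nbytes          # float32 -> int8 gives 4x compression
max_err = float(np.abs(w - w_hat).max())  # bounded by one quantization step
```

Frameworks such as TensorFlow Lite and Core ML Tools implement this scheme (and richer variants) during model conversion; the 4x size reduction and small, bounded reconstruction error are why quantization typically costs little classification accuracy, consistent with the negligible accuracy loss the paper reports after mobile conversion.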