A multimodal execution monitor with anomaly classification for robot-assisted feeding

Daehyung Park, Hokeun Kim, Yuuna Hoshi, Zackory M. Erickson, Ariel Kapusta, C. Kemp
{"title":"A multimodal execution monitor with anomaly classification for robot-assisted feeding","authors":"Daehyung Park, Hokeun Kim, Yuuna Hoshi, Zackory M. Erickson, Ariel Kapusta, C. Kemp","doi":"10.1109/IROS.2017.8206437","DOIUrl":null,"url":null,"abstract":"Activities of daily living (ADLs) are important for quality of life. Robotic assistance offers the opportunity for people with disabilities to perform ADLs on their own. However, when a complex semi-autonomous system provides real-world assistance, occasional anomalies are likely to occur. Robots that can detect, classify and respond appropriately to common anomalies have the potential to provide more effective and safer assistance. We introduce a multimodal execution monitor to detect and classify anomalous executions when robots operate near humans. Our system builds on our past work on multimodal anomaly detection. Our new monitor classifies the type and cause of common anomalies using an artificial neural network. We implemented and evaluated our execution monitor in the context of robot-assisted feeding with a general-purpose mobile manipulator. In our evaluations, our monitor outperformed baseline methods from the literature. It succeeded in detecting 12 common anomalies from 8 able-bodied participants with 83% accuracy and classifying the types and causes of the detected anomalies with 90% and 81% accuracies, respectively. We then performed an in-home evaluation with Henry Evans, a person with severe quadriplegia. With our system, Henry successfully fed himself while the monitor detected, classified the types, and classified the causes of anomalies with 86%, 90%, and 54% accuracy, respectively.","PeriodicalId":6658,"journal":{"name":"2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)","volume":"57 1","pages":"5406-5413"},"PeriodicalIF":0.0000,"publicationDate":"2017-09-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"51","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2017 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IROS.2017.8206437","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 51

Abstract

Activities of daily living (ADLs) are important for quality of life. Robotic assistance offers the opportunity for people with disabilities to perform ADLs on their own. However, when a complex semi-autonomous system provides real-world assistance, occasional anomalies are likely to occur. Robots that can detect, classify and respond appropriately to common anomalies have the potential to provide more effective and safer assistance. We introduce a multimodal execution monitor to detect and classify anomalous executions when robots operate near humans. Our system builds on our past work on multimodal anomaly detection. Our new monitor classifies the type and cause of common anomalies using an artificial neural network. We implemented and evaluated our execution monitor in the context of robot-assisted feeding with a general-purpose mobile manipulator. In our evaluations, our monitor outperformed baseline methods from the literature. It succeeded in detecting 12 common anomalies from 8 able-bodied participants with 83% accuracy and classifying the types and causes of the detected anomalies with 90% and 81% accuracies, respectively. We then performed an in-home evaluation with Henry Evans, a person with severe quadriplegia. With our system, Henry successfully fed himself while the monitor detected, classified the types, and classified the causes of anomalies with 86%, 90%, and 54% accuracy, respectively.
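The paper itself does not ship code here, but the abstract's core idea, a neural-network classifier that assigns both a type and a cause to each detected anomaly from multimodal features, can be illustrated with a minimal sketch. The snippet below is a hypothetical PyTorch example: the feature dimension, layer sizes, and class counts are illustrative assumptions, not the authors' architecture.

```python
# Hypothetical sketch (not the authors' code): a small neural network that
# classifies a detected anomaly's type and cause from a fixed-length
# multimodal feature vector (e.g., summaries of sound, force, and joint data).
import torch
import torch.nn as nn

class AnomalyClassifier(nn.Module):
    def __init__(self, feature_dim=32, n_types=12, n_causes=6):
        super().__init__()
        # Shared encoder over the multimodal feature vector.
        self.encoder = nn.Sequential(
            nn.Linear(feature_dim, 64),
            nn.ReLU(),
            nn.Linear(64, 64),
            nn.ReLU(),
        )
        # Two output heads: one for anomaly type, one for anomaly cause.
        self.type_head = nn.Linear(64, n_types)
        self.cause_head = nn.Linear(64, n_causes)

    def forward(self, x):
        h = self.encoder(x)
        return self.type_head(h), self.cause_head(h)

# Usage: classify the features extracted when the monitor flags an anomaly.
model = AnomalyClassifier()
features = torch.randn(1, 32)  # placeholder multimodal features
type_logits, cause_logits = model(features)
print(type_logits.argmax(dim=-1).item(), cause_logits.argmax(dim=-1).item())
```

In practice such a classifier would run only after the detection stage flags an anomalous execution; the two-head design simply reflects that type and cause are reported as separate predictions in the abstract.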