Object segmentation in cluttered environment based on gaze tracing and gaze blinking

ROBOMECH Journal · IF 1.5 · Q3 (INSTRUMENTS & INSTRUMENTATION) · Pub Date: 2021-12-22 · DOI: 10.1186/s40648-021-00214-4
Ratsamee, Photchara; Mae, Yasushi; Kamiyama, Kazuto; Horade, Mitsuhiro; Kojima, Masaru; Arai, Tatsuo
{"title":"基于注视跟踪和注视闪烁的杂乱环境下目标分割","authors":"Ratsamee, Photchara, Mae, Yasushi, Kamiyama, Kazuto, Horade, Mitsuhiro, Kojima, Masaru, Arai, Tatsuo","doi":"10.1186/s40648-021-00214-4","DOIUrl":null,"url":null,"abstract":"People with disabilities, such as patients with motor paralysis conditions, lack independence and cannot move most parts of their bodies except for their eyes. Supportive robot technology is highly beneficial in supporting these types of patients. We propose a gaze-informed location-based (or gaze-based) object segmentation, which is a core module of successful patient-robot interaction in an object-search task (i.e., a situation when a robot has to search for and deliver a target object to the patient). We have introduced the concepts of gaze tracing (GT) and gaze blinking (GB), which are integrated into our proposed object segmentation technique, to yield the benefit of an accurate visual segmentation of unknown objects in a complex scene. Gaze tracing information can be used as a clue as to where the target object is located in a scene. Then, gaze blinking can be used to confirm the position of the target object. The effectiveness of our proposed method has been demonstrated using a humanoid robot in experiments with different types of highly cluttered scenes. Based on the limited gaze guidance from the user, we achieved an 85% F-score of unknown object segmentation in an unknown environment.","PeriodicalId":37462,"journal":{"name":"ROBOMECH Journal","volume":"108 10","pages":""},"PeriodicalIF":1.5000,"publicationDate":"2021-12-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Object segmentation in cluttered environment based on gaze tracing and gaze blinking\",\"authors\":\"Ratsamee, Photchara, Mae, Yasushi, Kamiyama, Kazuto, Horade, Mitsuhiro, Kojima, Masaru, Arai, Tatsuo\",\"doi\":\"10.1186/s40648-021-00214-4\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"People with disabilities, such as patients with motor paralysis conditions, lack independence and cannot move most parts of their bodies except for their eyes. Supportive robot technology is highly beneficial in supporting these types of patients. We propose a gaze-informed location-based (or gaze-based) object segmentation, which is a core module of successful patient-robot interaction in an object-search task (i.e., a situation when a robot has to search for and deliver a target object to the patient). We have introduced the concepts of gaze tracing (GT) and gaze blinking (GB), which are integrated into our proposed object segmentation technique, to yield the benefit of an accurate visual segmentation of unknown objects in a complex scene. Gaze tracing information can be used as a clue as to where the target object is located in a scene. Then, gaze blinking can be used to confirm the position of the target object. The effectiveness of our proposed method has been demonstrated using a humanoid robot in experiments with different types of highly cluttered scenes. 
Based on the limited gaze guidance from the user, we achieved an 85% F-score of unknown object segmentation in an unknown environment.\",\"PeriodicalId\":37462,\"journal\":{\"name\":\"ROBOMECH Journal\",\"volume\":\"108 10\",\"pages\":\"\"},\"PeriodicalIF\":1.5000,\"publicationDate\":\"2021-12-22\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"ROBOMECH Journal\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1186/s40648-021-00214-4\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"INSTRUMENTS & INSTRUMENTATION\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"ROBOMECH Journal","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1186/s40648-021-00214-4","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"INSTRUMENTS & INSTRUMENTATION","Score":null,"Total":0}
Citations: 0

Abstract

People with disabilities, such as patients with motor paralysis conditions, lack independence and cannot move most parts of their bodies except for their eyes. Supportive robot technology is highly beneficial in supporting these types of patients. We propose a gaze-informed location-based (or gaze-based) object segmentation, which is a core module of successful patient-robot interaction in an object-search task (i.e., a situation when a robot has to search for and deliver a target object to the patient). We have introduced the concepts of gaze tracing (GT) and gaze blinking (GB), which are integrated into our proposed object segmentation technique, to yield the benefit of an accurate visual segmentation of unknown objects in a complex scene. Gaze tracing information can be used as a clue as to where the target object is located in a scene. Then, gaze blinking can be used to confirm the position of the target object. The effectiveness of our proposed method has been demonstrated using a humanoid robot in experiments with different types of highly cluttered scenes. Based on the limited gaze guidance from the user, we achieved an 85% F-score of unknown object segmentation in an unknown environment.
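
The abstract describes a two-stage interaction: gaze tracing (GT) supplies a clue to where the target object lies, and gaze blinking (GB) confirms that location before the robot commits to a segmentation. (The reported 85% is an F-score, the harmonic mean of precision and recall, F = 2PR/(P + R).) The sketch below shows one way such a pipeline could be wired together; the paper does not include code, so the dwell window, the double-blink confirmation protocol, and the `segment_at` seed-based segmenter are all assumptions made for illustration.

```python
# Hypothetical sketch of a GT/GB pipeline as described in the abstract.
# All names, thresholds, and the blink protocol are assumptions, not the
# authors' implementation.

from dataclasses import dataclass

@dataclass
class GazeSample:
    x: float           # gaze point in image coordinates
    y: float
    t: float           # timestamp in seconds
    eyes_closed: bool  # True while the user's eyes are shut

def trace_region(samples, dwell_s=1.0):
    """Gaze tracing (GT): average recent open-eye fixations.

    Returns the centroid of the last dwell_s seconds of gaze points,
    used as a clue to where the target object is located in the scene.
    """
    if not samples:
        return None
    recent = [s for s in samples
              if not s.eyes_closed and s.t >= samples[-1].t - dwell_s]
    if not recent:
        return None
    return (sum(s.x for s in recent) / len(recent),
            sum(s.y for s in recent) / len(recent))

def blink_confirmed(samples, min_blinks=2, window_s=1.5):
    """Gaze blinking (GB): confirm the location via deliberate blinks.

    Counts closed-to-open transitions in the last window_s seconds; a
    double blink is treated as confirmation (an assumed protocol, chosen
    so that single natural blinks do not trigger it).
    """
    if not samples:
        return False
    recent = [s for s in samples if s.t >= samples[-1].t - window_s]
    blinks = sum(1 for a, b in zip(recent, recent[1:])
                 if a.eyes_closed and not b.eyes_closed)
    return blinks >= min_blinks

def segment_target(image, samples, segment_at):
    """Seed a segmentation from GT, then wait for GB confirmation.

    segment_at(image, seed) is a placeholder for any seed-based
    segmenter, e.g. region growing from the gazed pixel.
    """
    seed = trace_region(samples)
    if seed is None:
        return None
    mask = segment_at(image, seed)
    return mask if blink_confirmed(samples) else None
```

Splitting location (GT) from confirmation (GB) mirrors the abstract's description: continuous gaze alone is too noisy to act on, so the deliberate blink serves as a low-effort "click" that a user with motor paralysis can still produce.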
Source journal: ROBOMECH Journal (Mathematics - Control and Optimization)
CiteScore: 3.20
Self-citation rate: 7.10%
Articles published: 21
Review time: 13 weeks
Journal introduction: ROBOMECH Journal focuses on advanced technologies and practical applications in the field of Robotics and Mechatronics. This field is driven by the steadily growing research, development and consumer demand for robots and systems. Advanced robots have been working in medical and hazardous environments, such as space and the deep sea, as well as in the manufacturing environment. The scope of the journal includes but is not limited to:
1. Modeling and design
2. System integration
3. Actuators and sensors
4. Intelligent control
5. Artificial intelligence
6. Machine learning
7. Robotics
8. Manufacturing
9. Motion control
10. Vibration and noise control
11. Micro/nano devices and optoelectronics systems
12. Automotive systems
13. Applications for extreme and/or hazardous environments
14. Other applications
Latest articles from this journal:
Computer vision-based visualization and quantification of body skeletal movements for investigation of traditional skills: the production of Kizumi winnowing baskets
Measuring unit for synchronously collecting air dose rate and measurement position
Length control of a McKibben pneumatic actuator using a dynamic quantizer
Interactive driving of electrostatic film actuator by proximity motion of human body
Development and flight-test verification of two-dimensional rotational low-airspeed sensor for small helicopters