Object segmentation in cluttered environment based on gaze tracing and gaze blinking

Authors: Ratsamee, Photchara; Mae, Yasushi; Kamiyama, Kazuto; Horade, Mitsuhiro; Kojima, Masaru; Arai, Tatsuo
Journal: ROBOMECH Journal (JCR Q3, Instruments & Instrumentation; Impact Factor 1.5)
DOI: 10.1186/s40648-021-00214-4
Published: 2021-12-22 (Journal Article)

Abstract: People with disabilities, such as patients with motor paralysis, lack independence and cannot move most parts of their bodies except their eyes. Supportive robot technology is highly beneficial for these patients. We propose a gaze-informed, location-based (or gaze-based) object segmentation method, a core module of successful patient-robot interaction in an object-search task (i.e., a situation in which a robot must search for and deliver a target object to the patient). We introduce the concepts of gaze tracing (GT) and gaze blinking (GB), which are integrated into our proposed segmentation technique to yield accurate visual segmentation of unknown objects in a complex scene. Gaze-tracing information serves as a clue to where the target object is located in the scene; gaze blinking is then used to confirm the target object's position. The effectiveness of the proposed method has been demonstrated with a humanoid robot in experiments on different types of highly cluttered scenes. With only limited gaze guidance from the user, we achieved an F-score of 85% for unknown-object segmentation in an unknown environment.
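The two gaze cues described in the abstract can be illustrated with a minimal sketch: gaze tracing reduces to finding where the gaze dwells (a fixation), and gaze blinking reduces to detecting a short run of invalid gaze samples while the eyes are closed. The function names, thresholds, and the dispersion-based fixation detector below are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch of the gaze-tracing (GT) / gaze-blinking (GB) cues.
# Thresholds and function names are illustrative assumptions.
import math


def fixation_centroid(gaze_points, dispersion_px=40):
    """Gaze tracing (GT): estimate where the gaze dwells.

    Returns the centroid (and length) of the longest run of consecutive
    samples whose spread stays within `dispersion_px` of the run's
    centroid -- a simple dispersion-threshold fixation detector. The
    centroid can then seed segmentation of the object at that location.
    """
    best, run = [], []
    for x, y in gaze_points:
        candidate = run + [(x, y)]
        cx = sum(p[0] for p in candidate) / len(candidate)
        cy = sum(p[1] for p in candidate) / len(candidate)
        if all(math.hypot(p[0] - cx, p[1] - cy) <= dispersion_px
               for p in candidate):
            run = candidate           # sample stays within the fixation
        else:
            run = [(x, y)]            # gaze moved on; start a new run
        if len(run) > len(best):
            best = run
    cx = sum(p[0] for p in best) / len(best)
    cy = sum(p[1] for p in best) / len(best)
    return (cx, cy), len(best)


def confirmed_by_blink(validity_flags, min_gap=2, max_gap=10):
    """Gaze blinking (GB): treat a short run of invalid samples
    (eyes closed, tracker reports no gaze) as a deliberate
    confirmation blink; longer gaps are ignored as tracking loss."""
    gap = 0
    for valid in validity_flags:
        if not valid:
            gap += 1
        else:
            if min_gap <= gap <= max_gap:
                return True
            gap = 0
    return min_gap <= gap <= max_gap
```

In use, the fixation centroid would seed a segmentation routine (e.g., a region-growing or GrabCut-style step on the robot's camera image), and the blink check would gate whether the robot commits to that candidate region.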
Journal introduction:
ROBOMECH Journal focuses on advanced technologies and practical applications in the field of Robotics and Mechatronics. This field is driven by steadily growing research, development, and consumer demand for robots and robotic systems. Advanced robots have been working in medical and hazardous environments, such as space and the deep sea, as well as in manufacturing environments. The scope of the journal includes but is not limited to:
1. Modeling and design
2. System integration
3. Actuators and sensors
4. Intelligent control
5. Artificial intelligence
6. Machine learning
7. Robotics
8. Manufacturing
9. Motion control
10. Vibration and noise control
11. Micro/nano devices and optoelectronics systems
12. Automotive systems
13. Applications for extreme and/or hazardous environments
14. Other applications