A pipeline for estimating human attention toward objects with on-board cameras on the iCub humanoid robot.

Frontiers in Robotics and AI (IF 2.9, Q2 ROBOTICS)
Pub Date: 2024-10-17 | eCollection Date: 2024-01-01
DOI: 10.3389/frobt.2024.1346714
Shiva Hanifi, Elisa Maiettini, Maria Lombardi, Lorenzo Natale
{"title":"A pipeline for estimating human attention toward objects with on-board cameras on the iCub humanoid robot.","authors":"Shiva Hanifi, Elisa Maiettini, Maria Lombardi, Lorenzo Natale","doi":"10.3389/frobt.2024.1346714","DOIUrl":null,"url":null,"abstract":"<p><p>This research report introduces a learning system designed to detect the object that humans are gazing at, using solely visual feedback. By incorporating face detection, human attention prediction, and online object detection, the system enables the robot to perceive and interpret human gaze accurately, thereby facilitating the establishment of joint attention with human partners. Additionally, a novel dataset collected with the humanoid robot iCub is introduced, comprising more than 22,000 images from ten participants gazing at different annotated objects. This dataset serves as a benchmark for human gaze estimation in table-top human-robot interaction (HRI) contexts. In this work, we use it to assess the proposed pipeline's performance and examine each component's effectiveness. Furthermore, the developed system is deployed on the iCub and showcases its functionality. The results demonstrate the potential of the proposed approach as a first step to enhancing social awareness and responsiveness in social robotics. This advancement can enhance assistance and support in collaborative scenarios, promoting more efficient human-robot collaborations.</p>","PeriodicalId":47597,"journal":{"name":"Frontiers in Robotics and AI","volume":null,"pages":null},"PeriodicalIF":2.9000,"publicationDate":"2024-10-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11524796/pdf/","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Frontiers in Robotics and AI","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.3389/frobt.2024.1346714","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2024/1/1 0:00:00","PubModel":"eCollection","JCR":"Q2","JCRName":"ROBOTICS","Score":null,"Total":0}
引用次数: 0

Abstract

This research report introduces a learning system designed to detect the object that a human is gazing at, using solely visual feedback. By incorporating face detection, human attention prediction, and online object detection, the system enables the robot to perceive and interpret human gaze accurately, thereby facilitating the establishment of joint attention with human partners. Additionally, a novel dataset collected with the humanoid robot iCub is introduced, comprising more than 22,000 images from ten participants gazing at different annotated objects. This dataset serves as a benchmark for human gaze estimation in table-top human-robot interaction (HRI) contexts. In this work, we use it to assess the proposed pipeline's performance and to examine the effectiveness of each component. Furthermore, the developed system is deployed on the iCub to showcase its functionality. The results demonstrate the potential of the proposed approach as a first step toward enhancing social awareness and responsiveness in social robotics. This advancement can improve assistance and support in collaborative scenarios, promoting more efficient human-robot collaboration.
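
To make the described architecture concrete, below is a minimal Python sketch of how the three stages mentioned in the abstract might be composed: a face detector localizes the partner's face, an attention model predicts a gaze heatmap over the image, and an object detector proposes candidate objects, with the attended object taken as the one whose region accumulates the most attention. The function names and stub implementations (detect_face, predict_attention, detect_objects) are illustrative placeholders under assumed interfaces, not the authors' actual models or API.

```python
# Illustrative sketch of a gaze-to-object pipeline (not the paper's implementation).
# The three perception stages are stubbed out; in the actual system they are
# learned models (face detection, attention prediction, online object detection)
# running on the iCub's on-board cameras.

from dataclasses import dataclass
from typing import List, Optional, Tuple

import numpy as np


@dataclass
class Detection:
    label: str
    box: Tuple[int, int, int, int]  # (x0, y0, x1, y1) in image pixels


def detect_face(image: np.ndarray) -> Optional[Tuple[int, int, int, int]]:
    """Placeholder: return the bounding box of the partner's face, if any."""
    h, w = image.shape[:2]
    return (w // 4, h // 8, w // 2, h // 3)  # dummy box


def predict_attention(image: np.ndarray,
                      face_box: Tuple[int, int, int, int]) -> np.ndarray:
    """Placeholder: per-pixel heatmap of where the person is looking.

    A real model would condition on the face crop; here we return a fixed
    Gaussian bump so the script runs end to end.
    """
    h, w = image.shape[:2]
    yy, xx = np.mgrid[0:h, 0:w]
    cy, cx = h * 0.7, w * 0.6  # dummy gaze target
    return np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2) / (2 * (0.1 * w) ** 2))


def detect_objects(image: np.ndarray) -> List[Detection]:
    """Placeholder: table-top objects detected in the current frame."""
    return [Detection("mug", (500, 300, 580, 380)),
            Detection("box", (200, 320, 300, 420))]


def attended_object(image: np.ndarray) -> Optional[Detection]:
    """Pick the detected object whose region accumulates the most attention."""
    face = detect_face(image)
    if face is None:
        return None
    heatmap = predict_attention(image, face)
    detections = detect_objects(image)
    if not detections:
        return None
    scores = []
    for det in detections:
        x0, y0, x1, y1 = det.box
        scores.append(heatmap[y0:y1, x0:x1].sum())
    best = int(np.argmax(scores))
    return detections[best] if scores[best] > 0 else None


if __name__ == "__main__":
    frame = np.zeros((480, 640, 3), dtype=np.uint8)  # stand-in for a camera frame
    target = attended_object(frame)
    print("Attended object:", target.label if target else "none")
```

In this sketch the attended object is chosen by integrating the attention heatmap inside each detected bounding box; other selection rules (e.g., distance from the heatmap peak) would fit the same structure.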

Source journal
CiteScore: 6.50
Self-citation rate: 5.90%
Articles published: 355
Review time: 14 weeks
Journal description: Frontiers in Robotics and AI publishes rigorously peer-reviewed research covering all theory and applications of robotics, technology, and artificial intelligence, from biomedical to space robotics.
Latest articles in this journal
Cybernic robot hand-arm that realizes cooperative work as a new hand-arm for people with a single upper-limb dysfunction.
Advancements in the use of AI in the diagnosis and management of inflammatory bowel disease.
Remote science at sea with remotely operated vehicles.
A pipeline for estimating human attention toward objects with on-board cameras on the iCub humanoid robot.
Leveraging imitation learning in agricultural robotics: a comprehensive survey and comparative analysis.