RTHG: Towards Real-Time Head Detection And Gaze Estimation

Kunxu Zhao, Zhengxi Hu, Qianyi Zhang, Jingtai Liu
DOI: 10.1109/ROBIO55434.2022.10011802

Published in: 2022 IEEE International Conference on Robotics and Biomimetics (ROBIO), 2022-12-05
Citations: 0

Abstract

As an important way of understanding human intentions, gaze estimation has long been a research hotspot in human-robot interaction. Most current studies estimate gaze direction by analyzing head features, so head detection is required before gaze estimation. For these two sequential tasks, existing work usually adopts two separate networks, which increases GPU memory usage and makes deployment on edge devices difficult. In this paper, we propose a unified network that performs head detection and gaze estimation simultaneously, combining the two tasks into a single multi-task learning model. In this framework, head detection and gaze estimation share the same set of features, which allows the two tasks to reinforce each other and improves accuracy. We evaluated our model on the Gaze360 dataset; the gaze error dropped to 19.62 degrees while running at 23 fps.
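The reported 19.62-degree figure is a mean angular gaze error on Gaze360. As an illustration only (the paper's own evaluation code is not shown here), the standard 3D angular error between predicted and ground-truth gaze direction vectors can be computed as:

```python
import numpy as np

def angular_error_deg(pred: np.ndarray, gt: np.ndarray) -> np.ndarray:
    """Per-sample angular error in degrees between (N, 3) gaze vectors."""
    # Normalize both sets of direction vectors to unit length.
    pred = pred / np.linalg.norm(pred, axis=1, keepdims=True)
    gt = gt / np.linalg.norm(gt, axis=1, keepdims=True)
    # Clip the dot product to avoid NaNs from floating-point drift.
    cos_sim = np.clip(np.sum(pred * gt, axis=1), -1.0, 1.0)
    return np.degrees(np.arccos(cos_sim))

# Example: orthogonal gaze vectors are 90 degrees apart.
err = angular_error_deg(np.array([[1.0, 0.0, 0.0]]),
                        np.array([[0.0, 1.0, 0.0]]))
print(err.mean())  # 90.0
```

Averaging this error over the test set yields the mean angular error metric commonly used in gaze-estimation benchmarks.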