Eye-tracking Data for Weakly Supervised Object Detection

Ching-Hsi Tseng, Yen-Pin Hsu, S. Yuan
Published in: 2020 IEEE Eurasia Conference on IOT, Communication and Engineering (ECICE), 2020-10-23. DOI: 10.1109/ECICE50847.2020.9301923. Citations: 1.

Abstract

We propose a weakly supervised object detection network based on eye-tracking data. A large number of training samples cannot be used because of two problems: (1) the labels of training samples for object detection are not all pixel-level, and (2) the cost of labeling is too high. We therefore introduce a framework whose input combines images carrying only image-level labels with eye-tracking data. Using the positions given by the eye-tracking data, the framework performs well even when sample annotation is incomplete. We use an eye tracker to collect data on the most interesting regions of the sample images and represent those data as fixations. The bounding boxes produced from the fixation data, together with the original image-level labels, then become the input to the object detection network. In this way, eye-tracking data helps us select bounding boxes and provides detailed location information. Experimental results verify that the framework is effective with the support of eye-tracking data.
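The abstract does not specify how fixations are converted into bounding boxes. A minimal sketch of one plausible approach, assuming each object's fixations are available as pixel coordinates and a box is taken as their extent plus a small margin (the function name, margin parameter, and margin value are illustrative assumptions, not the authors' method):

```python
def fixations_to_bbox(fixations, img_w, img_h, margin=0.1):
    """Derive a candidate bounding box from eye-tracking fixation points.

    fixations: list of (x, y) pixel coordinates of recorded fixations.
    margin: fraction of the box size padded on each side, since gaze
        tends to land inside an object rather than on its boundary.
    Returns (x_min, y_min, x_max, y_max) clipped to the image bounds.
    """
    xs = [x for x, _ in fixations]
    ys = [y for _, y in fixations]
    x_min, x_max = min(xs), max(xs)
    y_min, y_max = min(ys), max(ys)
    pad_x = (x_max - x_min) * margin
    pad_y = (y_max - y_min) * margin
    return (max(0.0, x_min - pad_x),
            max(0.0, y_min - pad_y),
            min(float(img_w), x_max + pad_x),
            min(float(img_h), y_max + pad_y))

# Example: fixations clustered on one object in a 640x480 image.
box = fixations_to_bbox([(100, 120), (140, 160), (130, 150)], 640, 480)
print(box)  # → (96.0, 116.0, 144.0, 164.0)
```

A box obtained this way would then be paired with the image-level label to form a weak training sample for the detector.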