Detecting Screen Presence with Activity-Oriented RGB Camera in Egocentric Videos.

Amit Adate, Soroush Shahi, Rawan Alharbi, Sougata Sen, Yang Gao, Aggelos K Katsaggelos, Nabil Alshurafa
{"title":"在以自我为中心的视频中使用面向活动的RGB相机检测屏幕存在。","authors":"Amit Adate,&nbsp;Soroush Shahi,&nbsp;Rawan Alharbi,&nbsp;Sougata Sen,&nbsp;Yang Gao,&nbsp;Aggelos K Katsaggelos,&nbsp;Nabil Alshurafa","doi":"10.1109/percomworkshops53856.2022.9767433","DOIUrl":null,"url":null,"abstract":"<p><p>Screen time is associated with several health risk behaviors including mindless eating, sedentary behavior, and decreased academic performance. Screen time behavior is traditionally assessed with self-report measures, which are known to be burdensome, inaccurate, and imprecise. Recent methods to automatically detect screen time are geared more towards detecting television screens from wearable cameras that record high-resolution video. Activity-oriented wearable cameras (i.e., cameras oriented towards the wearer with a fisheye lens) have recently been designed and shown to reduce privacy concerns, yet pose a greater challenge in capturing screens due to their orientation and fewer pixels on target. Methods that detect screens from low-power, low-resolution wearable camera video are needed given the increased adoption of such devices in longitudinal studies. We propose a method that leverages deep learning algorithms and lower-resolution images from an activity-oriented camera to detect screen presence from multiple types of screens with high variability of pixel on target (e.g., near and far TV, smartphones, laptops, and tablets). We test our system in a real-world study comprising 10 individuals, 80 hours of data, and 1.2 million low-resolution RGB frames. Our results outperform existing state-of-the-art video screen detection methods yielding an F1-score of 81%. This paper demonstrates the potential for detecting screen-watching behavior in longitudinal studies using activity-oriented cameras, paving the way for a nuanced understanding of screen time's relationship with health risk behaviors.</p>","PeriodicalId":91950,"journal":{"name":"Proceedings of the ... IEEE International Conference on Pervasive Computing and Communications Workshops : PerCom ... IEEE International Conference on Pervasive Computing and Communications. Workshops","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2022-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9704366/pdf/nihms-1835828.pdf","citationCount":"1","resultStr":"{\"title\":\"Detecting Screen Presence with Activity-Oriented RGB Camera in Egocentric Videos.\",\"authors\":\"Amit Adate,&nbsp;Soroush Shahi,&nbsp;Rawan Alharbi,&nbsp;Sougata Sen,&nbsp;Yang Gao,&nbsp;Aggelos K Katsaggelos,&nbsp;Nabil Alshurafa\",\"doi\":\"10.1109/percomworkshops53856.2022.9767433\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>Screen time is associated with several health risk behaviors including mindless eating, sedentary behavior, and decreased academic performance. Screen time behavior is traditionally assessed with self-report measures, which are known to be burdensome, inaccurate, and imprecise. Recent methods to automatically detect screen time are geared more towards detecting television screens from wearable cameras that record high-resolution video. Activity-oriented wearable cameras (i.e., cameras oriented towards the wearer with a fisheye lens) have recently been designed and shown to reduce privacy concerns, yet pose a greater challenge in capturing screens due to their orientation and fewer pixels on target. 
Methods that detect screens from low-power, low-resolution wearable camera video are needed given the increased adoption of such devices in longitudinal studies. We propose a method that leverages deep learning algorithms and lower-resolution images from an activity-oriented camera to detect screen presence from multiple types of screens with high variability of pixel on target (e.g., near and far TV, smartphones, laptops, and tablets). We test our system in a real-world study comprising 10 individuals, 80 hours of data, and 1.2 million low-resolution RGB frames. Our results outperform existing state-of-the-art video screen detection methods yielding an F1-score of 81%. This paper demonstrates the potential for detecting screen-watching behavior in longitudinal studies using activity-oriented cameras, paving the way for a nuanced understanding of screen time's relationship with health risk behaviors.</p>\",\"PeriodicalId\":91950,\"journal\":{\"name\":\"Proceedings of the ... IEEE International Conference on Pervasive Computing and Communications Workshops : PerCom ... IEEE International Conference on Pervasive Computing and Communications. Workshops\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-03-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9704366/pdf/nihms-1835828.pdf\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the ... IEEE International Conference on Pervasive Computing and Communications Workshops : PerCom ... IEEE International Conference on Pervasive Computing and Communications. Workshops\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/percomworkshops53856.2022.9767433\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"2022/5/6 0:00:00\",\"PubModel\":\"Epub\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the ... IEEE International Conference on Pervasive Computing and Communications Workshops : PerCom ... IEEE International Conference on Pervasive Computing and Communications. Workshops","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/percomworkshops53856.2022.9767433","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2022/5/6 0:00:00","PubModel":"Epub","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1

Abstract


Screen time is associated with several health risk behaviors including mindless eating, sedentary behavior, and decreased academic performance. Screen time behavior is traditionally assessed with self-report measures, which are known to be burdensome, inaccurate, and imprecise. Recent methods to automatically detect screen time are geared more towards detecting television screens from wearable cameras that record high-resolution video. Activity-oriented wearable cameras (i.e., cameras oriented towards the wearer with a fisheye lens) have recently been designed and shown to reduce privacy concerns, yet pose a greater challenge in capturing screens due to their orientation and fewer pixels on target. Methods that detect screens from low-power, low-resolution wearable camera video are needed given the increased adoption of such devices in longitudinal studies. We propose a method that leverages deep learning algorithms and lower-resolution images from an activity-oriented camera to detect screen presence from multiple types of screens with high variability of pixel on target (e.g., near and far TV, smartphones, laptops, and tablets). We test our system in a real-world study comprising 10 individuals, 80 hours of data, and 1.2 million low-resolution RGB frames. Our results outperform existing state-of-the-art video screen detection methods yielding an F1-score of 81%. This paper demonstrates the potential for detecting screen-watching behavior in longitudinal studies using activity-oriented cameras, paving the way for a nuanced understanding of screen time's relationship with health risk behaviors.
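
The paper does not include code, but as a rough illustration of the kind of pipeline the abstract describes, the sketch below shows a minimal per-frame binary screen-presence classifier over low-resolution RGB input, written in PyTorch. The architecture, the 128x128 input size, and all names here are assumptions for illustration, not the authors' model.

```python
# Illustrative sketch only: a small CNN for per-frame binary
# screen-presence classification on low-resolution RGB frames.
# The layer sizes and 128x128 input are assumptions, not the
# architecture described in the paper.
import torch
import torch.nn as nn

class ScreenPresenceNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),   # RGB in
            nn.ReLU(),
            nn.MaxPool2d(2),                              # 128 -> 64
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                              # 64 -> 32
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),                      # global average pool
        )
        self.classifier = nn.Linear(64, 1)  # single logit: screen present?

    def forward(self, x):
        x = self.features(x).flatten(1)
        return self.classifier(x)

model = ScreenPresenceNet()
frames = torch.randn(8, 3, 128, 128)   # a batch of low-resolution RGB frames
probs = torch.sigmoid(model(frames))   # per-frame screen-presence probability
print(probs.shape)                     # torch.Size([8, 1])
```

In a setup like this, per-frame probabilities would typically be thresholded and aggregated over time to produce screen-time estimates, which is one plausible way results such as the reported 81% F1-score could be evaluated at the frame level.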
