Multi-camera Face Tracking for Estimating User’s Discomfort with Automated Vehicle Operations

Matthias Beggiato, Nadine Rauh, J. Krems
{"title":"多摄像头人脸跟踪估计用户对自动驾驶车辆操作的不适","authors":"Matthias Beggiato, Nadine Rauh, J. Krems","doi":"10.54941/ahfe1001105","DOIUrl":null,"url":null,"abstract":"Face tracking as innovative and unobtrusive sensor technology offers new possibilities for driver state monitoring regarding discomfort in automated driving. To explore the potential of automated facial expression analysis, video data of two driving simulator studies were analyzed using the Visage facial features and analysis software. A gender-balanced sample of 81 participants between 24 and 84 years took part in the studies. All participants were driven in highly automated mode on the same standardized track, consisting of three close approach situations to a truck driving ahead. By pressing the lever of a handset control, all participants could report perceived discomfort continuously. Tracking values for 23 facial action units were extracted from multiple video camera streams, z-transformed and averaged from 10 s before pressing the handset control until 10 s after to show changes over time. Results showed situation-related pressing and stretching of the lips, a pushback-movement of the head, raising of inner brows and upper lids as well as reduced eye closure. These patterns could be interpreted as visual attention, tension and surprise. Overall, there is potential of facial expression analysis for contributing information about users’ comfort with automated vehicle operations. However, effects became manifest on aggregated data level; obtaining stable and reliable results on individual level remains a challenging task.","PeriodicalId":116806,"journal":{"name":"Human Systems Engineering and Design (IHSED2021) Future Trends and Applications","volume":"9 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Multi-camera Face Tracking for Estimating User’s Discomfort with Automated Vehicle Operations\",\"authors\":\"Matthias Beggiato, Nadine Rauh, J. Krems\",\"doi\":\"10.54941/ahfe1001105\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Face tracking as innovative and unobtrusive sensor technology offers new possibilities for driver state monitoring regarding discomfort in automated driving. To explore the potential of automated facial expression analysis, video data of two driving simulator studies were analyzed using the Visage facial features and analysis software. A gender-balanced sample of 81 participants between 24 and 84 years took part in the studies. All participants were driven in highly automated mode on the same standardized track, consisting of three close approach situations to a truck driving ahead. By pressing the lever of a handset control, all participants could report perceived discomfort continuously. Tracking values for 23 facial action units were extracted from multiple video camera streams, z-transformed and averaged from 10 s before pressing the handset control until 10 s after to show changes over time. Results showed situation-related pressing and stretching of the lips, a pushback-movement of the head, raising of inner brows and upper lids as well as reduced eye closure. These patterns could be interpreted as visual attention, tension and surprise. Overall, there is potential of facial expression analysis for contributing information about users’ comfort with automated vehicle operations. 
However, effects became manifest on aggregated data level; obtaining stable and reliable results on individual level remains a challenging task.\",\"PeriodicalId\":116806,\"journal\":{\"name\":\"Human Systems Engineering and Design (IHSED2021) Future Trends and Applications\",\"volume\":\"9 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"1900-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Human Systems Engineering and Design (IHSED2021) Future Trends and Applications\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.54941/ahfe1001105\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Human Systems Engineering and Design (IHSED2021) Future Trends and Applications","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.54941/ahfe1001105","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1

Abstract

Face tracking, as an innovative and unobtrusive sensor technology, offers new possibilities for monitoring driver state with respect to discomfort in automated driving. To explore the potential of automated facial expression analysis, video data from two driving simulator studies were analyzed using the Visage facial features and analysis software. A gender-balanced sample of 81 participants between 24 and 84 years of age took part in the studies. All participants were driven in highly automated mode on the same standardized track, which included three close-approach situations to a truck driving ahead. By pressing the lever of a handset control, participants could continuously report perceived discomfort. Tracking values for 23 facial action units were extracted from multiple video camera streams, z-transformed, and averaged from 10 s before the handset press until 10 s after it to show changes over time. Results showed situation-related pressing and stretching of the lips, a pushback movement of the head, raising of the inner brows and upper lids, and reduced eye closure. These patterns can be interpreted as visual attention, tension, and surprise. Overall, facial expression analysis shows potential for contributing information about users’ comfort with automated vehicle operations. However, the effects became manifest only at the aggregated data level; obtaining stable and reliable results at the individual level remains a challenging task.
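The analysis step described in the abstract (per-participant z-standardization of action-unit signals, followed by event-locked averaging from 10 s before to 10 s after the handset press) can be sketched roughly as below. This is a minimal illustration, not the authors’ actual pipeline: the 50 Hz sampling rate, the array shapes, and the function names (zscore_per_participant, event_locked_average) are assumptions made only for the example.

```python
import numpy as np

FPS = 50        # assumed sampling rate of the face-tracking stream (not stated in the abstract)
WINDOW_S = 10   # 10 s before and after the handset press, as described in the abstract


def zscore_per_participant(au_signal: np.ndarray) -> np.ndarray:
    """Z-transform one participant's action-unit time series (mean 0, SD 1)."""
    mean = np.nanmean(au_signal)
    std = np.nanstd(au_signal)
    return (au_signal - mean) / std if std > 0 else au_signal - mean


def event_locked_average(au_signal: np.ndarray, event_frames: list[int]) -> np.ndarray:
    """Cut windows from -10 s to +10 s around each event and average them frame-wise."""
    half = WINDOW_S * FPS
    segments = [
        au_signal[f - half : f + half]
        for f in event_frames
        if f - half >= 0 and f + half < len(au_signal)
    ]
    return np.nanmean(np.stack(segments), axis=0)


# Example with synthetic data: one action-unit trace and two handset presses.
rng = np.random.default_rng(0)
au_trace = rng.normal(size=5 * 60 * FPS)      # 5 minutes of tracking values
presses = [60 * FPS, 180 * FPS]               # press events at 1 min and 3 min
z_trace = zscore_per_participant(au_trace)
mean_course = event_locked_average(z_trace, presses)  # 20 s averaged time course
```

In practice, such a per-participant standardization and averaging step would be repeated for each of the 23 action units and then aggregated across participants, which is consistent with the abstract’s note that effects only became manifest at the aggregated data level.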