DERNet: Driver Emotion Recognition Using Onboard Camera

IF 4.3 · CAS Tier 3 (Engineering & Technology) · JCR Q1 (ENGINEERING, ELECTRICAL & ELECTRONIC) · IEEE Intelligent Transportation Systems Magazine · Pub Date: 2023-12-07 · DOI: 10.1109/mits.2023.3333882
Dingyu Wang, Shaocheng Jia, Xin Pei, Chunyang Han, Danya Yao, Dezhi Liu
{"title":"DERNet:使用车载摄像头识别驾驶员情绪","authors":"Dingyu Wang, Shaocheng Jia, Xin Pei, Chunyang Han, Danya Yao, Dezhi Liu","doi":"10.1109/mits.2023.3333882","DOIUrl":null,"url":null,"abstract":"Driver emotion is considered an essential factor associated with driving behaviors and thus influences traffic safety. Dynamically and accurately recognizing the emotions of drivers plays an important role in road safety, especially for professional drivers, e.g., the drivers of passenger service vehicles. However, there is a lack of a benchmark to quantitatively evaluate the performance of driver emotion recognition performance, especially for various application situations. In this article, we propose an emotion recognition benchmark based on the driver emotion facial expression (DEFE) dataset, which consists of two splits: training and testing on the same set (split 1) and different sets (split 2) of drivers. These two splits correspond to various application scenarios and have diverse challenges. For the former, a driver emotion recognition network is proposed to provide a competitive baseline for the benchmark. For the latter, a novel driver representation difference minimization loss is proposed to enhance the learning of common representations for emotion recognition over different drivers. Moreover, the minimum required information for achieving a satisfactory performance is also explored on split 2. Comprehensive experiments on the DEFE dataset clearly demonstrate the superiority of the proposed methods compared to other state-of-the-art methods. An example application of applying the proposed methods and a voting mechanism to real-world data collected in a naturalistic environment reveals the strong practicality and readiness of the proposed methods. The codes and dataset splits are publicly available at <uri xmlns:mml=\"http://www.w3.org/1998/Math/MathML\" xmlns:xlink=\"http://www.w3.org/1999/xlink\">https://github.com/wdy806/CDERNet/</uri>\n.","PeriodicalId":48826,"journal":{"name":"IEEE Intelligent Transportation Systems Magazine","volume":null,"pages":null},"PeriodicalIF":4.3000,"publicationDate":"2023-12-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"DERNet: Driver Emotion Recognition Using Onboard Camera\",\"authors\":\"Dingyu Wang, Shaocheng Jia, Xin Pei, Chunyang Han, Danya Yao, Dezhi Liu\",\"doi\":\"10.1109/mits.2023.3333882\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Driver emotion is considered an essential factor associated with driving behaviors and thus influences traffic safety. Dynamically and accurately recognizing the emotions of drivers plays an important role in road safety, especially for professional drivers, e.g., the drivers of passenger service vehicles. However, there is a lack of a benchmark to quantitatively evaluate the performance of driver emotion recognition performance, especially for various application situations. In this article, we propose an emotion recognition benchmark based on the driver emotion facial expression (DEFE) dataset, which consists of two splits: training and testing on the same set (split 1) and different sets (split 2) of drivers. These two splits correspond to various application scenarios and have diverse challenges. For the former, a driver emotion recognition network is proposed to provide a competitive baseline for the benchmark. 
For the latter, a novel driver representation difference minimization loss is proposed to enhance the learning of common representations for emotion recognition over different drivers. Moreover, the minimum required information for achieving a satisfactory performance is also explored on split 2. Comprehensive experiments on the DEFE dataset clearly demonstrate the superiority of the proposed methods compared to other state-of-the-art methods. An example application of applying the proposed methods and a voting mechanism to real-world data collected in a naturalistic environment reveals the strong practicality and readiness of the proposed methods. The codes and dataset splits are publicly available at <uri xmlns:mml=\\\"http://www.w3.org/1998/Math/MathML\\\" xmlns:xlink=\\\"http://www.w3.org/1999/xlink\\\">https://github.com/wdy806/CDERNet/</uri>\\n.\",\"PeriodicalId\":48826,\"journal\":{\"name\":\"IEEE Intelligent Transportation Systems Magazine\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":4.3000,\"publicationDate\":\"2023-12-07\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Intelligent Transportation Systems Magazine\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://doi.org/10.1109/mits.2023.3333882\",\"RegionNum\":3,\"RegionCategory\":\"工程技术\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"ENGINEERING, ELECTRICAL & ELECTRONIC\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Intelligent Transportation Systems Magazine","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.1109/mits.2023.3333882","RegionNum":3,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Citations: 0

Abstract

Driver emotion is considered an essential factor associated with driving behaviors and thus influences traffic safety. Dynamically and accurately recognizing the emotions of drivers plays an important role in road safety, especially for professional drivers, e.g., the drivers of passenger service vehicles. However, there is a lack of a benchmark to quantitatively evaluate driver emotion recognition performance, especially across different application situations. In this article, we propose an emotion recognition benchmark based on the driver emotion facial expression (DEFE) dataset, which consists of two splits: training and testing on the same set (split 1) and different sets (split 2) of drivers. These two splits correspond to different application scenarios and pose distinct challenges. For the former, a driver emotion recognition network is proposed to provide a competitive baseline for the benchmark. For the latter, a novel driver representation difference minimization loss is proposed to enhance the learning of common representations for emotion recognition over different drivers. Moreover, the minimum information required to achieve satisfactory performance is also explored on split 2. Comprehensive experiments on the DEFE dataset clearly demonstrate the superiority of the proposed methods compared to other state-of-the-art methods. An example application of the proposed methods and a voting mechanism to real-world data collected in a naturalistic environment reveals the strong practicality and readiness of the proposed methods. The code and dataset splits are publicly available at https://github.com/wdy806/CDERNet/.
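The two benchmark splits differ in whether the drivers seen at test time also appear in training. The authoritative split files are published in the repository linked above; purely as an illustration of the distinction, the sketch below builds a subject-dependent split (clips shuffled freely, split 1) and a subject-independent split (whole drivers held out, split 2). The sample dictionary layout, the driver_id field, and the make_splits helper are hypothetical, not the authors' code.

```python
# Illustrative sketch only: the real DEFE split files ship with the repository;
# the field names and helper below are assumptions made for this example.
from collections import defaultdict
import random

def make_splits(samples, test_ratio=0.3, seed=0):
    """samples: list of dicts like {"driver_id": ..., "clip": ..., "label": ...}."""
    rng = random.Random(seed)

    # Split 1 (subject-dependent): train and test share the same drivers,
    # so clips from every driver can appear on both sides.
    shuffled = samples[:]
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * (1 - test_ratio))
    split1 = {"train": shuffled[:cut], "test": shuffled[cut:]}

    # Split 2 (subject-independent): hold out entire drivers, so the test
    # set contains only drivers never seen during training.
    by_driver = defaultdict(list)
    for s in samples:
        by_driver[s["driver_id"]].append(s)
    drivers = sorted(by_driver)
    rng.shuffle(drivers)
    cut = int(len(drivers) * (1 - test_ratio))
    split2 = {
        "train": [s for d in drivers[:cut] for s in by_driver[d]],
        "test": [s for d in drivers[cut:] for s in by_driver[d]],
    }
    return split1, split2
```

Split 2 is the harder protocol, since the model must generalize to unseen faces and expression styles rather than memorizing per-driver appearance.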
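The driver representation difference minimization loss is defined precisely in the article itself; as a hedged sketch of the general idea only, one could penalize how far each driver's mean feature embedding sits from the batch-wide center, nudging the network toward driver-invariant emotion features. The function names, the weighting factor lam, and the choice of a global center below are assumptions for illustration, not the published formulation.

```python
# Hedged sketch of a driver-invariance penalty added to a standard
# classification loss. This is NOT the paper's exact loss.
import torch
import torch.nn.functional as F

def driver_difference_loss(features, driver_ids):
    """features: (B, D) embeddings; driver_ids: (B,) integer driver labels."""
    means = [features[driver_ids == d].mean(dim=0) for d in driver_ids.unique()]
    means = torch.stack(means)                    # (num_drivers, D) per-driver means
    center = means.mean(dim=0, keepdim=True)      # shared center embedding
    # Pull every driver's mean embedding toward the shared center.
    return ((means - center) ** 2).sum(dim=1).mean()

def total_loss(logits, labels, features, driver_ids, lam=0.1):
    # Emotion classification term plus the (hypothetical) invariance term.
    return F.cross_entropy(logits, labels) + lam * driver_difference_loss(features, driver_ids)
```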
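The real-world example application pairs the recognition network with a voting mechanism. The abstract does not spell out the scheme; a minimal sketch of frame-level majority voting over a window, one common choice for stabilizing per-frame predictions, is shown below. The vote helper and the five-frame window are illustrative assumptions.

```python
# Hedged sketch of a simple voting step: aggregate frame-level emotion
# predictions over a window by majority vote. The paper's scheme may differ.
from collections import Counter

def vote(frame_predictions):
    """frame_predictions: list of per-frame emotion labels for one window."""
    label, _ = Counter(frame_predictions).most_common(1)[0]
    return label

# Example: a 5-frame window where "happy" wins 3-2.
print(vote(["happy", "neutral", "happy", "happy", "neutral"]))  # -> "happy"
```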
Source journal
IEEE Intelligent Transportation Systems Magazine
Subject categories: ENGINEERING, ELECTRICAL & ELECTRONIC; TRANSPORTATION SCIENCE & TECHNOLOGY
CiteScore: 8.00
Self-citation rate: 8.30%
Articles published: 147
About the journal: The IEEE Intelligent Transportation Systems Magazine (ITSM) publishes peer-reviewed articles that provide innovative research ideas and application results, report significant application case studies, and raise awareness of pressing research and application challenges in all areas of intelligent transportation systems. In contrast to the highly academic IEEE Transactions on Intelligent Transportation Systems, the ITS Magazine focuses on providing needed information to all members of the IEEE ITS Society, serving as a dissemination vehicle for ITS Society members and others to learn about state-of-the-art developments and progress in ITS research and applications. High-quality tutorials, surveys, successful implementations, technology reviews, lessons learned, policy and societal impacts, and ITS educational issues are published as well. The ITS Magazine also serves as an ideal communication vehicle between the governing body of the ITS Society and its membership, and promotes ITS community development and growth.
Latest articles from this journal
En Route Congestion Prediction Method for Air Route Network Based on Spatiotemporal Graph Convolution Network and Attention
Coordinated Operation Strategy Design for Virtually Coupled Train Set: A Multiagent Reinforcement Learning Approach
Distributed Automation System Lab at University of Naples Federico II [ITS Research Lab]
Calendar [Calendar]