{"title":"Detecting event-related driving anger with facial features captured by smartphones.","authors":"Yi Wang, Xin Zhou, Yang Yang, Wei Zhang","doi":"10.1080/00140139.2024.2418303","DOIUrl":null,"url":null,"abstract":"<p><p>Driving anger is a serious global issue that poses risks to road safety, thus necessitating the development of effective detection and intervention methods. This study investigated the feasibility of using smartphones to capture facial expressions to detect event-related driving anger. Sixty drivers completed the driving tasks in scenarios with and without multi-stage road events and were induced to angry and neutral states, respectively. Their physiological signals, facial expressions, and subjective data were collected. Four feature combinations and six machine learning algorithms were used to construct driving anger detection models. The model combining facial features and the XGBoost algorithm outperformed models using physiological features or other algorithms, achieving an accuracy of 87.04% and an F1-score of 85.06%. Eyes, mouth, and brows were identified as anger-sensitive facial areas. Additionally, incorporating individual characteristics into models further improved classification performance. This study provides a contactless and highly accessible approach for event-related driving anger detection.<b>Practitioner Summary:</b> This study proposed a cost-effective and contactless approach for event-related and real-time driving anger detection and could potentially provide insights into the design of emotional interactions in intelligent vehicles.</p>","PeriodicalId":2,"journal":{"name":"ACS Applied Bio Materials","volume":null,"pages":null},"PeriodicalIF":4.6000,"publicationDate":"2024-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"ACS Applied Bio Materials","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.1080/00140139.2024.2418303","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"MATERIALS SCIENCE, BIOMATERIALS","Score":null,"Total":0}
Abstract
Driving anger is a serious global issue that poses risks to road safety, necessitating the development of effective detection and intervention methods. This study investigated the feasibility of using smartphone-captured facial expressions to detect event-related driving anger. Sixty drivers completed driving tasks in scenarios with and without multi-stage road events, which induced angry and neutral states, respectively. Their physiological signals, facial expressions, and subjective data were collected. Four feature combinations and six machine learning algorithms were used to construct driving anger detection models. The model combining facial features with the XGBoost algorithm outperformed models using physiological features or other algorithms, achieving an accuracy of 87.04% and an F1-score of 85.06%. The eyes, mouth, and brows were identified as anger-sensitive facial areas. Additionally, incorporating individual characteristics into the models further improved classification performance. This study provides a contactless and highly accessible approach to event-related driving anger detection.

Practitioner Summary: This study proposed a cost-effective, contactless approach to event-related and real-time driving anger detection, and could inform the design of emotional interactions in intelligent vehicles.
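The abstract reports a binary (angry vs. neutral) classification pipeline built on facial features and XGBoost, evaluated with accuracy and F1-score. The sketch below is a minimal illustration of that kind of pipeline, not the authors' implementation: the facial features are random placeholders standing in for per-event descriptors of the eye, mouth, and brow regions, and the hyperparameters and data shapes are hypothetical.

# Minimal sketch of an XGBoost anger-vs-neutral classifier evaluated with
# accuracy and F1, assuming precomputed facial-feature vectors per event.
# All data here are synthetic placeholders; feature extraction from
# smartphone video is not shown.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, f1_score
from xgboost import XGBClassifier

rng = np.random.default_rng(0)

# Placeholder data: 600 event samples x 30 facial features,
# labels 1 = angry, 0 = neutral (hypothetical dimensions).
X = rng.normal(size=(600, 30))
y = rng.integers(0, 2, size=600)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0
)

# Hyperparameters are illustrative, not those used in the study.
model = XGBClassifier(n_estimators=200, max_depth=4,
                      learning_rate=0.1, eval_metric="logloss")
model.fit(X_train, y_train)

pred = model.predict(X_test)
print(f"accuracy: {accuracy_score(y_test, pred):.4f}")
print(f"F1-score: {f1_score(y_test, pred):.4f}")

With real features, the same evaluation loop would report the kind of accuracy (87.04%) and F1-score (85.06%) figures cited in the abstract; individual characteristics could be appended as extra feature columns, which the study reports as further improving classification performance.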