Pain Action Unit Detection in Critically Ill Patients
Subhash Nerella, Julie Cupka, Matthew Ruppert, Patrick Tighe, Azra Bihorac, Parisa Rashidi
Proceedings: Annual International Computer Software and Applications Conference (COMPSAC), 2021, pp. 645-651
DOI: 10.1109/compsac51774.2021.00094
Published: 2021-07-01 (Epub 2021-09-09)
Open access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8552410/pdf/nihms-1747870.pdf
Citations: 3
Abstract
Existing pain assessment methods in the intensive care unit rely on patient self-report or visual observation by nurses. Patient self-report is subjective and can suffer from poor recall. In the case of non-verbal patients, behavioral pain assessment methods provide limited granularity, are subjective, and place an additional burden on already overworked staff. Previous studies have shown the feasibility of autonomous pain expression assessment by detecting Facial Action Units (AUs). However, previous approaches for detecting facial pain AUs have been limited to controlled environments. In this study, for the first time, we collected and annotated a pain-related AU dataset, Pain-ICU, containing 55,085 images from critically ill adult patients. We evaluated the performance of OpenFace, an open-source facial behavior analysis tool, and a trained AU R-CNN model on our Pain-ICU dataset. Variables such as assisted breathing devices, environmental lighting, and patient orientation with respect to the camera make AU detection harder than in controlled settings. Although OpenFace has shown state-of-the-art results in general-purpose AU detection tasks, it could not accurately detect AUs in our Pain-ICU dataset (F1-score of 0.42). To address this problem, we trained the AU R-CNN model on our Pain-ICU dataset, resulting in a satisfactory average F1-score of 0.77. In this study, we show the feasibility of detecting facial pain AUs in uncontrolled ICU settings.
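The evaluation described in the abstract reduces to comparing frame-level binary AU predictions against manual annotations and reporting a per-AU F1-score plus their average. The sketch below illustrates one way such a comparison could be scripted in Python, assuming OpenFace's standard CSV output (presence columns such as AU04_c) and a hypothetical annotation file with matching columns; the file names and the particular list of pain-related AUs are illustrative assumptions, not details taken from the paper.

```python
"""
Minimal sketch: per-AU F1 evaluation of binary AU presence predictions.
Assumes OpenFace-style CSV output with presence columns like "AU04_c" and a
hypothetical ground-truth CSV with the same columns; paths and the AU list
are illustrative, not taken from the Pain-ICU study.
"""
import pandas as pd
from sklearn.metrics import f1_score

# Pain-related AUs commonly cited in facial pain research (illustrative set;
# the paper's exact AU list may differ).
PAIN_AUS = ["AU04", "AU06", "AU07", "AU09", "AU10", "AU43"]


def load_presence(csv_path, aus):
    """Read binary AU presence columns (e.g. AU04_c) from a CSV file."""
    # OpenFace pads header names with spaces, so skip them after the delimiter.
    df = pd.read_csv(csv_path, skipinitialspace=True)
    return {au: df[f"{au}_c"].astype(int).to_numpy() for au in aus}


def per_au_f1(pred_csv, truth_csv, aus=PAIN_AUS):
    """Compute an F1-score for each AU and the unweighted average across AUs."""
    pred = load_presence(pred_csv, aus)
    truth = load_presence(truth_csv, aus)
    scores = {au: f1_score(truth[au], pred[au]) for au in aus}
    scores["average"] = sum(scores.values()) / len(aus)
    return scores


if __name__ == "__main__":
    # Hypothetical file names for detector output and frame-level annotations.
    results = per_au_f1("openface_output.csv", "pain_icu_annotations.csv")
    for au, score in results.items():
        print(f"{au}: {score:.2f}")
```

The same evaluation loop applies unchanged to predictions exported from any other detector (for example an AU R-CNN model), as long as its per-frame outputs are written to a CSV with the same presence-column naming.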