Alfian Tan, Joy Egede, R. Remenyte-Prescott, Michel Valstar, Don Sharkey
Title: An Automated Performance Evaluation of the Newborn Life Support Procedure
DOI: 10.1109/RAMS51492.2024.10457793
Published in: 2024 Annual Reliability and Maintainability Symposium (RAMS), pp. 1-6
Publication date: 2024-01-22
Citations: 0
Abstract
This research develops an automated action recognition method to evaluate the performance of the Newborn Life Support (NLS) procedure. The method can detect deviations in the procedure, such as missing steps and incorrect actions, which reflect the reliability of the protocol as performed. It is also part of the work towards integration with the NLS reliability model. A combination of image segmentation and action classification methods is used. A U-Net deep learning model is trained to segment 18 object classes. Every 150 consecutive segmented video frames are then grouped for action analysis, and four types of handcrafted features are extracted from each group. A training strategy using traditional machine learning models is developed to deal with an imbalanced dataset and to reduce system complexity. The predicted action segments are visually examined to confirm their practicality. Results show that the NLS first step of wet towel removal was correctly recognized in 12 of 23 videos (52.2%), indicating the model's potential for determining whether this critical action is performed correctly and at the right time.
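The pipeline described above (segment frames, group every 150 consecutive frames into a window, extract handcrafted features per window, classify with a traditional machine learning model trained to handle class imbalance) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the feature definitions (`handcrafted_features`), the synthetic masks, and the choice of a class-weighted random forest are all assumptions standing in for the paper's unspecified details.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

WINDOW = 150  # frames per action-analysis group, as stated in the abstract

def group_frames(mask_frames, window=WINDOW):
    """Group consecutive segmentation-mask frames into fixed-size windows."""
    n = len(mask_frames) // window
    return [mask_frames[i * window:(i + 1) * window] for i in range(n)]

def handcrafted_features(window_masks):
    """Illustrative per-window features (hypothetical, not the paper's four):
    mean object-pixel area, area variance, mean mask centroid (y, x)."""
    areas = np.array([m.sum() for m in window_masks], dtype=float)
    ys, xs = [], []
    for m in window_masks:
        pts = np.argwhere(m)  # coordinates of segmented (foreground) pixels
        if len(pts):
            ys.append(pts[:, 0].mean())
            xs.append(pts[:, 1].mean())
    cy = float(np.mean(ys)) if ys else 0.0
    cx = float(np.mean(xs)) if xs else 0.0
    return np.array([areas.mean(), areas.var(), cy, cx])

# Synthetic demo: two "actions" whose masks differ in object area
rng = np.random.default_rng(0)

def synth_windows(label, n_windows=20):
    feats = []
    density = 0.2 if label == 0 else 0.6  # foreground pixel density per class
    for _ in range(n_windows):
        masks = [(rng.random((32, 32)) < density).astype(np.uint8)
                 for _ in range(WINDOW)]
        feats.append(handcrafted_features(masks))
    return np.array(feats)

X = np.vstack([synth_windows(0), synth_windows(1)])
y = np.array([0] * 20 + [1] * 20)

# class_weight="balanced" is one standard way to counter an imbalanced dataset
clf = RandomForestClassifier(n_estimators=50, class_weight="balanced",
                             random_state=0)
clf.fit(X, y)
print(clf.score(X, y))
```

In practice the windows would come from U-Net segmentation masks of the NLS videos rather than random noise, and the classifier and features would be chosen against the real imbalanced action distribution.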