An Explainable AI powered Early Warning System to address Patient Readmission Risk
Tanusree De, Ahmeduvesh Mevawala, Ramyasri Nemani
2021 IEEE National Biomedical Engineering Conference (NBEC), 9 November 2021. DOI: 10.1109/nbec53282.2021.9618766
Hospital readmission is undesirable for all parties involved: the patient, the hospital, and the insurer. Readmission puts patients at risk of hospital-acquired infections, medical errors, and unfavorable outcomes; for hospitals it leads to a gradual increase in operating expenses, and for payers it means additional cost. Predicting the likelihood of patient readmission is therefore critical and highly relevant to all parties involved. Powerful machine learning algorithms such as Random Forest, XGBoost, and neural networks can be used to build predictive models of readmission probability. However, these models are black boxes: they can predict with high accuracy, but they do not explain how they arrived at a prediction. This is where Explainable AI comes in. In this paper, we develop a novel model-specific local explanation methodology that derives explanations at the individual patient level by considering the inner learning process of a Random Forest model. The explanations derived from the proposed methodology are human-interpretable irrespective of the complexity of the underlying Random Forest model, and they provide guidance to the doctor for prescribing the remedies needed to prevent the patient from being readmitted within thirty days of discharge.
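The abstract does not detail the authors' explanation methodology. As a purely illustrative sketch of the general idea of model-specific local explanations for a Random Forest (a decision-path attribution in the style of treeinterpreter, not the paper's method), one can credit each change in a node's class-1 fraction along a sample's decision path to the feature split at that node, averaged over trees; the synthetic data and all names below are assumptions:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Synthetic stand-in for patient features (labs, vitals, demographics);
# y = 1 plays the role of "readmitted within 30 days".
X, y = make_classification(n_samples=500, n_features=6, random_state=0)
rf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

def local_contributions(rf, x):
    """Per-feature contributions to the predicted class-1 probability for
    one sample: along each tree's decision path, the change in the node's
    class-1 fraction is credited to the feature split at that node."""
    contrib = np.zeros(x.shape[0])
    for tree in rf.estimators_:
        t = tree.tree_
        node = 0
        # class-1 fraction at the root (the tree's prior)
        prev = t.value[node][0][1] / t.value[node][0].sum()
        while t.children_left[node] != -1:  # walk down until a leaf
            feat = t.feature[node]
            node = (t.children_left[node] if x[feat] <= t.threshold[node]
                    else t.children_right[node])
            cur = t.value[node][0][1] / t.value[node][0].sum()
            contrib[feat] += cur - prev  # credit the change to this feature
            prev = cur
    return contrib / len(rf.estimators_)

x = X[0]
c = local_contributions(rf, x)
# Additivity check: root bias + summed contributions telescope to the
# forest's predicted probability for this sample.
bias = np.mean([t.tree_.value[0][0][1] / t.tree_.value[0][0].sum()
                for t in rf.estimators_])
prob = rf.predict_proba(x.reshape(1, -1))[0, 1]
assert abs(bias + c.sum() - prob) < 1e-6
```

Because the contributions sum exactly to the model's own probability, a per-patient ranking of them is faithful to the forest's prediction, which is the property a doctor-facing explanation of readmission risk would need.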