Yusuf Abas Mohamed, Bee Ee Khoo, Mohd Shahrimie Mohd Asaari, Mohd Ezane Aziz, Fattah Rahiman Ghazali
{"title":"解码黑箱:用于癌症诊断、预后和治疗规划的可解释人工智能(XAI)--最新系统综述。","authors":"Yusuf Abas Mohamed , Bee Ee Khoo , Mohd Shahrimie Mohd Asaari , Mohd Ezane Aziz , Fattah Rahiman Ghazali","doi":"10.1016/j.ijmedinf.2024.105689","DOIUrl":null,"url":null,"abstract":"<div><h3>Objective</h3><div>Explainable Artificial Intelligence (XAI) is increasingly recognized as a crucial tool in cancer care, with significant potential to enhance diagnosis, prognosis, and treatment planning. However, the holistic integration of XAI across all stages of cancer care remains underexplored. This review addresses this gap by systematically evaluating the role of XAI in these critical areas, identifying key challenges and emerging trends.</div></div><div><h3>Materials and methods</h3><div>Following the PRISMA guidelines, a comprehensive literature search was conducted across Scopus and Web of Science, focusing on publications from January 2020 to May 2024. After rigorous screening and quality assessment, 69 studies were selected for in-depth analysis.</div></div><div><h3>Results</h3><div>The review identified critical gaps in the application of XAI within cancer care, notably the exclusion of clinicians in 83% of studies, which raises concerns about real-world applicability and may lead to explanations that are technically sound but clinically irrelevant. Additionally, 87% of studies lacked rigorous evaluation of XAI explanations, compromising their reliability in clinical practice. The dominance of post-hoc visual methods like SHAP, LIME and Grad-CAM reflects a trend toward explanations that may be inherently flawed due to specific input perturbations and simplifying assumptions. The lack of formal evaluation metrics and standardization constrains broader XAI adoption in clinical settings, creating a disconnect between AI development and clinical integration. Moreover, translating XAI insights into actionable clinical decisions remains challenging due to the absence of clear guidelines for integrating these tools into clinical workflows.</div></div><div><h3>Conclusion</h3><div>This review highlights the need for greater clinician involvement, standardized XAI evaluation metrics, clinician-centric interfaces, context-aware XAI systems, and frameworks for integrating XAI into clinical workflows for informed clinical decision-making and improved outcomes in cancer care.</div></div>","PeriodicalId":54950,"journal":{"name":"International Journal of Medical Informatics","volume":"193 ","pages":"Article 105689"},"PeriodicalIF":3.7000,"publicationDate":"2024-11-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Decoding the black box: Explainable AI (XAI) for cancer diagnosis, prognosis, and treatment planning-A state-of-the art systematic review\",\"authors\":\"Yusuf Abas Mohamed , Bee Ee Khoo , Mohd Shahrimie Mohd Asaari , Mohd Ezane Aziz , Fattah Rahiman Ghazali\",\"doi\":\"10.1016/j.ijmedinf.2024.105689\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><h3>Objective</h3><div>Explainable Artificial Intelligence (XAI) is increasingly recognized as a crucial tool in cancer care, with significant potential to enhance diagnosis, prognosis, and treatment planning. However, the holistic integration of XAI across all stages of cancer care remains underexplored. 
This review addresses this gap by systematically evaluating the role of XAI in these critical areas, identifying key challenges and emerging trends.</div></div><div><h3>Materials and methods</h3><div>Following the PRISMA guidelines, a comprehensive literature search was conducted across Scopus and Web of Science, focusing on publications from January 2020 to May 2024. After rigorous screening and quality assessment, 69 studies were selected for in-depth analysis.</div></div><div><h3>Results</h3><div>The review identified critical gaps in the application of XAI within cancer care, notably the exclusion of clinicians in 83% of studies, which raises concerns about real-world applicability and may lead to explanations that are technically sound but clinically irrelevant. Additionally, 87% of studies lacked rigorous evaluation of XAI explanations, compromising their reliability in clinical practice. The dominance of post-hoc visual methods like SHAP, LIME and Grad-CAM reflects a trend toward explanations that may be inherently flawed due to specific input perturbations and simplifying assumptions. The lack of formal evaluation metrics and standardization constrains broader XAI adoption in clinical settings, creating a disconnect between AI development and clinical integration. Moreover, translating XAI insights into actionable clinical decisions remains challenging due to the absence of clear guidelines for integrating these tools into clinical workflows.</div></div><div><h3>Conclusion</h3><div>This review highlights the need for greater clinician involvement, standardized XAI evaluation metrics, clinician-centric interfaces, context-aware XAI systems, and frameworks for integrating XAI into clinical workflows for informed clinical decision-making and improved outcomes in cancer care.</div></div>\",\"PeriodicalId\":54950,\"journal\":{\"name\":\"International Journal of Medical Informatics\",\"volume\":\"193 \",\"pages\":\"Article 105689\"},\"PeriodicalIF\":3.7000,\"publicationDate\":\"2024-11-04\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"International Journal of Medical Informatics\",\"FirstCategoryId\":\"3\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S1386505624003526\",\"RegionNum\":2,\"RegionCategory\":\"医学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"COMPUTER SCIENCE, INFORMATION SYSTEMS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Medical Informatics","FirstCategoryId":"3","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1386505624003526","RegionNum":2,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
Decoding the black box: Explainable AI (XAI) for cancer diagnosis, prognosis, and treatment planning - A state-of-the-art systematic review
Objective
Explainable Artificial Intelligence (XAI) is increasingly recognized as a crucial tool in cancer care, with significant potential to enhance diagnosis, prognosis, and treatment planning. However, the holistic integration of XAI across all stages of cancer care remains underexplored. This review addresses this gap by systematically evaluating the role of XAI in these critical areas, identifying key challenges and emerging trends.
Materials and methods
Following the PRISMA guidelines, a comprehensive literature search was conducted across Scopus and Web of Science, focusing on publications from January 2020 to May 2024. After rigorous screening and quality assessment, 69 studies were selected for in-depth analysis.
Results
The review identified critical gaps in the application of XAI within cancer care, most notably the exclusion of clinicians from 83% of studies, which raises concerns about real-world applicability and risks producing explanations that are technically sound but clinically irrelevant. Additionally, 87% of studies lacked rigorous evaluation of XAI explanations, compromising their reliability in clinical practice. The dominance of post-hoc visual methods such as SHAP, LIME, and Grad-CAM reflects a trend toward explanations that may be inherently flawed due to their reliance on specific input perturbations and simplifying assumptions. The lack of formal evaluation metrics and standardization constrains broader XAI adoption in clinical settings, creating a disconnect between AI development and clinical integration. Moreover, translating XAI insights into actionable clinical decisions remains challenging because there are no clear guidelines for integrating these tools into clinical workflows.
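To make the instability concern concrete, the sketch below illustrates how a perturbation-based, post-hoc explanation can shift with its sampling settings. It is not drawn from any of the reviewed studies: the dataset (scikit-learn's breast-cancer data), the model (a random forest), and the seeds and sample sizes are illustrative placeholders, and the code assumes the Python lime package.

# Hedged sketch: how perturbation settings can change a post-hoc explanation.
# Dataset, model, and settings are illustrative only.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from lime.lime_tabular import LimeTabularExplainer

data = load_breast_cancer()
X, y = data.data, data.target
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

instance = X[0]  # a single patient record to explain

# Two explainer configurations that differ only in random seed and
# number of perturbed samples drawn around the instance.
for seed, n_samples in [(1, 100), (2, 5000)]:
    explainer = LimeTabularExplainer(
        X,
        feature_names=list(data.feature_names),
        class_names=list(data.target_names),
        mode="classification",
        random_state=seed,
    )
    exp = explainer.explain_instance(
        instance, model.predict_proba, num_features=5, num_samples=n_samples
    )
    print(f"seed={seed}, num_samples={n_samples}:")
    for feature, weight in exp.as_list():
        print(f"  {feature}: {weight:+.3f}")

The ranked features (and sometimes their signs) can differ between the two runs even though the model and the instance are unchanged, which is the kind of sensitivity to input perturbations and simplifying assumptions that the reviewed literature flags for post-hoc methods.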
Conclusion
This review highlights the need for greater clinician involvement, standardized XAI evaluation metrics, clinician-centric interfaces, context-aware XAI systems, and frameworks for integrating XAI into clinical workflows, so as to support informed clinical decision-making and improve outcomes in cancer care.
About the journal
International Journal of Medical Informatics provides an international medium for dissemination of original results and interpretative reviews concerning the field of medical informatics. The Journal emphasizes the evaluation of systems in healthcare settings.
The scope of the journal covers:
Information systems, including national or international registration systems, hospital information systems, departmental and/or physician's office systems, document handling systems, electronic medical record systems, standardization, systems integration, etc.;
Computer-aided medical decision support systems using heuristic, algorithmic and/or statistical methods as exemplified in decision theory, protocol development, artificial intelligence, etc.;
Educational computer-based programs pertaining to medical informatics or medicine in general;
Organizational, economic, social, clinical impact, ethical and cost-benefit aspects of IT applications in health care.