Martin Michalowski , Szymon Wilk , Wojtek Michalowski , Malvika Rao , Marc Carrier
Title: Provision and evaluation of explanations within an automated planning-based approach to solving the multimorbidity problem
DOI: 10.1016/j.jbi.2024.104681
Journal: Journal of Biomedical Informatics, Volume 156, Article 104681
Publication date: 2024-07-01
Citations: 0
Abstract
The multimorbidity problem involves the identification and mitigation of adverse interactions that occur when multiple computer-interpretable guidelines are applied concurrently to develop a treatment plan for a patient diagnosed with multiple diseases. Solving this problem requires decision support approaches that are difficult for physicians to comprehend. As such, the rationale for the treatment plans generated by these approaches needs to be provided.
Objective:
To develop an explainability component for an automated planning-based approach to the multimorbidity problem, and to assess the fidelity and interpretability of generated explanations using a clinical case study.
Methods:
The explainability component leverages the task-network model for representing computer-interpretable guidelines. It generates post-hoc explanations composed of three aspects that answer why specific clinical actions are in a treatment plan, why specific revisions were applied, and how factors such as medication cost and patient adherence influence the selection of specific actions. The explainability component is implemented as part of MitPlan, where we revised our planning-based approach to support explainability. We developed an evaluation instrument based on the System Causability Scale and other vetted surveys to evaluate the fidelity and interpretability of its explanations using a two-dimensional comparison study design.
Results:
The explainability component was implemented for MitPlan and tested in the context of a clinical case study. The fidelity and interpretability of the generated explanations were assessed using a physician-focused evaluation study involving 21 participants from two different specialties and two levels of experience. Results show that explanations provided by the explainability component in MitPlan are of acceptable fidelity and interpretability, and that the clinical justification of the actions in a treatment plan is important to physicians.
Conclusion:
We created an explainability component that enriches an automated planning-based approach to solving the multimorbidity problem with meaningful explanations for the actions in a treatment plan. This component relies on the task-network model to represent computer-interpretable guidelines and as such can be ported to other approaches that also use the task-network model representation. Our evaluation study demonstrated that explanations supporting a physician's understanding of the clinical reasons for the actions in a treatment plan are useful and important.
About the journal
The Journal of Biomedical Informatics reflects a commitment to high-quality original research papers, reviews, and commentaries in the area of biomedical informatics methodology. Although we publish articles motivated by applications in the biomedical sciences (for example, clinical medicine, health care, population health, and translational bioinformatics), the journal emphasizes reports of new methodologies and techniques that have general applicability and that form the basis for the evolving science of biomedical informatics. Articles on medical devices; evaluations of implemented systems (including clinical trials of information technologies); or papers that provide insight into a biological process, a specific disease, or treatment options would generally be more suitable for publication in other venues. Papers on applications of signal processing and image analysis are often more suitable for biomedical engineering journals or other informatics journals, although we do publish papers that emphasize the information management and knowledge representation/modeling issues that arise in the storage and use of biological signals and images. System descriptions are welcome if they illustrate and substantiate the underlying methodology that is the principal focus of the report and an effort is made to address the generalizability and/or range of application of that methodology. Note also that, given the international nature of JBI, papers that deal with specific languages other than English, or with country-specific health systems or approaches, are acceptable for JBI only if they offer generalizable lessons that are relevant to the broad JBI readership, regardless of their country, language, culture, or health system.