{"title":"Clinically meaningful interpretability of an AI model for ECG classification","authors":"Vadim Gliner, Idan Levy, Kenta Tsutsui, Moshe Rav Acha, Jorge Schliamser, Assaf Schuster, Yael Yaniv","doi":"10.1038/s41746-025-01467-8","DOIUrl":null,"url":null,"abstract":"<p>Despite the high accuracy of AI-based automated analysis of 12-lead ECG images for classification of cardiac conditions, clinical integration of such tools is hindered by limited interpretability of model recommendations. We aim to demonstrate the feasibility of a generic, clinical resource interpretability tool for AI models analyzing digitized 12-lead ECG images. To this end, we utilized the sensitivity of the Jacobian matrix to compute the gradient of the classifier for each pixel and provide medical relevance interpretability. Our methodology was validated using a dataset consisting of 79,226 labeled scanned ECG images, 11,316 unlabeled and 1807 labeled images obtained via mobile camera in clinical settings. The tool provided interpretability for both morphological and arrhythmogenic conditions, highlighting features in terms understandable to physician. It also emphasized significant signal features indicating the absence of certain cardiac conditions. High correlation was achieved between our method of interpretability and gold standard interpretations of 3 electrophysiologists.</p>","PeriodicalId":19349,"journal":{"name":"NPJ Digital Medicine","volume":"6 1","pages":""},"PeriodicalIF":12.4000,"publicationDate":"2025-02-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"NPJ Digital Medicine","FirstCategoryId":"3","ListUrlMain":"https://doi.org/10.1038/s41746-025-01467-8","RegionNum":1,"RegionCategory":"医学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"HEALTH CARE SCIENCES & SERVICES","Score":null,"Total":0}
Citations: 0
Abstract
Despite the high accuracy of AI-based automated analysis of 12-lead ECG images for classification of cardiac conditions, clinical integration of such tools is hindered by the limited interpretability of model recommendations. We aim to demonstrate the feasibility of a generic, clinical-resource interpretability tool for AI models analyzing digitized 12-lead ECG images. To this end, we utilized the sensitivity of the Jacobian matrix to compute the gradient of the classifier for each pixel and provide medically relevant interpretability. Our methodology was validated using a dataset consisting of 79,226 labeled scanned ECG images, as well as 11,316 unlabeled and 1,807 labeled images obtained via mobile camera in clinical settings. The tool provided interpretability for both morphological and arrhythmogenic conditions, highlighting features in terms understandable to physicians. It also emphasized significant signal features indicating the absence of certain cardiac conditions. Interpretations produced by our method correlated highly with the gold-standard interpretations of three electrophysiologists.
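The abstract describes computing the gradient of the classifier output with respect to each input pixel (a row of the Jacobian) to obtain a saliency map. The sketch below illustrates that general gradient-based attribution idea only; it is not the authors' implementation. The `pixel_saliency` function, the stand-in CNN, and the placeholder ECG image tensor are all hypothetical, and a PyTorch model is assumed.

```python
# Minimal sketch of Jacobian/gradient-based pixel saliency for an image classifier.
# Assumes a PyTorch model; all names below are illustrative placeholders.
import torch
import torch.nn as nn


def pixel_saliency(model: nn.Module, image: torch.Tensor, class_idx: int) -> torch.Tensor:
    """Return |d score_class / d pixel| for each pixel of `image`.

    image: tensor of shape (1, C, H, W) in the model's expected input range.
    """
    model.eval()
    # Detach and clone so the image is a leaf tensor whose gradient we can read.
    image = image.detach().clone().requires_grad_(True)
    score = model(image)[0, class_idx]      # scalar logit for the target class
    score.backward()                        # populates image.grad with the Jacobian row
    # Collapse the channel dimension to get a single (H, W) heat map per image.
    return image.grad.abs().max(dim=1).values


if __name__ == "__main__":
    # Toy stand-in network; a trained ECG-image classifier would replace it.
    model = nn.Sequential(
        nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(),
        nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(8, 5),
    )
    ecg_image = torch.rand(1, 3, 224, 224)  # placeholder for a digitized 12-lead ECG image
    heatmap = pixel_saliency(model, ecg_image, class_idx=2)
    print(heatmap.shape)                    # torch.Size([1, 224, 224])
```

In such a scheme, pixels with large gradient magnitude are the ones whose perturbation would most change the class score, which is how a per-pixel map can be overlaid on the ECG image and related to clinically familiar waveform features.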
Journal description:
npj Digital Medicine is an online open-access journal that focuses on publishing peer-reviewed research in the field of digital medicine. The journal covers various aspects of digital medicine, including the application and implementation of digital and mobile technologies in clinical settings, virtual healthcare, and the use of artificial intelligence and informatics.
The primary goal of the journal is to support innovation and the advancement of healthcare through the integration of new digital and mobile technologies. When determining if a manuscript is suitable for publication, the journal considers four important criteria: novelty, clinical relevance, scientific rigor, and digital innovation.