A novel machine learning approach for diagnosing diabetes with a self-explainable interface

Gangani Dharmarathne, Thilini N. Jayasinghe, Madhusha Bogahawaththa, D.P.P. Meddage, Upaka Rathnayake
Journal: Healthcare analytics (New York, N.Y.)
DOI: 10.1016/j.health.2024.100301
Published: 2024-01-17 (Journal Article)
Full text: https://www.sciencedirect.com/science/article/pii/S2772442524000030
Citations: 0

Abstract

This study introduces the first-ever self-explanatory interface for diagnosing diabetes patients using machine learning. We propose four classification models (Decision Tree (DT), K-nearest Neighbor (KNN), Support Vector Classification (SVC), and Extreme Gradient Boosting (XGB)) based on the publicly available diabetes dataset. To elucidate the inner workings of these models, we employed the machine learning interpretation method known as Shapley Additive Explanations (SHAP). All the models exhibited commendable accuracy in diagnosing patients with diabetes, with the XGB model showing a slight edge over the others. Utilising SHAP, we delved into the XGB model, providing in-depth insights into the reasoning behind its predictions at a granular level. Subsequently, we integrated the XGB model and SHAP’s local explanations into an interface to predict diabetes in patients. This interface serves a critical role as it diagnoses patients and offers transparent explanations for the decisions made, providing users with a heightened awareness of their current health conditions. Given the high-stakes nature of the medical field, this developed interface can be further enhanced by including more extensive clinical data, ultimately aiding medical professionals in their decision-making processes.
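The workflow the abstract describes — train several classifiers, select the most accurate, then attach per-prediction Shapley explanations — can be sketched as follows. This is a minimal illustration under stated substitutions, not the authors' code: a synthetic dataset stands in for the public diabetes dataset, scikit-learn's `GradientBoostingClassifier` stands in for XGBoost, and a brute-force exact Shapley computation stands in for the SHAP library.

```python
from itertools import combinations
from math import factorial

import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the 8-feature Pima-style diabetes data (assumption).
X, y = make_classification(n_samples=500, n_features=8, n_informative=5,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Four classifiers analogous to the DT / KNN / SVC / XGB comparison;
# GradientBoostingClassifier is a stand-in for XGBoost here.
models = {
    "DT": DecisionTreeClassifier(max_depth=4, random_state=0),
    "KNN": KNeighborsClassifier(n_neighbors=7),
    "SVC": SVC(probability=True, random_state=0),
    "GBM": GradientBoostingClassifier(random_state=0),
}
scores = {name: m.fit(X_tr, y_tr).score(X_te, y_te) for name, m in models.items()}
best = models[max(scores, key=scores.get)]

def shapley_values(model, x, background):
    """Exact Shapley values for one prediction: features absent from a
    coalition are replaced by the background (training) mean."""
    n = x.shape[0]
    base = background.mean(axis=0)

    def f(mask):
        z = base.copy()
        z[list(mask)] = x[list(mask)]
        return model.predict_proba(z.reshape(1, -1))[0, 1]

    phi = np.zeros(n)
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for k in range(n):
            for S in combinations(others, k):
                # Standard Shapley weight |S|! (n - |S| - 1)! / n!
                w = factorial(k) * factorial(n - k - 1) / factorial(n)
                phi[i] += w * (f(set(S) | {i}) - f(set(S)))
    return phi

# Local explanation for one hypothetical patient, as the interface would show.
phi = shapley_values(best, X_te[0], X_tr)
```

The brute-force computation above is exponential in the number of features and only practical for small feature sets like this one; in practice the SHAP library's tree explainer computes equivalent values efficiently for tree-based models, which is presumably why the authors pair it with the XGB model.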

Journal information

Healthcare analytics (New York, N.Y.)
Subject areas: Applied Mathematics, Modelling and Simulation, Nursing and Health Professions (General)
CiteScore: 4.40
Self-citation rate: 0.00%
Review time: 79 days