TY - JOUR
T1 - A novel machine learning approach for diagnosing diabetes with a self-explainable interface
AU - Dharmarathne, Gangani
AU - Jayasinghe, Thilini N.
AU - Bogahawaththa, Madhusha
AU - Meddage, D. P. P.
AU - Rathnayake, Upaka
N1 - Publisher Copyright:
© 2024 The Authors
PY - 2024/6
Y1 - 2024/6
N2 - This study introduces the first-ever self-explanatory interface for diagnosing diabetes patients using machine learning. We propose four classification models (Decision Tree (DT), K-nearest Neighbor (KNN), Support Vector Classification (SVC), and Extreme Gradient Boosting (XGB)) based on the publicly available diabetes dataset. To elucidate the inner workings of these models, we employed the machine learning interpretation method known as Shapley Additive Explanations (SHAP). All the models exhibited commendable accuracy in diagnosing patients with diabetes, with the XGB model showing a slight edge over the others. Utilising SHAP, we delved into the XGB model, providing in-depth insights into the reasoning behind its predictions at a granular level. Subsequently, we integrated the XGB model and SHAP's local explanations into an interface to predict diabetes in patients. This interface serves a critical role as it diagnoses patients and offers transparent explanations for the decisions made, providing users with a heightened awareness of their current health conditions. Given the high-stakes nature of the medical field, this developed interface can be further enhanced by including more extensive clinical data, ultimately aiding medical professionals in their decision-making processes.
AB - This study introduces the first-ever self-explanatory interface for diagnosing diabetes patients using machine learning. We propose four classification models (Decision Tree (DT), K-nearest Neighbor (KNN), Support Vector Classification (SVC), and Extreme Gradient Boosting (XGB)) based on the publicly available diabetes dataset. To elucidate the inner workings of these models, we employed the machine learning interpretation method known as Shapley Additive Explanations (SHAP). All the models exhibited commendable accuracy in diagnosing patients with diabetes, with the XGB model showing a slight edge over the others. Utilising SHAP, we delved into the XGB model, providing in-depth insights into the reasoning behind its predictions at a granular level. Subsequently, we integrated the XGB model and SHAP's local explanations into an interface to predict diabetes in patients. This interface serves a critical role as it diagnoses patients and offers transparent explanations for the decisions made, providing users with a heightened awareness of their current health conditions. Given the high-stakes nature of the medical field, this developed interface can be further enhanced by including more extensive clinical data, ultimately aiding medical professionals in their decision-making processes.
KW - Diabetes
KW - Diagnosis
KW - Healthcare
KW - Machine learning
KW - Predictive analytics
KW - Self-explainable interface
UR - http://www.scopus.com/inward/record.url?scp=85183557031&partnerID=8YFLogxK
U2 - 10.1016/j.health.2024.100301
DO - 10.1016/j.health.2024.100301
M3 - Article
AN - SCOPUS:85183557031
SN - 2772-4425
VL - 5
JO - Healthcare Analytics
JF - Healthcare Analytics
M1 - 100301
ER -