Bridging Data and Clinical Insight: Explainable AI for ICU Mortality Risk Prediction

Ali H. Hassan, Riza bin Sulaiman, Mansoor Abdulhak, Hasan Kahtan

Research output: Contribution to journal › Article › peer-review

Abstract

Despite advances in machine learning within healthcare, most predictive models for ICU mortality lack interpretability, a crucial factor for clinical application. The complexity inherent in high-dimensional healthcare data and models poses a significant barrier to achieving results that are both accurate and transparent, qualities that are vital for fostering trust and enabling practical use in clinical settings. This study develops an interpretable machine learning model for intensive care unit (ICU) mortality prediction using explainable AI (XAI) methods. The aim was to build a predictive model that assesses mortality risk using the WiDS Datathon 2020 dataset, which includes clinical and physiological data from over 91,000 ICU admissions. Model development involved extensive data preprocessing, including data cleaning and handling of missing values, followed by training six different machine learning algorithms. The Random Forest model proved the most effective, achieving the highest accuracy and the strongest robustness to overfitting, making it well suited to clinical decision-making. The importance of this work lies in its potential to enhance patient care by giving healthcare professionals an interpretable tool for predicting mortality risk, thereby aiding critical decision-making in high-acuity environments. The results also underscore the importance of applying explainable AI methods so that AI models remain transparent and understandable to end users, which is crucial in healthcare settings.
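
The abstract describes a pipeline of preprocessing, model training, and post-hoc explanation. The snippet below is a minimal sketch of such a workflow, not the authors' code: it imputes missing values, trains a Random Forest on tabular ICU data, and uses scikit-learn's permutation importance as one possible XAI step. The file name wids_2020.csv, the target column hospital_death, and all hyperparameters are illustrative assumptions.

```python
# Minimal sketch of the described workflow: preprocessing, Random Forest
# training, and a post-hoc explanation step. File name, column names, and
# hyperparameters are illustrative assumptions, not the authors' settings.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.impute import SimpleImputer
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline

# Load a local tabular copy of the ICU data (hypothetical file name).
df = pd.read_csv("wids_2020.csv")
y = df["hospital_death"]                                   # assumed binary mortality label
X = df.drop(columns=["hospital_death"]).select_dtypes("number")

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

# Handle missing values, then fit the Random Forest classifier.
model = Pipeline([
    ("impute", SimpleImputer(strategy="median")),
    ("rf", RandomForestClassifier(n_estimators=200, random_state=42)),
])
model.fit(X_train, y_train)
print("Held-out accuracy:", model.score(X_test, y_test))

# One possible explanation step: permutation importance on the held-out
# split indicates which clinical features drive the mortality predictions.
result = permutation_importance(model, X_test, y_test, n_repeats=5, random_state=42)
for i in result.importances_mean.argsort()[::-1][:10]:
    print(f"{X.columns[i]}: {result.importances_mean[i]:.4f}")
```

Permutation importance is shown here only because it is model-agnostic and built into scikit-learn; the paper's actual choice of XAI method is not specified in the abstract.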

Original language: English
Pages (from-to): 743-750
Number of pages: 8
Journal: International Journal of Advanced Computer Science and Applications
Volume: 16
Issue number: 2
DOIs
Publication status: Published - 2025

Keywords

  • Explainable AI
  • healthcare
  • machine learning
  • predictive model
