Comparison of Explainable Machine-Learning Models for Decision-Making in Health Intensive Care Using SHapley Additive exPlanations


Context: Intensive Care Units (ICUs) treat patients in serious condition, demanding qualified professional assistance, modern equipment for full-time patient monitoring, information systems for data collection, medications, and other supplies. Problem: Patients can recover or die, and sepsis is one of the main causes of death. Predicting the likelihood of death in sepsis patients can help coordinate medical efforts, as incorrect initial decisions can increase the mortality rate. However, it is important that machine learning prediction models be explainable to medical staff, so that decisions can be made conscientiously. Solution: This study aimed to identify which machine learning algorithms are best for predicting death by sepsis, using SHapley Additive exPlanations (SHAP) to provide explainable models. Theoretical Approach: The paper draws on information processing theories based on machine learning and explainable artificial intelligence models. Method: 196 observations of real data were used to create machine learning models. Data characteristics were analyzed, followed by missing data imputation, preprocessing, feature selection, and training of predictive models with SVM, Random Forest, Logistic Regression, KNN, and Decision Tree. Two metrics were used to validate the models: accuracy and weighted F1-score. For each trained model, SHAP values were computed to generate an explainable model listing the factors that most contributed to death predictions. Summary of Results: The study showed that the best-performing algorithms were SVM and Logistic Regression (80% on both metrics). The results also showed that the models converged in their interpretations based on the SHAP values. Contributions and Impact on the IS Area: The analysis of models generated with different machine learning algorithms allows for explainable and transparent analyses by health specialists in decision-making contexts.
Keywords: Explainable Machine Learning, SHAP, Health Intensive Care
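As an illustration only (not the paper's actual code, data, or feature set), the pipeline described in the abstract — imputation, preprocessing, model training, validation with accuracy and weighted F1, and SHAP attribution — might be sketched with scikit-learn on synthetic placeholder data. For a logistic regression with (assumed) independent standardized features, the SHAP attributions have a closed form, phi_j = w_j * (x_j - mean(x_j)), so no external SHAP library is needed in this sketch:

```python
# Hedged sketch of the abstract's pipeline on SYNTHETIC data (the paper's
# 196-patient ICU dataset is not public): impute -> scale -> train ->
# evaluate with accuracy and weighted F1 -> SHAP-style attributions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.impute import SimpleImputer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, f1_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Placeholder data: 196 observations, as in the study, with simulated gaps.
X, y = make_classification(n_samples=196, n_features=8, random_state=0)
rng = np.random.default_rng(0)
X[rng.random(X.shape) < 0.05] = np.nan  # simulate missing clinical values

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

model = Pipeline([
    ("impute", SimpleImputer(strategy="median")),  # missing-data imputation
    ("scale", StandardScaler()),                   # preprocessing
    ("clf", LogisticRegression(max_iter=1000)),    # one of the five models
]).fit(X_tr, y_tr)

pred = model.predict(X_te)
acc = accuracy_score(y_te, pred)
f1w = f1_score(y_te, pred, average="weighted")

# For a linear model on independent features, SHAP values reduce to
# phi_j = w_j * (x_j - mean(x_j)) in the transformed feature space.
Z = model[:-1].transform(X_te)          # imputed + scaled test features
w = model[-1].coef_.ravel()
shap_values = w * (Z - Z.mean(axis=0))  # one attribution per feature per patient
print(f"accuracy={acc:.2f} f1_weighted={f1w:.2f}")
```

Mean absolute SHAP values per column would then rank the features that contribute most to the death predictions, which is how the study compares model interpretations.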


How to Cite
VIDAL, Igor Pereira; PEREIRA, Marluce Rodrigues; FREIRE, André Pimenta; RESENDE, Uanderson; MAZIERO, Erick Galani. Comparison of Explainable Machine-Learning Models for Decision-Making in Health Intensive Care Using SHapley Additive exPlanations. In: SIMPÓSIO BRASILEIRO DE SISTEMAS DE INFORMAÇÃO (SBSI), 19., 2023, Maceió/AL. Anais [...]. Porto Alegre: Sociedade Brasileira de Computação, 2023.
