Fig. 6
From: Predicting diabetic retinopathy based on routine laboratory tests by machine learning algorithms

SHAP-explained global feature importance for the XGBoost model. A Bar chart of the mean absolute SHAP value for each predictor. The inset PieDonut shows categorized features (outer ring) and single-variable contributions (inner ring). B SHAP summary plot. Each dot's color represents the magnitude of the feature value, with red denoting higher values and blue indicating lower values; its horizontal position corresponds to the SHAP value, reflecting the direction and strength of the feature's influence on the model's output. SHAP, SHapley Additive exPlanations; abbreviations as in Tables 1–5