
An introduction to explainable AI with Shapley values — SHAP latest documentation
We take a practical, hands-on approach, using the shap Python package to explain progressively more complex models. This is a living document, and it serves as an introduction to the shap Python package.
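The Shapley value behind all of this has a closed form: average a feature's marginal contribution over every coalition of the other features, weighted by how many orderings place that coalition first. A minimal sketch with stdlib only (the two-feature game and its payoffs are hypothetical, chosen just to illustrate the formula):

```python
from itertools import combinations
from math import factorial

def shapley_values(players, value):
    """Exact Shapley values by enumerating all coalitions.

    players: list of player ids; value: function mapping a frozenset
    of players to the coalition's worth v(S).
    """
    n = len(players)
    phi = {}
    for i in players:
        others = [p for p in players if p != i]
        total = 0.0
        for r in range(n):
            for subset in combinations(others, r):
                S = frozenset(subset)
                # |S|! (n - |S| - 1)! / n!  orderings place S before i
                weight = factorial(len(S)) * factorial(n - len(S) - 1) / factorial(n)
                total += weight * (value(S | {i}) - value(S))
        phi[i] = total
    return phi

# Toy "model" with an interaction term (hypothetical numbers).
def v(S):
    x1 = 1.0 if "x1" in S else 0.0
    x2 = 1.0 if "x2" in S else 0.0
    return 2.0 * x1 + 1.0 * x2 + 0.5 * x1 * x2

phi = shapley_values(["x1", "x2"], v)
# Efficiency axiom: the attributions sum to v(all players) - v(empty set).
```

Enumerating all 2^n coalitions is only feasible for tiny n; the various explainers in shap exist precisely to approximate this sum efficiently for real models.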
API Reference
This page contains the API reference for public objects and functions in SHAP. There are also example notebooks available that demonstrate how to use the API of each object/function.
decision plot
SHAP decision plots show how complex models arrive at their predictions (i.e., how models make decisions). This notebook illustrates decision plot features and use cases.
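The idea underneath a decision plot is simple: for each observation, accumulate its SHAP values one feature at a time, starting from the base value, so the last point of each trajectory is that observation's model output. A minimal sketch with hypothetical numbers (not shap's plotting code):

```python
import numpy as np

# Hypothetical SHAP values: 3 observations x 4 features, plus a
# shared base value (the expected model output).
base_value = 0.3
shap_vals = np.array([
    [ 0.20, -0.05,  0.10,  0.00],
    [-0.10,  0.15,  0.05, -0.20],
    [ 0.00,  0.30, -0.10,  0.05],
])

# Each row of `paths` is one observation's trajectory: start at the
# base value, then add one feature's contribution at a time. This is
# the cumulative line a decision plot draws for that observation.
paths = base_value + np.cumsum(shap_vals, axis=1)

# The last point of every path equals that observation's prediction.
outputs = base_value + shap_vals.sum(axis=1)
```

A real decision plot also sorts the features by importance so all trajectories share one axis ordering; the cumulative-sum core is the same.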
shap.Explainer
This is the primary explainer interface for the SHAP library. It takes any combination of a model and masker and returns a callable subclass object that implements the particular estimation algorithm that was chosen.
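The pattern is: construct with a model plus background data, then call the object on samples to get per-feature attributions. A toy illustration of that pattern for the one case with a closed form, a linear model with independent features, where the Shapley value of feature i is exactly coef[i] * (x[i] - E[x[i]]). The class and names below are illustrative, not shap's internals:

```python
import numpy as np

class ToyLinearExplainer:
    """Minimal sketch of the explainer calling convention: build with a
    model (here, coefficients) and background data, then call on samples.
    Implements only the exact linear-model case, for illustration."""

    def __init__(self, coef, intercept, background):
        self.coef = np.asarray(coef, dtype=float)
        self.intercept = float(intercept)
        # E[x] estimated from the background data (the "masker" role).
        self.mean = np.asarray(background, dtype=float).mean(axis=0)

    def __call__(self, X):
        X = np.asarray(X, dtype=float)
        # Shapley value of feature i for a linear model: coef_i * (x_i - E[x_i])
        return self.coef * (X - self.mean)

background = np.array([[0.0, 0.0], [2.0, 4.0]])   # so E[x] = [1, 2]
explainer = ToyLinearExplainer(coef=[3.0, -1.0], intercept=0.5, background=background)
attr = explainer(np.array([[2.0, 2.0]]))

# Completeness check: base value + attributions = model prediction.
base = 0.5 + np.dot([3.0, -1.0], [1.0, 2.0])   # f(E[x])
pred = 0.5 + np.dot([3.0, -1.0], [2.0, 2.0])   # f(x)
```

The real shap.Explainer dispatches to an estimation algorithm appropriate for the model type; the callable-returning-attributions shape of the interface is what this sketch mirrors.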
Image examples
These examples explain machine learning models applied to image data. They are all generated from Jupyter notebooks available on GitHub, and include image classification examples.
shap.DeepExplainer
Meant to approximate SHAP values for deep learning models. This is an enhanced version of the DeepLIFT algorithm (Deep SHAP) where, similar to Kernel SHAP, we approximate the conditional expectations of SHAP values using a selection of background samples.
Explaining quantitative measures of fairness
By using SHAP (a popular explainable AI tool) we can decompose measures of fairness and allocate responsibility for any observed disparity among each of the model’s input features.
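The decomposition works because SHAP values sum to each prediction minus a shared base value, so a gap in average model output between two groups equals the sum, over features, of the difference in group-mean SHAP values. A sketch with hypothetical numbers (labels and values invented for illustration):

```python
import numpy as np

# Hypothetical SHAP values (rows: individuals, cols: features)
# and a binary group membership label.
base_value = 0.2
shap_vals = np.array([
    [0.30, 0.10, -0.05],
    [0.25, 0.05,  0.00],
    [0.05, 0.20,  0.10],
    [0.00, 0.15,  0.05],
])
group = np.array([0, 0, 1, 1])

# Disparity: gap in mean model output between the two groups.
preds = base_value + shap_vals.sum(axis=1)
gap = preds[group == 0].mean() - preds[group == 1].mean()

# Per-feature share of that gap: the difference of group-mean SHAP
# values. These shares sum to the overall gap, allocating the
# observed disparity among the model's input features.
per_feature = (shap_vals[group == 0].mean(axis=0)
               - shap_vals[group == 1].mean(axis=0))
```

The shared base value cancels in the subtraction, which is why the feature-level differences alone account for the whole disparity.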
Release notes
Nov 11, 2025: This release incorporates many changes that were originally contributed by the SHAP community via @dsgibbons's Community Fork, which has now been merged into the main shap repository.
violin summary plot
The violin summary plot offers a compact representation of the distribution and variability of SHAP values for each feature. Individual violin plots are stacked in order of each feature's importance to the model output.
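The stacking order in summary-style plots comes from a simple statistic: features are ranked by mean absolute SHAP value across the dataset, and one distribution (violin) is drawn per feature in that order. A sketch of the ranking step with hypothetical values:

```python
import numpy as np

# Hypothetical SHAP value matrix: rows are samples, columns features.
shap_vals = np.array([
    [ 0.5, -0.1, 0.02],
    [-0.4,  0.2, 0.01],
    [ 0.6, -0.3, 0.03],
])
feature_names = ["age", "income", "noise"]

# Global importance of a feature: mean absolute SHAP value. The plot
# stacks one violin per feature, most important at the top.
importance = np.abs(shap_vals).mean(axis=0)
order = np.argsort(importance)[::-1]
ranked = [feature_names[i] for i in order]
```

Each violin then summarizes the column shap_vals[:, i] for its feature; only the ordering logic is shown here.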
waterfall plot
This notebook demonstrates (and so documents) how to use the shap.plots.waterfall function. It uses an XGBoost model trained on the classic UCI adult income dataset.
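A waterfall plot walks from the base value E[f(X)] to a single observation's model output, adding one feature's SHAP value at a time, largest magnitude first. A text-only sketch of that bookkeeping (feature names and values are hypothetical, not from the adult income model):

```python
import numpy as np

# Hypothetical single-observation explanation.
base_value = 0.1
names = ["capital_gain", "age", "education"]
values = np.array([0.40, -0.15, 0.05])

# Waterfall order: largest |SHAP value| first.
order = np.argsort(np.abs(values))[::-1]

running = base_value
rows = []
for i in order:
    running += values[i]
    rows.append((names[i], values[i], running))
# After the last step, `running` is the model's output for this
# observation: base value plus all SHAP values.
```

Each tuple in rows is one bar of the waterfall: the feature, its signed contribution, and the running total after applying it.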