A Meta Package for Machine Learning Interpretation
Author: AStupidBear
Started in October 2019

Installation

using Pkg
pkg"add MLInterpret"

Try it without installation using Docker:

docker run -it --rm astupidbear/mli

Or build it from the Dockerfile:

python3 -c "$(curl $url)"


using MLInterpret
using PyCall
using PyCallUtils
using PandasLite
X = DataFrame(randn(Float32, 10000, 5))
y = (X[3] > 0) & (X[2] >= 0)
@from lightgbm imports LGBMRegressor
model = LGBMRegressor()
model.fit(X, y)

You can interpret any Python machine learning model that has a .predict method by calling

interpret(model, X, y)
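The only requirement on the model is duck-typed: it just needs a .predict method. A minimal sketch of such an object in plain Python (the class name and column logic here are illustrative, not part of the package):

```python
# Any object exposing a .predict method satisfies the duck-typed interface
# that interpret() expects. Here is a trivial illustrative "model".
class ColumnModel:
    """A toy model that predicts by returning one chosen column of X."""
    def __init__(self, column=0):
        self.column = column

    def predict(self, X):
        # X is a list of rows; return the chosen column's value per row
        return [row[self.column] for row in X]

X = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
model = ColumnModel(column=1)
print(model.predict(X))  # [2.0, 4.0, 6.0]
```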

If your model doesn't have a .predict method (like Julia models), you can still interpret its predictions by

ŷ = model.predict(X)  # or however your model produces predictions
interpret(X, ŷ)

This will generate a folder named mli in the current directory which contains

  • pdp.pdf: partial dependence plot
  • perturb_feaimpt.csv: feature importance calculated by perturbation
  • shap.pdf: SHAP values
  • shap2.pdf: SHAP interaction values
  • surrogate_tree-*.pdf: surrogate tree
  • actual.pdf: actual plot
  • actual2.pdf: actual interaction plot
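The idea behind perturbation feature importance (the perturb_feaimpt.csv output) can be sketched in a few lines of plain Python: shuffle one feature column at a time and measure how much the error grows. This is a minimal illustration of the technique, not the package's actual implementation:

```python
import random

def perturb_importance(predict, X, y, seed=0):
    """Illustrative perturbation importance: shuffle each column in turn
    and report the increase in mean squared error over the baseline."""
    rng = random.Random(seed)
    n, m = len(X), len(X[0])

    def mse(pred):
        return sum((p - t) ** 2 for p, t in zip(pred, y)) / n

    base = mse(predict(X))
    importances = []
    for j in range(m):
        col = [row[j] for row in X]
        rng.shuffle(col)  # break the link between feature j and the target
        Xp = [row[:j] + [v] + row[j + 1:] for row, v in zip(X, col)]
        importances.append(mse(predict(Xp)) - base)
    return importances

# Toy check: the target depends only on column 0, so shuffling column 0
# hurts, while shuffling column 1 changes nothing.
X = [[float(i), float(i % 3)] for i in range(100)]
y = [row[0] for row in X]
predict = lambda X: [row[0] for row in X]
imp = perturb_importance(predict, X, y)
```

A feature whose shuffling barely moves the error contributes little to the model's predictions; that ranking is what the CSV records.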

MLI with H2O Driverless AI

Start DAI

docker run -d \
    --pid=host \
    --init \
    -u `id -u`:`id -g` \
    -p 12345:12345 \
    -v /dev/shm:/dev/shm \

You can get a trial license of H2O Driverless AI from H2O, then open it in your browser, log in, and enter your license.


dai_interpret(X, y)

Open it, click MLI, and choose the topmost Interpreted Model.

MLI with Bayesian Rule List


using MLInterpret


sbrl_interpret(X, y)

A file named sbrl.txt will be created in your working directory.
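A Bayesian rule list is applied top-down: conditions are checked in order and the first match determines the prediction. A small sketch of that evaluation logic in plain Python (the rule format and probabilities here are illustrative, not the actual sbrl.txt syntax):

```python
# Illustrative rule-list predictor: rules are (condition, probability)
# pairs checked in order; the first matching rule wins, else the default.
def predict_rule_list(rules, default, row):
    for condition, prob in rules:
        if condition(row):
            return prob
    return default

rules = [
    (lambda r: r["x3"] > 0 and r["x2"] >= 0, 0.95),  # both conditions hold
    (lambda r: r["x3"] <= 0, 0.02),                  # first feature fails
]
print(predict_rule_list(rules, 0.10, {"x2": 1.0, "x3": 0.5}))  # 0.95
```

Because the rules are read in order and each is a plain if-then statement, the resulting file is human-readable in a way most model dumps are not.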
