using Pkg
pkg"add MLInterpret"
Try it without installation using Docker
docker run -it --rm astupidbear/mli
Or build it from the Dockerfile
url=https://raw.githubusercontent.com/AStupidBear/MLInterpret.jl/master/Dockerfile.py
python3 -c "$(curl $url)"
using MLInterpret
using PyCall
using PyCallUtils
using PandasLite
X = DataFrame(randn(Float32, 10000, 5))
y = (X[3] > 0) & (X[2] >= 0)
@from lightgbm imports LGBMRegressor
model = LGBMRegressor()
model.fit(X, y)
You can interpret any Python machine learning model that has a .predict property by calling
interpret(model, X, y)
If your model doesn't have a .predict property (like Julia models), you can still interpret its predictions by
ŷ = model.predict(X)
interpret(X, ŷ)
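For example, a hand-rolled Julia linear model can be interpreted through its predictions alone. This is only a sketch: the weights and predictor below are made up for illustration, and it assumes interpret accepts a plain Julia vector of predictions.

using MLInterpret
using PandasLite

# hypothetical pure-Julia "model": a fixed linear predictor over 5 features
W = randn(Float32, 5)
Xmat = randn(Float32, 10000, 5)
X = DataFrame(Xmat)    # same DataFrame construction as above
ŷ = Xmat * W           # predictions computed in Julia, no .predict property involved
interpret(X, ŷ)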
This will generate a folder mli in the current directory which contains:
pdp.pdf: partial dependency plot
perturb_feaimpt.csv: feature importance calculated by perturbation
shap.pdf: SHAP value
shap2.pdf: SHAP interaction value
surrogate_tree-*.pdf: surrogate tree
actual.pdf: actual plot
actual2.pdf: actual interaction plot
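For example, the perturbation importances can be loaded back into a DataFrame for further inspection. This sketch assumes PandasLite re-exports pandas' read_csv and that the mli folder sits in the current directory.

using PandasLite
# load the perturbation-based feature importances written by interpret
feaimpt = read_csv("mli/perturb_feaimpt.csv")
println(feaimpt)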
MLI with H2O Driverless AI
docker run -d \
--pid=host \
--init \
-u `id -u`:`id -g` \
-p 12345:12345 \
-v /dev/shm:/dev/shm \
astupidbear/dai:1.7.0
You can get a trial license of H2O Driverless AI from H2O, then open http://127.0.0.1:12345/, log in, and enter your license.
dai_interpret(X, y)
Open http://127.0.0.1:12345/, click MLI, and choose the topmost Interpreted Model.
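Putting this section together, a minimal sketch: it assumes the Driverless AI container started above is reachable on 127.0.0.1:12345 and reuses the X and y built in the earlier example.

using MLInterpret
using PandasLite
X = DataFrame(randn(Float32, 10000, 5))
y = (X[3] > 0) & (X[2] >= 0)
dai_interpret(X, y)   # submit the data to the local Driverless AI instance
# then open http://127.0.0.1:12345/, click MLI and inspect the interpreted model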
MLI with Bayesian Rule List
using MLInterpret
MLInterpret.install_brl()
sbrl_interpret(X, y)
A file named sbrl.txt will be created in your working directory.
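For example, you can print the learned rule list from Julia; a minimal sketch assuming the default sbrl.txt in the current working directory:

# show the Bayesian rule list written by sbrl_interpret
println(read("sbrl.txt", String))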