Machine Learning Interpretability — Shapley Values with PySpark
Interpreting the predictions of Isolation Forest, and beyond
The problem: how to interpret Isolation Forest’s predictions
More specifically, how to tell which features contribute most to the predictions. Since Is...
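Before reaching for a library, it helps to see what a Shapley value is computationally: the average marginal contribution of a feature over all coalitions of the other features. The sketch below (my own illustration, not code from this article) computes exact Shapley values for one Isolation Forest prediction, using a simple value function that replaces "absent" features with a baseline (the column means). The data, the baseline choice, and the helper names `value` and `shapley` are all assumptions for demonstration; real workloads would use an approximation such as SHAP rather than this exponential enumeration.

```python
from itertools import combinations
from math import factorial

import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
X[0] = [4.0, 4.0, 0.0]  # make one point anomalous in features 0 and 1 only

model = IsolationForest(random_state=0).fit(X)


def value(model, x, baseline, subset):
    """Model score for x with features NOT in `subset` replaced by the baseline.

    This baseline-replacement value function is one common (assumed) choice;
    it stands in for the conditional expectation used in the SHAP literature.
    """
    z = baseline.copy()
    for j in subset:
        z[j] = x[j]
    return model.decision_function(z.reshape(1, -1))[0]


def shapley(model, x, baseline):
    """Exact Shapley values: weighted marginal contributions over all coalitions."""
    n = len(x)
    phi = np.zeros(n)
    for j in range(n):
        others = [k for k in range(n) if k != j]
        for r in range(n):
            for S in combinations(others, r):
                # Shapley kernel weight: |S|! (n - |S| - 1)! / n!
                w = factorial(r) * factorial(n - r - 1) / factorial(n)
                phi[j] += w * (
                    value(model, x, baseline, S + (j,))
                    - value(model, x, baseline, S)
                )
    return phi


baseline = X.mean(axis=0)
phi = shapley(model, X[0], baseline)
```

By construction the values satisfy the efficiency property: `phi.sum()` equals the model's score for the point minus its score at the baseline, so the attributions exactly decompose the prediction. For the synthetic anomaly above, the largest attributions should land on features 0 and 1, the ones driving the outlier score.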
tryexceptfinally.hashnode.dev · 23 min read