Machine Learning Interpretability — Shapley Values with PySpark
Mar 20, 2021 · 23 min read

Interpreting Isolation Forest’s predictions — and not only

The problem: how to interpret Isolation Forest’s predictions

More specifically, how to tell which features contribute most to the predictions. Since Is...
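The excerpt above shows no code, so here is a minimal single-machine sketch of the underlying idea, assuming scikit-learn's IsolationForest and the shap library; the toy data, feature names, and parameters are illustrative assumptions, and the article's distributed PySpark step is not shown.

```python
# A minimal sketch, not the article's implementation: Shapley values for an
# Isolation Forest's anomaly scores, computed locally with the shap package.
import numpy as np
import shap
from sklearn.ensemble import IsolationForest

rng = np.random.RandomState(0)
X = rng.normal(size=(500, 4))   # 500 mostly "normal" points, 4 toy features
X[-5:] += 6                     # shift the last 5 rows to create clear outliers

model = IsolationForest(random_state=0).fit(X)

# TreeExplainer computes exact Shapley values for tree ensembles; for
# IsolationForest the explained quantity is the anomaly score, so each
# value is one feature's additive contribution to that score.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)   # shape: (n_samples, n_features)

# Per-feature contributions for one of the injected outliers;
# the feature names are hypothetical.
feature_names = ["f0", "f1", "f2", "f3"]
print(dict(zip(feature_names, shap_values[-1].round(3))))
```

To scale this out, one common pattern (not necessarily the one the article uses) is to ship the fitted model and explainer to Spark executors and compute the Shapley values batch-wise with a pandas UDF or DataFrame.mapInPandas.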



