Data Skeptic
Oct 21, 2016
[MINI] Calculating Feature Importance
13 min

For machine learning models created with the random forest algorithm, there is no obvious diagnostic to tell you which features matter most to the model's output. Some straightforward but useful techniques exist, revolving around removing or permuting a feature and measuring the resulting decrease in accuracy, or around the decrease in Gini impurity at the splits each feature contributes. We broadly discuss these techniques in this episode.
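
The episode does not prescribe any particular library, but a minimal sketch of both ideas, assuming scikit-learn and its built-in breast cancer dataset, might look like this: impurity-based importance comes from the fitted forest's feature_importances_ attribute, and the accuracy-drop idea is approximated here by permutation importance on held-out data.

```python
# A minimal sketch, assuming scikit-learn; dataset and hyperparameters
# are illustrative choices, not anything specified in the episode.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

# Load a small tabular dataset and fit a random forest.
X, y = load_breast_cancer(return_X_y=True, as_frame=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
forest = RandomForestClassifier(n_estimators=200, random_state=0)
forest.fit(X_train, y_train)

# Technique 1: impurity-based importance -- the total decrease in Gini
# impurity from splits on each feature, averaged over all trees.
gini_importance = dict(zip(X.columns, forest.feature_importances_))

# Technique 2: permutation importance -- shuffle one feature at a time
# on held-out data and record how much the accuracy drops.
perm = permutation_importance(forest, X_test, y_test,
                              n_repeats=10, random_state=0)
perm_importance = dict(zip(X.columns, perm.importances_mean))

# Rank features by the permutation-based drop in accuracy.
for name, drop in sorted(perm_importance.items(), key=lambda kv: -kv[1])[:5]:
    print(f"{name}: {drop:.3f}")
```

Both rankings usually agree on the strongest features, but the permutation approach is measured on data the model has not seen, so it is less prone to favoring high-cardinality features than the impurity-based score.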
