
categorical-feature-importances-without-one-hot-encoding-dummies

Computing feature importances for categorical variables by converting them into dummy variables (one-hot encoding) can give skewed or hard-to-interpret results. Here I present a method to get around this problem using H2O. For this demonstration, I used the bank marketing data set given here: https://archive.ics.uci.edu/ml/datasets/Bank+Marketing

I have also uploaded a detailed analysis of the bank marketing data in which I produced feature importances using dummy variables. Please find it here: https://github.com/amalik2205/predicting-term-deposit-subscription-tendencies-of-clients