Encode-Categorical-Features
Handling categorical (qualitative) variables is an important step in data preprocessing. Most machine learning algorithms cannot work with categorical variables directly, so they must be converted to numerical values first. Model performance depends on how the categorical variables are encoded: different encoding techniques can produce noticeably different results from the same model.
Categorical variables can be divided into two types:
- Nominal (no inherent order)
- Ordinal (some meaningful order)
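As a minimal sketch of this distinction (the `city` and `size` columns are made-up illustrations), a nominal feature should not be given an artificial ranking, while an ordinal feature should keep its natural one:

```python
import pandas as pd
from sklearn.preprocessing import OneHotEncoder, OrdinalEncoder

df = pd.DataFrame({
    "city": ["Delhi", "Mumbai", "Delhi", "Chennai"],  # nominal: no order
    "size": ["small", "large", "medium", "small"],    # ordinal: small < medium < large
})

# Nominal: one-hot encoding avoids implying a false ranking between cities.
# (sparse_output requires scikit-learn >= 1.2; older versions use sparse=False)
ohe = OneHotEncoder(sparse_output=False)
print(ohe.fit_transform(df[["city"]]))

# Ordinal: pass the explicit category order so the integer codes respect it.
oe = OrdinalEncoder(categories=[["small", "medium", "large"]])
print(oe.fit_transform(df[["size"]]))  # small -> 0, medium -> 1, large -> 2
```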
There are many ways to encode categorical variables; a few of them are sketched in the example after this list:
- One Hot Encoding
- Label Encoding
- Ordinal Encoding
- Frequency or Count Encoding
- Binary Encoding
- Base-N Encoding
- Helmert Encoding
- Mean Encoding or Target Encoding
- Weight of Evidence Encoding
- Sum Encoder (Deviation Encoding or Effect Encoding)
- Leave One Out Encoding
- CatBoost Encoding
- James-Stein Encoding
- M-estimator Encoding
- Hashing Encoding
- Backward Difference Encoding
- Polynomial Encoding
- MultiLabelBinarizer
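As a rough illustration of a few of these techniques (the toy `color` column and binary target below are invented for the example), each method is available as an encoder in the category-encoders package installed in the next section:

```python
import pandas as pd
import category_encoders as ce

df = pd.DataFrame({"color": ["red", "blue", "red", "green", "blue", "red"]})
y = pd.Series([1, 0, 1, 0, 1, 1])  # binary target, needed by the supervised encoders

# Binary encoding: each category's ordinal code is rewritten in base 2.
print(ce.BinaryEncoder(cols=["color"]).fit_transform(df))

# Count (frequency) encoding: each category is replaced by its occurrence count.
print(ce.CountEncoder(cols=["color"]).fit_transform(df))

# Target (mean) encoding: each category is replaced by a smoothed mean of y.
print(ce.TargetEncoder(cols=["color"]).fit_transform(df, y))

# Leave-one-out encoding: like target encoding, but each row's own y is excluded.
print(ce.LeaveOneOutEncoder(cols=["color"]).fit_transform(df, y))
```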
The following libraries are used to perform the encoding:
```
!pip install scikit-learn
!pip install category-encoders
```
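Because the encoders follow scikit-learn's transformer API, they can be composed directly with a downstream model. The sketch below (with made-up data and an arbitrary choice of LogisticRegression) shows a target encoder inside a pipeline, which also keeps the encoding fit on training data only:

```python
import pandas as pd
import category_encoders as ce
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

X = pd.DataFrame({
    "color": ["red", "blue", "green", "red", "blue", "green"],
    "shape": ["circle", "square", "circle", "square", "circle", "square"],
})
y = [1, 0, 0, 1, 1, 0]

# The encoder is fit inside the pipeline, so new data is transformed with
# statistics learned from the training data only, reducing target leakage.
pipe = make_pipeline(ce.TargetEncoder(cols=["color", "shape"]), LogisticRegression())
pipe.fit(X, y)
print(pipe.predict(X))
```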
The cheat sheet below is a guiding tool for selecting an encoding method.
References
- Category Encoders documentation: https://contrib.scikit-learn.org/category_encoders/
- https://medium.com/swlh/an-introduction-to-categorical-feature-encoding-in-machine-learning-cd0ca08c8232
- https://towardsdatascience.com/benchmarking-categorical-encoders-9c322bd77ee8
- https://towardsdatascience.com/all-about-categorical-variable-encoding-305f3361fd02
- https://towardsdatascience.com/an-easier-way-to-encode-categorical-features-d840ff6b3900