This project aims to put your understanding of machine learning algorithms into practice on a real-world problem.
You can install all the required packages using conda or pip.
-
This will create a new environment from scratch with all listed modules:
conda env create -f environment.yml
If this command gets stuck at the "Solving environment" step, you may need to run: conda config --set channel_priority strict
-
If you already have an environment and only want to install the modules, use:
conda env update -f environment.yml --prune
The --prune option removes any packages that are not listed in the environment.yml file.
-
Or, using pip:
pip install -r requirements-pip.txt
Hand Gestures of the Colombian Sign Language
Download the dataset from https://drive.google.com/drive/u/2/folders/1o9wzwaJVfrbpCFJ0rIyed1QvARh0JAtn
Unzip the archive and put its contents into the data folder.
Alternatively, a download script is provided that handles everything, from downloading the data to extracting it into the /data folder.
However, you will need the gdown library installed (it is included in requirements.txt).
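Such a download step might look like the sketch below, assuming gdown and Python's standard library; the function names are illustrative, not the repository's actual script:

```python
import zipfile
from pathlib import Path


def download_dataset(folder_url: str, dest: str = "data") -> None:
    """Download the public Google Drive folder into dest (hypothetical helper)."""
    import gdown  # imported lazily so the extraction helper works without gdown

    Path(dest).mkdir(parents=True, exist_ok=True)
    # gdown.download_folder fetches every file in a public Drive folder.
    gdown.download_folder(url=folder_url, output=dest, quiet=False)


def extract_archives(dest: str = "data") -> None:
    """Unzip any archives that were downloaded into dest."""
    for zip_path in Path(dest).glob("*.zip"):
        with zipfile.ZipFile(zip_path) as zf:
            zf.extractall(dest)
```

Calling `download_dataset("https://drive.google.com/drive/u/2/folders/1o9wzwaJVfrbpCFJ0rIyed1QvARh0JAtn")` followed by `extract_archives()` would leave the extracted data under `data/`.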
We perform illumination processing to remove shadows and segment the hand from the background.
We detect each hand's orientation and rotate all images to a common direction.
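The two preprocessing steps above can be sketched as follows. This is a minimal illustration using SciPy for brevity; the kernel size, threshold, and moment-based alignment are assumptions, not the project's actual parameters:

```python
import numpy as np
from scipy import ndimage


def remove_shadows(gray: np.ndarray) -> np.ndarray:
    """Flatten uneven illumination by dividing out a smooth background estimate."""
    background = ndimage.gaussian_filter(gray.astype(float), sigma=15)
    return gray / np.maximum(background, 1e-6)


def segment_hand(norm: np.ndarray) -> np.ndarray:
    """Crude global threshold: pixels darker than the mean are treated as hand."""
    return (norm < norm.mean()).astype(np.uint8)


def orientation_angle(mask: np.ndarray) -> float:
    """Orientation (degrees) of the shape's principal axis via central moments."""
    ys, xs = np.nonzero(mask)
    xc, yc = xs.mean(), ys.mean()
    mu11 = ((xs - xc) * (ys - yc)).mean()
    mu20 = ((xs - xc) ** 2).mean()
    mu02 = ((ys - yc) ** 2).mean()
    return float(np.degrees(0.5 * np.arctan2(2 * mu11, mu20 - mu02)))


def align(mask: np.ndarray) -> np.ndarray:
    """Rotate the segmented image so every hand points in the same direction."""
    return ndimage.rotate(mask, orientation_angle(mask), reshape=False, order=0)
```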
We use:
- HOG features
- RI-HOG (rotation-invariant HOG) features
- LBP features
- SIFT features
- DAISY features
- Fourier descriptor features
- ORB features
- Hu moment features
- Convex hull features
- Elliptic Fourier descriptor features
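As one concrete example from the list, Hu moments can be computed with plain NumPy. This is a from-scratch sketch for illustration; the project may well use a library routine such as OpenCV's cv2.HuMoments instead:

```python
import numpy as np


def hu_moments(image: np.ndarray) -> np.ndarray:
    """Compute the seven Hu invariant moments of a grayscale image."""
    img = image.astype(float)
    y, x = np.mgrid[: img.shape[0], : img.shape[1]]
    m00 = img.sum()
    xc, yc = (x * img).sum() / m00, (y * img).sum() / m00

    def mu(p, q):  # central moment (translation invariant)
        return (((x - xc) ** p) * ((y - yc) ** q) * img).sum()

    def eta(p, q):  # normalized central moment (scale invariant)
        return mu(p, q) / m00 ** (1 + (p + q) / 2)

    n20, n02, n11 = eta(2, 0), eta(0, 2), eta(1, 1)
    n30, n03, n21, n12 = eta(3, 0), eta(0, 3), eta(2, 1), eta(1, 2)
    h1 = n20 + n02
    h2 = (n20 - n02) ** 2 + 4 * n11 ** 2
    h3 = (n30 - 3 * n12) ** 2 + (3 * n21 - n03) ** 2
    h4 = (n30 + n12) ** 2 + (n21 + n03) ** 2
    h5 = ((n30 - 3 * n12) * (n30 + n12)
          * ((n30 + n12) ** 2 - 3 * (n21 + n03) ** 2)
          + (3 * n21 - n03) * (n21 + n03)
          * (3 * (n30 + n12) ** 2 - (n21 + n03) ** 2))
    h6 = ((n20 - n02) * ((n30 + n12) ** 2 - (n21 + n03) ** 2)
          + 4 * n11 * (n30 + n12) * (n21 + n03))
    h7 = ((3 * n21 - n03) * (n30 + n12)
          * ((n30 + n12) ** 2 - 3 * (n21 + n03) ** 2)
          - (n30 - 3 * n12) * (n21 + n03)
          * (3 * (n30 + n12) ** 2 - (n21 + n03) ** 2))
    return np.array([h1, h2, h3, h4, h5, h6, h7])
```

Because the moments are invariant to translation, scale, and rotation, the same gesture produces similar features regardless of where the hand sits in the frame.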
We found DAISY features to give the best results. We then apply PCA to reduce the features' dimensionality.
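The dimensionality-reduction step might look like this with scikit-learn; the feature matrix here is random toy data and the 95% variance target is an assumption, not the project's tuned setting:

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical feature matrix: one flattened DAISY descriptor per image.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 1024))

# Keep enough principal components to explain 95% of the variance.
pca = PCA(n_components=0.95)
X_reduced = pca.fit_transform(X)
print(X_reduced.shape)  # far fewer columns than the original 1024
```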
We use a small neural network with 2 layers, ReLU activation functions, a categorical cross-entropy loss, and the Adam optimizer.
Accuracy: 79.6%