VirtualSign-classifier

Working repository for hand configuration classifier, part of the VirtualSign project.


Dataset format:

Datasets consist of 18 columns. The first column holds the hand-configuration label; the remaining 17 correspond to the data glove's sensor inputs. Each hand configuration is measured 10 times.
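
A minimal loading sketch, assuming a headerless CSV file; the file path and the configuration/sensor_* column names are placeholders rather than names the project actually uses:

```python
import pandas as pd

# Hypothetical column names: one label column followed by 17 sensor columns.
SENSOR_COLUMNS = [f"sensor_{i}" for i in range(17)]

def load_dataset(path):
    """Load one dataset file into a feature matrix X and a label vector y."""
    df = pd.read_csv(path, header=None, names=["configuration"] + SENSOR_COLUMNS)
    X = df[SENSOR_COLUMNS].to_numpy()
    y = df["configuration"].to_numpy()
    return X, y
```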

TODO list:

  • review architecture (add multiple filters)
  • Visualise training with TensorBoard
  • create inference script
  • transfer model building function to utils.py
  • k-fold cross-validation test (see the sketch below the list)
  • create a model with an intermediate number of parameters, between 25k and 275k
  • rounding script for datasets
  • name columns appropriately
  • implement exploration/visualization scripts
  • batch evaluation script
  • use argparse to get inputs in all scripts (sketch below)
  • model naming scheme (model name, val loss, epoch; sketch below)
  • change script to work with directories or files
  • Prompt the user to calibrate and then turn off auto-calibration
  • Insert new dataset
  • Batch normalization
  • Test L2 regularization (sketch below)
  • Check decision trees and gradient boosting
  • Add evaluation at the end of the training script
  • split the data into train, train-val, val and test sets
  • Try L1 regularization (though the sparsity of the features is not certain)
  • decide on a stopping strategy (try early stopping; sketch below)
  • explore how linearly separable our data is and try SVMs (sketch below)
  • consider separating the knuckle inputs from the finger inputs
  • integrate in Virtual Sign
  • Check unsupervised pre-training
  • hyper-parameter search script (check hyperas)
  • upload graph of the models
  • add the SVM classifier to Virtual Sign
  • use the new model of gloves: add pitch, roll and yaw inputs, adapt the calibration process, and output 57 instead of 42 classes
  • capture new dataset
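
For the k-fold cross-validation item, a sketch using scikit-learn's KFold; build_model is a hypothetical stand-in for the model-building function this list proposes moving to utils.py:

```python
import numpy as np
from sklearn.model_selection import KFold

def cross_validate(X, y, build_model, n_splits=5, epochs=50):
    """Train a fresh model per fold and return the mean validation accuracy."""
    scores = []
    for train_idx, val_idx in KFold(n_splits=n_splits, shuffle=True).split(X):
        model = build_model()  # assumed to return a freshly compiled Keras model
        model.fit(X[train_idx], y[train_idx], epochs=epochs, verbose=0)
        # evaluate() returns [loss, accuracy] when compiled with metrics=["accuracy"]
        _, accuracy = model.evaluate(X[val_idx], y[val_idx], verbose=0)
        scores.append(accuracy)
    return float(np.mean(scores))
```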
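
For the argparse item, a sketch of a parser the scripts could share; the flags are illustrative, not the project's actual options:

```python
import argparse

def parse_args():
    """Common command-line interface for the training/evaluation scripts."""
    parser = argparse.ArgumentParser(description="Hand-configuration classifier")
    parser.add_argument("data", help="dataset file or directory")
    parser.add_argument("--epochs", type=int, default=100)
    parser.add_argument("--batch-size", type=int, default=32)
    parser.add_argument("--model-dir", default="models", help="where checkpoints are written")
    return parser.parse_args()
```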
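
For the model naming scheme, one option is the filename template of Keras's ModelCheckpoint callback, which can interpolate the epoch and logged metrics; "baseline" is a placeholder model name:

```python
from tensorflow.keras.callbacks import ModelCheckpoint

# Produces file names such as models/baseline-0.1234-042.h5
checkpoint = ModelCheckpoint(
    filepath="models/baseline-{val_loss:.4f}-{epoch:03d}.h5",
    monitor="val_loss",
    save_best_only=True,
)
```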
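
For the L2 and L1 regularization items, a sketch of how a weight penalty attaches to a Keras layer; the layer width and penalty strength are arbitrary starting points:

```python
from tensorflow.keras import layers, regularizers

# Swap regularizers.l2 for regularizers.l1 to test the L1 variant.
hidden = layers.Dense(
    64,
    activation="relu",
    kernel_regularizer=regularizers.l2(1e-4),  # strength 1e-4 is a guess to tune
)
```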
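
For the stopping strategy, a sketch of Keras early stopping; the patience value is a guess to tune:

```python
from tensorflow.keras.callbacks import EarlyStopping

# Stop once val_loss has not improved for `patience` epochs and keep the
# best weights seen so far.
early_stopping = EarlyStopping(
    monitor="val_loss", patience=10, restore_best_weights=True
)
# model.fit(..., validation_data=(X_val, y_val), callbacks=[early_stopping])
```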
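
For the linear-separability item, a scikit-learn sketch: if a linear-kernel SVM scores close to the network, the classes are likely close to linearly separable:

```python
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Try the linear kernel first; fall back to kernel="rbf" if accuracy is poor.
svm = make_pipeline(StandardScaler(), SVC(kernel="linear"))
# svm.fit(X_train, y_train); svm.score(X_test, y_test)
```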