pyxyyy/gesture-drumkit

TODOs for ML


// UPDATED

Model & Data Related:

  • fix class imbalance (oversample, or cut out some data)
  • fix data quantity issue (shape is (99, 4) now; it should be (100, 4), or else specify which sensor only has 49 rows)
  • maybe simplify the model (inference time on tf-lite has yet to be tested)
  • reduce up gestures being classified as down gestures (pushed a new model with much lower error margins; it may still need to do better)
  • try an alternative model (optional; could try Weka or other libraries that integrate easily with the Android app)
  • collect more data (we only have up/down now, at a fixed tempo)

App related:

  • put tflite into GestureRecognizer class
  • do simple estimate of inference duration
  • improve concurrency strategy in GestureRecognizer class

I'll be working on the app-related side.

fix class imbalance (oversampling or cutting out some data)

Solved by:

  1. Added NON_GESTURE_SAMPLING_RATE = 0.1 to preprocessing.ipynb, so only a random portion of non-gesture samples is taken. This also effectively speeds up slicing.
  2. In training.ipynb, count the csv files in each class and take the minimum as the common class size.
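
For reference, a minimal sketch of those two steps, assuming a `data/<label>/*.csv` layout and "up"/"down"/"none" class names (the actual notebook code may differ):

```python
import glob
import random

NON_GESTURE_SAMPLING_RATE = 0.1

def sample_non_gestures(slices):
    """Step 1: keep only a random fraction of the non-gesture slices."""
    return [s for s in slices if random.random() < NON_GESTURE_SAMPLING_RATE]

def balanced_file_lists(data_dir="data", labels=("up", "down", "none")):
    """Step 2: use the smallest per-class csv count as the common class size."""
    files = {label: glob.glob(f"{data_dir}/{label}/*.csv") for label in labels}
    common_size = min(len(fs) for fs in files.values())
    return {label: random.sample(fs, common_size) for label, fs in files.items()}
```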

fix data quantity issue (size of (99,4) now, should be (100,4) or specify which sensor has 49)

Fixed by passing header=None to pd.read_csv(); without it, pandas consumes the first sensor row as column names, which is where the missing row went.
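
In other words (the filename here is just illustrative):

```python
import pandas as pd

# Without header=None, pandas eats the first sensor row as column names,
# so a 100-row recording comes back as shape (99, 4).
df = pd.read_csv("some_recording.csv", header=None)
assert df.shape == (100, 4)
```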

Hmm, actually I think it would be better if we included all the negative data, since that's closer to test-time conditions. @pyxyyy, do continue the oversampling if you're working on it. You're probably more familiar with these concepts anyway.

Yeah, I agree oversampling is better; I'll keep at it then.
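
For what it's worth, a generic oversampling sketch along these lines, assuming windowed arrays X of shape (n, 100, 4) and integer labels y (not the notebook's actual code):

```python
import numpy as np

def oversample(X, y, seed=0):
    """Repeat-sample every minority class up to the majority class size."""
    rng = np.random.default_rng(seed)
    classes, counts = np.unique(y, return_counts=True)
    target = counts.max()
    # For each class, draw `target` indices with replacement, so small
    # classes are repeated until all classes have the same count.
    idx = np.concatenate([
        rng.choice(np.flatnonzero(y == c), size=target, replace=True)
        for c in classes
    ])
    rng.shuffle(idx)
    return X[idx], y[idx]
```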

Turns out that tf-lite works pretty well.
Inference is also pretty fast (I didn't benchmark it, but simple testing doesn't show any lag in recognition).
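
For the "simple estimate of inference duration" TODO, something like this would give rough desktop-side numbers — a minimal sketch, assuming the converted model sits at gesture_model.tflite (on-watch timing will of course differ):

```python
import time

import numpy as np
import tensorflow as tf

# Rough per-inference latency estimate; model path and input shape
# come from the converted model, everything else here is illustrative.
interpreter = tf.lite.Interpreter(model_path="gesture_model.tflite")
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

x = np.random.randn(*inp["shape"]).astype(np.float32)
interpreter.set_tensor(inp["index"], x)
interpreter.invoke()                      # warm-up run
_ = interpreter.get_tensor(out["index"])

runs = 100
start = time.perf_counter()
for _ in range(runs):
    interpreter.set_tensor(inp["index"], x)
    interpreter.invoke()
avg_ms = (time.perf_counter() - start) / runs * 1000
print(f"avg inference: {avg_ms:.2f} ms")
```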

Problems:

  • Up gestures are detected as down gestures.
    I've set the code to only treat down gestures as a beat. When an up or down gesture is detected, the recognizer waits for a 500 ms cooldown before predicting any more gestures (see the sketch after this list).
    When doing a "down, up" motion, the up motion is often detected as down as well.
    It shows up as double beats in the drumkit UI (right side of the screenshot).

[screenshot: drumkit UI showing double beats]

If I only do a "down" motion and keep my hand there, the recognizer only detects one beat.

  • the recognizer dies after the screen turns off
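
An illustrative Python version of the cooldown gating described in the first problem; the real logic lives in the app's GestureRecognizer class, so all names here are hypothetical:

```python
import time

COOLDOWN_S = 0.5
DOWN, UP, NON_GESTURE = 0, 1, 2

class CooldownGate:
    def __init__(self):
        self.last_gesture_at = float("-inf")

    def on_prediction(self, label):
        """Return "beat" when a down gesture should trigger a beat."""
        now = time.monotonic()
        if now - self.last_gesture_at < COOLDOWN_S:
            return None                     # still cooling down: ignore
        if label in (DOWN, UP):
            self.last_gesture_at = now      # any gesture restarts the cooldown
        return "beat" if label == DOWN else None
```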

Pushed a new model that should help a little with the up/down misclassification. Haven't had the chance to try it out on the watch yet (it will need to be run through to_tflite.py first).
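
For reference, a Keras-to-TFLite conversion along the lines of what to_tflite.py presumably does — a sketch assuming a TF 2.x-style API and an .h5 checkpoint, not the script's actual contents:

```python
import tensorflow as tf

# Load the trained Keras model (path is an assumption) and convert it
# to a flatbuffer the watch app can load.
model = tf.keras.models.load_model("gesture_model.h5")
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()
with open("gesture_model.tflite", "wb") as f:
    f.write(tflite_model)
```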

I'll try adding dropout, and maybe generating more non-gesture data; non-gestures tend to get mixed up with up gestures.
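
Roughly what adding dropout would look like — a hypothetical model sketch, with layer sizes assumed and the (100, 4) window shape taken from the discussion above:

```python
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Dense, Dropout, Flatten

# Dropout between the dense layers regularizes the model, which should
# help with the non-gesture / up-gesture confusion.
model = Sequential([
    Flatten(input_shape=(100, 4)),
    Dense(64, activation="relu"),
    Dropout(0.5),                    # zero out 50% of activations in training
    Dense(32, activation="relu"),
    Dropout(0.5),
    Dense(3, activation="softmax"),  # up / down / non-gesture
])
```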