STMicroelectronics/stm32ai-modelzoo

Unsatisfactory results

141391 opened this issue · 2 comments

141391 commented

Hello,
I have deployed the project to the hardware, but in actual testing (with the development kit about 15 to 20 cm from my palm) the accuracy of some gestures, such as BreakTime and FlatHand, is not very high. These gestures seem relatively difficult to recognize, which is very different from the expected accuracy obtained on the test set during training. Is this normal? What should I improve?
Looking forward to your reply!

Hello @141391,
The datasheet of the VL53L8CX sensor indicates a field of view of 45° both horizontally and vertically, which means that at a distance of 20 cm the sensor perceives roughly a 16 cm square. Depending on the size of your hands, it may be better to place the development kit 20 to 30 cm away from your palm so that the sensor also perceives the background behind your hand. Also, it is not very intuitive, but the BreakTime sign is detected when you make a timeout sign (basically a T with your hands).
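To illustrate the geometry above, here is a minimal sketch of how the perceived square scales with distance for a 45° field of view (the helper name `coverage_side_cm` is just illustrative, not part of any ST library):

```python
import math

def coverage_side_cm(distance_cm: float, fov_deg: float = 45.0) -> float:
    """Side length (cm) of the square area a sensor with the given
    field of view perceives at the given distance, assuming the FoV
    is symmetric about the optical axis."""
    return 2.0 * distance_cm * math.tan(math.radians(fov_deg / 2.0))

# At 20 cm the sensor sees a square of roughly 16.6 cm per side;
# at 30 cm the square grows to roughly 24.9 cm, leaving more
# background visible around a hand.
print(coverage_side_cm(20.0))
print(coverage_side_cm(30.0))
```

This is why moving the hand back a little lets the sensor capture background around it, which can help the model distinguish the hand's silhouette.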
The provided neural network is trained on a small dataset, so it is not very accurate. If you want to create a better dataset, optimized for the position and distance between your hand and the sensor, you can do so easily by following this tutorial.

141391 commented

Hello,
Thanks for your reply, it helps me a lot!
I will continue to optimize it.