This project is very experimental and in active development.
Experiments building a gesture recognition system using an Arduino, a Daydream controller, and a mobile phone, with TensorFlow.js.
Inspired by a similar project by Minko Gechev using the webcam.
Each project has 2 demos: one to play a game of Street Fighter, and one to predict magic wand movements.
Sprites used in the Street Fighter demo come from this Codepen and this repo.
Using an accelerometer/gyroscope (MPU6050 for the Arduino, and built-in sensors for the Daydream and phone), we can record the sensor data streamed while performing a gesture. By repeating and recording each gesture multiple times, we can feed all this data to a machine learning algorithm to find patterns in it. Once a model is trained, we can use it to classify new live data and use the result as input for an interface or device.
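As an illustration of that pipeline, here is a minimal sketch of how recorded gesture samples could be fed to a small TensorFlow.js classifier and then used to predict a gesture from live readings. The feature layout (100 readings of 3 axes per sample), the gesture labels, and the function names are assumptions for illustration, not the exact code used in the demos.

```js
// Sketch only: train a small classifier on recorded gesture samples.
// Assumes each recording is flattened into a fixed-length array of
// accelerometer/gyroscope readings (here: 100 timesteps x 3 axes).
const tf = require('@tensorflow/tfjs-node');

const NUM_FEATURES = 100 * 3;                    // assumption: 100 readings of [ax, ay, az]
const GESTURES = ['punch', 'hadoken', 'circle']; // placeholder labels

function buildModel() {
  const model = tf.sequential();
  model.add(tf.layers.dense({ inputShape: [NUM_FEATURES], units: 32, activation: 'relu' }));
  model.add(tf.layers.dense({ units: GESTURES.length, activation: 'softmax' }));
  model.compile({ optimizer: 'adam', loss: 'categoricalCrossentropy', metrics: ['accuracy'] });
  return model;
}

async function train(model, samples, labels) {
  // samples: array of flattened recordings; labels: gesture index per recording
  const xs = tf.tensor2d(samples, [samples.length, NUM_FEATURES]);
  const ys = tf.oneHot(tf.tensor1d(labels, 'int32'), GESTURES.length);
  await model.fit(xs, ys, { epochs: 30, shuffle: true });
  xs.dispose();
  ys.dispose();
}

function predictGesture(model, liveReadings) {
  // liveReadings: one flattened recording streamed from the sensor
  const input = tf.tensor2d([liveReadings], [1, NUM_FEATURES]);
  const scores = model.predict(input);
  const index = scores.argMax(-1).dataSync()[0];
  input.dispose();
  scores.dispose();
  return GESTURES[index];
}
```

The predicted label can then be mapped to whatever the demo needs, for example triggering a Street Fighter move or drawing a wand trail.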
See the arduino-mkr1000 folder.
See the daydream folder.
See the phone folder.
More details in this blog post.