syntagmatic/leap-play

Relative Gestures

Opened this issue · 2 comments

Thanks for the amazing gesture recording script. This is extremely useful, but obviously the same hand motion won't produce identical gesture data every time. Do you have any suggestions on how to create relative data points that can be registered as a gesture — something close, but not exact?

As a starting point, you can use the change in a value (such as the X, Y, Z coordinates) from the previous step, rather than the absolute value.
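As a quick sketch of that idea (the simplified position arrays here stand in for Leap frames, which expose a hand's palm position as an `[x, y, z]` array):

```javascript
// Compute per-axis deltas between two consecutive palm positions,
// so gesture tests can look at relative motion instead of absolute
// coordinates. Positions are [x, y, z] arrays, as in the Leap JS API.
function delta(prev, curr) {
  return [curr[0] - prev[0], curr[1] - prev[1], curr[2] - prev[2]];
}

// Example: two consecutive palm positions (values in millimeters)
var prevPos = [10, 200, -5];
var currPos = [60, 198, -4];
console.log(delta(prevPos, currPos)); // [50, -2, 1]
```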

The Leap JS library keeps a history of recent data points, or you could maintain this list yourself by modifying the gesture script. Just be sure to throw away data after 2-3 seconds, since by then it's no longer useful for detecting gestures.
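If you maintain the list yourself, it could look something like this rolling buffer with a time-based cutoff. The `{ t, pos }` sample shape is my own assumption for the sketch, not the Leap API:

```javascript
// Rolling history of position samples. Each push records a timestamped
// sample and drops anything older than maxAgeMs (e.g. 2-3 seconds),
// so gesture tests only ever see recent motion.
function History(maxAgeMs) {
  this.maxAgeMs = maxAgeMs;
  this.samples = [];
}

History.prototype.push = function (t, pos) {
  this.samples.push({ t: t, pos: pos });
  // Discard samples that have aged past the cutoff.
  var cutoff = t - this.maxAgeMs;
  while (this.samples.length && this.samples[0].t < cutoff) {
    this.samples.shift();
  }
};

// Usage: samples pushed more than 2.5s before the latest one are dropped.
var h = new History(2500);
h.push(0, [0, 200, 0]);
h.push(1000, [40, 200, 0]);
h.push(3000, [90, 200, 0]); // the t=0 sample is now too old
console.log(h.samples.length); // 2
```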

For example, to trigger a gesture when the user "swipes right", look for a large change in the X coordinate. A simple test like that might trigger accidentally due to noise (hands or fingers flickering), so you might try layering a few simple tests on top of each other.
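A hedged sketch of that layering, operating on a window of `{ t, pos }` samples like the ones above. The thresholds are illustrative guesses, not tuned values:

```javascript
// Detect a "swipe right" over a window of recent samples by stacking
// simple tests: a large positive change in X, plus only small changes
// in Y and Z to reject flicker and noise. Units are millimeters.
function isSwipeRight(samples) {
  if (samples.length < 2) return false;
  var first = samples[0].pos;
  var last = samples[samples.length - 1].pos;
  var dx = last[0] - first[0];             // must be a big move right
  var dy = Math.abs(last[1] - first[1]);   // must stay roughly level
  var dz = Math.abs(last[2] - first[2]);   // must stay roughly in plane
  return dx > 100 && dy < 40 && dz < 40;   // illustrative thresholds
}

// A clean rightward motion passes; a diagonal/noisy one does not.
var swipe = [{ t: 0, pos: [0, 200, 0] }, { t: 300, pos: [150, 210, -10] }];
var noise = [{ t: 0, pos: [0, 200, 0] }, { t: 300, pos: [150, 260, 0] }];
console.log(isSwipeRight(swipe)); // true
console.log(isSwipeRight(noise)); // false
```

Each extra test (you could also add a minimum speed, or require the motion to persist across several frames) cuts down false positives at the cost of a little sensitivity.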

If you're comfortable with Mongo and Node.js, I also created a simple database-backed gesture recorder. It assembles the gallery automatically from saved gestures, and you can click gesture names to see the diagnostics page. My plan is to eventually extend that diagnostics page to analyze all the saved data, to get some insight into which attributes are relevant for a particular movement type.

https://github.com/syntagmatic/prehensile

Thanks for the tip. I've started work on a relative gestures script in JS, focusing on the basics like you said (change in X, Y, Z). When it's done I'll post it to the OpenLeap repo for others to use.

I'm hoping that devs will be able to create any custom gesture with your recorder and be able to recognize it with my script. I'll let you know when I get something concrete finished.