Run `python3 trackbartest.py` (or `sudo python3 trackbartest.py` if you want the volume-changing actions to work).
Prerequisites are Python 3.5.x / 3.6.x with OpenCV 3.2.4 / 3.3.0 and numpy. The initialization step is raising both hands into the circle. Afterwards you can perform the steps showcased in the demo (we'll add a link once the video is up).
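If you want a quick sanity check of the environment before running the script, a minimal sketch like the one below (not part of the project itself) prints the interpreter and library versions it finds:

```python
# Minimal environment check for the prerequisites listed above; this script is
# an illustrative assumption, not part of the original project code.
import sys

import cv2
import numpy as np

print("Python :", sys.version.split()[0])
print("OpenCV :", cv2.__version__)
print("numpy  :", np.__version__)

assert sys.version_info >= (3, 5), "Python 3.5+ is required"
assert cv2.__version__.startswith("3."), "OpenCV 3.x is expected"
```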
In scenarios where you don't have access to a mouse and keyboard, one way to control a computer or TV is with gestures captured by a camera. This is often portrayed in sci-fi movies (including "Iron Man"), but is quite hard to implement. We decided to give it a try :)
Our project recognizes more than five different gestures. We bound some of them to actions on the computer, such as changing the volume or brightness. Additionally, we provide an AR-like experience for easier interaction. And since we are using a webcam, why not build a couple of trendy face filters, such as an Iron Man mask?
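As an illustration of how a recognized gesture might be bound to a system action, here is a hypothetical sketch for macOS that changes the output volume through `osascript`; the gesture names and the `set_volume` helper are assumptions for the example, not the project's actual code:

```python
# Hypothetical sketch: map gesture labels to system actions on macOS.
# The gesture names and set_volume helper are illustrative assumptions.
import subprocess

def set_volume(percent):
    """Set macOS output volume (0-100) via AppleScript."""
    subprocess.run(["osascript", "-e",
                    "set volume output volume {}".format(int(percent))])

ACTIONS = {
    "swipe_up": lambda: set_volume(80),
    "swipe_down": lambda: set_volume(20),
}

def handle_gesture(name):
    action = ACTIONS.get(name)
    if action:
        action()
```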
We built it in Python using OpenCV, with most of the gesture-detection algorithms written from scratch (except face detection, which uses Haar cascades). We also explored Hammerspoon, a Lua-scriptable automation tool for OS X, but ended up finding simpler substitutes for it.
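The face-detection step relies on OpenCV's stock Haar cascades; a minimal sketch of that approach is shown below (the cascade path, webcam index, and detection parameters are assumptions, so adjust them for your setup):

```python
# Sketch of Haar-cascade face detection with OpenCV.
import cv2

# Path to the frontal-face cascade bundled with OpenCV; adjust to your install.
CASCADE_PATH = "haarcascade_frontalface_default.xml"
face_cascade = cv2.CascadeClassifier(CASCADE_PATH)

cap = cv2.VideoCapture(0)  # default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("faces", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```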
It's really hard to detect human palms and fingers: hands don't have distinctive enough features for Haar cascades to work reliably, and simpler color-filtering methods are very sensitive to lighting and background.
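For context, here is a rough sketch of the kind of color filtering we mean; the HSV skin range below is an assumed guess, which is exactly why this approach breaks down under different lighting or a skin-colored background:

```python
# Illustrative HSV skin-color filter; the thresholds are assumptions and
# demonstrate why pure color filtering is fragile.
import cv2
import numpy as np

def skin_mask(frame_bgr):
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    lower = np.array([0, 40, 60], dtype=np.uint8)     # assumed lower skin bound
    upper = np.array([20, 150, 255], dtype=np.uint8)  # assumed upper skin bound
    mask = cv2.inRange(hsv, lower, upper)
    # Morphological cleanup helps, but cannot fix a skin-colored background.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    return mask
```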
Having gone through dozens of 3rd party implementations that showed poor results, we ended up building something from scratch that works really well when there is no background interference.
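For reference, one common from-scratch technique (not necessarily the exact method used in this project) isolates the hand as the largest contour in a binary mask and counts fingers via convexity defects:

```python
# Illustrative finger counting via convexity defects. Assumes `mask` is a
# reasonably clean binary hand mask (e.g. from background subtraction).
import cv2
import numpy as np

def count_fingers(mask):
    # findContours returns 3 values in OpenCV 3.x and 2 in 4.x; [-2] works for both.
    contours = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                cv2.CHAIN_APPROX_SIMPLE)[-2]
    if not contours:
        return 0
    hand = max(contours, key=cv2.contourArea)
    hull = cv2.convexHull(hand, returnPoints=False)
    defects = cv2.convexityDefects(hand, hull)
    if defects is None:
        return 0
    fingers = 0
    for i in range(defects.shape[0]):
        s, e, f, depth = defects[i, 0]
        start, end, far = hand[s][0], hand[e][0], hand[f][0]
        # Deep, acute valleys between hull points correspond to gaps between fingers.
        a = np.linalg.norm(end - start)
        b = np.linalg.norm(far - start)
        c = np.linalg.norm(end - far)
        angle = np.arccos((b ** 2 + c ** 2 - a ** 2) / (2 * b * c + 1e-6))
        if angle < np.pi / 2 and depth > 10000:  # depth is fixed-point (value * 256)
            fingers += 1
    return fingers + 1 if fingers else 0
```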
Implementing the missile launcher interface :)