The idea is to control your computer through your laptop webcam: detect specific movements and behaviors, and map each motion to an action.
This is a personal project whose goal is to fully implement and deploy a deep learning algorithm in a live setting, and to compare the complexity of a machine learning implementation against a hand-crafted one built with classical image processing methods such as those in OpenCV.
- Build a project from scratch to production
- Gain a real understanding of the hyperparameters
- Reproduce Google's Teachable Machine
- Make a model that runs on the CPU of a small laptop and can be deployed to process a webcam feed at 60 FPS
- Control Spotify
- Control home devices and mouse clicks on the computer
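The motion-to-action goals above (Spotify controls, clicks, home devices) can be sketched as a small dispatcher that sits between the per-frame classifier and the actions. The gesture labels, `hold_frames` value, and callbacks here are hypothetical placeholders; the debouncing is one plausible design so that a gesture held across many frames fires its action only once.

```python
from collections import deque

class GestureDispatcher:
    """Fire an action once per gesture, debounced over consecutive frames."""

    def __init__(self, actions, hold_frames=5):
        self.actions = actions          # e.g. {"swipe_left": callback}
        self.hold_frames = hold_frames  # frames a gesture must persist
        self.history = deque(maxlen=hold_frames)
        self.last_fired = None

    def update(self, label):
        """Call once per webcam frame with the classifier's predicted label."""
        self.history.append(label)
        stable = (len(self.history) == self.hold_frames
                  and len(set(self.history)) == 1)
        if stable and label != self.last_fired and label in self.actions:
            self.actions[label]()
            self.last_fired = label
        elif not stable:
            self.last_fired = None  # gesture may fire again after a break

if __name__ == "__main__":
    fired = []
    # Hypothetical mapping: a left swipe skips to the next Spotify track.
    d = GestureDispatcher({"swipe_left": lambda: fired.append("next_track")},
                          hold_frames=3)
    for label in ["idle", "swipe_left", "swipe_left",
                  "swipe_left", "swipe_left", "idle"]:
        d.update(label)
    print(fired)  # the action fires exactly once despite 4 "swipe_left" frames
```

Requiring the label to be stable for a few frames also hides single-frame classifier glitches, which matters at 60 FPS where a spurious prediction lasting one frame would otherwise trigger an action.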
- Hand tracking recognition
- Finger tracking tutorial
- Detecting Hands and Recognizing Activities in Complex Egocentric Interactions
- Handmap blog
- Scikit image documentation
- Deep Learning for Integrated Hand Detection and Pose Estimation
- Deep Learning Based Hand Detection in Cluttered Environment Using Skin Segmentation
- Real-Time Full Hand Tracking - and the GitHub repo
- Nice tutorial on Stack Overflow
- Paper on real time pose estimation - Repo and Chainer implementation
- Tutorial for Tensorflow implementation
- Database for hand gesture recognition
- Hand Dataset
- VIVA hand detection benchmark
- EgoHands
- Mujah dataset
- Google dataset
- CelebA
- Labelled faces in the wild