
A reimagined, gesture-controlled car experience. Use hand gestures to control your music, take phone calls, and send Slack messages, all from your car. Featured at the Google TensorFlow Community Event!


DepthSense

Inspiration

Modern luxury cars often provide gesture-control systems for their hardware and audio that transform driving from a mere necessity into an enjoyable experience. Our aim was to build a platform that expands these features into a universally compatible standard. DepthSense is designed to work seamlessly with any car, and we believe it can change the way we drive. Because gestures are so intuitive, DepthSense also has significant advantages over voice control.

Functionality

DepthSense is a mobile application that uses AI and camera vision to control Spotify playback with hand gestures while driving. It uses the TensorFlow.js handpose model to estimate the hand's pose in 3D space and detect the corresponding gestures. We then hook the detected gestures up to the Spotify API, authenticated using OAuth 2.0, to play your favourite music!
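As a rough sketch of how gesture detection can sit on top of the handpose output: the model returns 21 3D landmarks per hand (index 0 is the wrist; fingertips are at indices 4, 8, 12, 16, 20), and a simple heuristic over those landmarks can distinguish coarse gestures. The heuristic and the gesture names below are illustrative assumptions, not DepthSense's actual classifier.

```javascript
// Sketch: classify a coarse gesture from the 21 3D landmarks produced by
// the TensorFlow.js handpose model. Landmark indices follow the handpose
// convention: 0 = wrist; fingertips at 4, 8, 12, 16, 20; the corresponding
// knuckle (MCP) joints at 2, 5, 9, 13, 17.

function distance(a, b) {
  return Math.hypot(a[0] - b[0], a[1] - b[1], a[2] - b[2]);
}

// Heuristic (an assumption, not DepthSense's real logic): a finger counts
// as "extended" when its tip is farther from the wrist than its knuckle.
function classifyGesture(landmarks) {
  const wrist = landmarks[0];
  const fingers = [
    [4, 2],   // thumb:  [tip, knuckle]
    [8, 5],   // index
    [12, 9],  // middle
    [16, 13], // ring
    [20, 17], // pinky
  ];
  const extended = fingers.filter(
    ([tip, knuckle]) =>
      distance(landmarks[tip], wrist) > distance(landmarks[knuckle], wrist)
  ).length;

  if (extended >= 4) return 'open_palm'; // could map to "play"
  if (extended <= 1) return 'fist';      // could map to "pause"
  return 'unknown';
}
```

In the app, `classifyGesture` would be called on each prediction from `handpose.estimateHands(video)`, with some smoothing across frames to avoid flicker between gestures.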

Tech stack

  • React Native + Expo
  • TensorFlow.js handpose model
  • Spotify API
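To show how the pieces connect, here is a sketch of wiring a recognized gesture to Spotify's Web API player endpoints (`/me/player/play`, `/me/player/pause`, `/me/player/next` are real endpoints; the gesture-to-endpoint mapping is an illustrative assumption, and `accessToken` is assumed to come from the OAuth 2.0 flow mentioned above).

```javascript
// Sketch: map classified gestures to Spotify Web API playback calls.
// The gesture names and their mapping are hypothetical examples.
const GESTURE_ENDPOINTS = {
  open_palm: { method: 'PUT', path: '/me/player/play' },
  fist: { method: 'PUT', path: '/me/player/pause' },
  swipe_right: { method: 'POST', path: '/me/player/next' },
};

// Build the HTTP request for a gesture; returns null for unknown gestures.
function buildSpotifyRequest(gesture, accessToken) {
  const action = GESTURE_ENDPOINTS[gesture];
  if (!action) return null;
  return {
    method: action.method,
    url: `https://api.spotify.com/v1${action.path}`,
    headers: { Authorization: `Bearer ${accessToken}` },
  };
}

// Fire the request (e.g. from the camera loop after classification).
async function handleGesture(gesture, accessToken) {
  const req = buildSpotifyRequest(gesture, accessToken);
  if (!req) return; // ignore gestures we don't recognize
  await fetch(req.url, { method: req.method, headers: req.headers });
}
```

Keeping request construction separate from the network call makes the mapping easy to unit-test and to extend with new gestures.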

Demo video

https://www.youtube.com/watch?v=bwnxjMELG5A