Real-time body reconstruction and recognition in virtual reality using Vive Trackers and Controllers
This project implements an accurate full-body recognition approach for VR applications using HTC Vive Trackers and Controllers. Motion recognition is based on Hidden Markov Models; currently, the Yoga Warrior I pose is recognized with an accuracy of 88%, and additional models can be trained for other motions. Furthermore, we implemented an Inverse Kinematics solver to animate the avatar's motions as smoothly, quickly, and accurately as possible. Using an HTC Vive headset, two Vive Controllers held in the hands, and three Vive Trackers attached to the feet and waist, it is possible to create immersive VR experiences in which the user perceives the avatar as their own body. The user can see the avatar from the first-person perspective and in a virtual mirror.
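To illustrate the recognition idea, the sketch below scores an observation sequence against a discrete HMM using the forward algorithm with per-step scaling. The state and symbol counts, the model parameters, and the quantization of tracker motion into discrete symbols are placeholders for illustration only, not the models trained in this project.

```cpp
#include <cstdio>
#include <cmath>
#include <vector>

// Minimal discrete HMM, for illustration only.
struct HMM {
    std::vector<double> pi;              // initial state probabilities
    std::vector<std::vector<double>> A;  // state transition matrix
    std::vector<std::vector<double>> B;  // emission probabilities per state
};

// Forward algorithm: log P(observations | model), scaled each step
// to avoid numerical underflow on long sequences.
double logLikelihood(const HMM& hmm, const std::vector<int>& obs) {
    const size_t N = hmm.pi.size();
    std::vector<double> alpha(N);
    double logProb = 0.0;

    for (size_t t = 0; t < obs.size(); ++t) {
        std::vector<double> next(N, 0.0);
        for (size_t j = 0; j < N; ++j) {
            double sum = 0.0;
            if (t == 0) sum = hmm.pi[j];
            else for (size_t i = 0; i < N; ++i) sum += alpha[i] * hmm.A[i][j];
            next[j] = sum * hmm.B[j][obs[t]];
        }
        double scale = 0.0;
        for (double v : next) scale += v;
        if (scale <= 0.0) return -INFINITY;  // impossible observation
        for (double& v : next) v /= scale;
        logProb += std::log(scale);
        alpha = next;
    }
    return logProb;
}

int main() {
    // Toy 2-state, 3-symbol model. A motion would be accepted as a known
    // pose if its log-likelihood exceeds a threshold learned in training.
    HMM hmm;
    hmm.pi = {0.6, 0.4};
    hmm.A  = {{0.7, 0.3}, {0.4, 0.6}};
    hmm.B  = {{0.5, 0.4, 0.1}, {0.1, 0.3, 0.6}};

    std::vector<int> observations = {0, 1, 2, 1, 0};
    std::printf("log-likelihood: %f\n", logLikelihood(hmm, observations));
    return 0;
}
```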
- Node.js
- Visual Studio or Xcode
git clone --recursive git@github.com:CatCuddler/BodyTracking.git
cd BodyTracking/BodyModel/Kore
git checkout master
git submodule update --init --recursive
cd ..
node Kore/make --vr steamvr
Currently, the Metal backend does not work; use OpenGL instead:
node Kore/make -g opengl
Open the Visual Studio or Xcode project in BodyModel/build.
Switch to the "Release" configuration.
Change the working directory in Xcode: Edit Scheme -> Use custom working directory -> choose the deployment directory.
- Strap one Vive Tracker to your left foot and another to your right foot (above the ankles)
- Strap the third Vive Tracker to your waist
- Hold one Vive Controller in each hand ;)
- Start the project. You will see an avatar standing in a T-pose.
- Press the "grip button" to set the size of the avatar (you must look straight ahead)
- Go to where the avatar stands and place your feet and hands in the same positions as the avatar's.
- Make sure that the green arrows for the controllers are pointing straight ahead.
- Press the "menu button" to calibrate the avatar.