RLab_Facetracking



Welcome to Facetracking with the RLab!

Hello! This page collects all of our facetracking repos in one location. With ARKit and an iPhone equipped with a front-facing TrueDepth camera, you can track your facial features and use them to animate digital avatars. The avatars first have to be prepared with vertex-level morph targets called blendshapes; ARKit uses 52 of them to animate an avatar's face. 3D modelling is a craft that takes a long time to master, and setting up all of these blendshapes can be daunting, so we've developed a workflow that lowers the barrier and gets your character up and running in a matter of minutes by leveraging three free avatar creation tools: Reallusion, Adobe Fuse, and MakeHuman. All three programs ship with more than enough blendshapes to animate a face; they just need to be renamed to the ARKit conventions. We've written a simple script to automate a lot of this process.
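The core of that renaming step can be sketched in a few lines of Python. The ARKit target names below (eyeBlinkLeft, jawOpen, and so on) are genuine ARKit blendshape identifiers, but the source-side names and the function name are illustrative placeholders, not the actual names our script uses:

```python
# Minimal sketch of renaming blendshapes to ARKit conventions.
# Source names here are hypothetical examples; a real mapping would
# cover all 52 ARKit blendshape targets for each avatar tool.
BLENDSHAPE_MAP = {
    "Blink_Left": "eyeBlinkLeft",
    "Blink_Right": "eyeBlinkRight",
    "Mouth_Open": "jawOpen",
    "Smile_Left": "mouthSmileLeft",
    "Smile_Right": "mouthSmileRight",
}

def rename_blendshapes(names):
    """Map source blendshape names to ARKit names; pass unknown names through unchanged."""
    return [BLENDSHAPE_MAP.get(name, name) for name in names]
```

In Maya, the same idea applies to the target names on a blendShape node: look each one up in the mapping and rename it if a match is found.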

Authors

Todd Bryant, Kat Sullivan, Grant Ng and Jiuxin Zhu

From the RLab!

Greetings from the RLab! To learn more about us and what we do, visit our website.

  • All of the software listed here is free to download.

Table of Contents

  1. Avatar Creation and Exporting
  2. Running our Scripts in Maya
  3. Importing In the Unreal Engine