The project aims to animate a metahuman in an augmented reality application.
This is the repository for the Computer Vision 2021-2022 course project @ UniTN, made by Francesco Laiti and Davide Lobba.
The game engine used is Unity version 2021.3 and the application was tested on iOS and Android platforms.
The 3D humanoid model used in this project is SMPL-X.
For the animation of the metahuman, we used the OptiTrack system available at the Multisensory Interactions Lab to track body movements.
We propose a scenario where the metahuman acts as a personal trainer. There are four exercises: warm-up, first training phase, second training phase, and stretching.
- Install Unity version 2021.3 (NOTE: a different version is not guaranteed to work properly)
- Clone this repository
- Open the scene cv_2022
- Ensure that you have installed the ARCore or ARKit packages in Unity: Window > Package Manager > install AR Foundation, ARCore XR Plugin, ARKit XR Plugin
- Go to Edit > Project Settings > XR Plug-in Management and check the ARKit or ARCore box
- Go to File > Build Settings > choose your platform > Switch Platform
- Build and run the project with File > Build And Run
Now you are ready to deploy the application on your device! 🚀
Otherwise, if you would like to test the scene in the Unity simulator, you will have to adapt the code and the scene accordingly.
NOTE:
- For the iOS world, ARKit requires iOS 11.0 or later and an iOS device with an A9 or later processor. You also need Xcode (only available on macOS) 😢
- For the Android world, you have to check if your device supports ARCore. Check it on https://developers.google.com/ar/devices
- The application was tested on an iPhone XS with iOS 15.5
For the animation of the metahuman, we used a JSON file structured as follows:
.
├── Frame # Number of the frame
│ ├── Trans # Root translation
│ ├── Fullpose # List of coordinates x,y,z for each joint
│ └── Data # Coordinates of one joint
└── ...
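As a rough illustration, a file with this layout could be read from a Unity C# script as sketched below. The class and field names are assumptions based on the tree above, not the project's actual code, and the example uses Newtonsoft Json.NET (available in Unity via the com.unity.nuget.newtonsoft-json package) because the built-in JsonUtility does not handle nested arrays.

```csharp
using System.Collections.Generic;
using Newtonsoft.Json; // e.g. via the com.unity.nuget.newtonsoft-json package

// Hypothetical C# mirror of the JSON layout sketched above;
// the field names in the project's actual file may differ.
public class MocapFrame
{
    public float[] Trans;          // root translation (x, y, z)
    public List<float[]> Fullpose; // one (x, y, z) entry per joint
}

public static class MocapLoader
{
    // Assumes the file maps each frame number to its pose data.
    public static Dictionary<string, MocapFrame> Load(string jsonText)
    {
        return JsonConvert.DeserializeObject<Dictionary<string, MocapFrame>>(jsonText);
    }
}
```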
We created the C3D file using the OptiTrack wearable suit together with a system of cameras and markers; in particular, we used the Motive application to record the movements.
Then, to convert the C3D file into a JSON file, we used SOMA and MoSh++.
We then read the JSON file in the SMPL-X script and animated the metahuman.
Next, we implemented a proximity trigger: the animation of the metahuman starts when we are close to it. We chose a threshold of 4 meters, but you can change it in the SMPL-X script.
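A minimal sketch of how such a proximity trigger can look in Unity is shown below, assuming the animation is driven by an Animator component; the actual SMPL-X script in this repository may handle it differently.

```csharp
using UnityEngine;

// Minimal sketch of a distance-based trigger; the field names and the way
// the animation is started are assumptions, not the project's actual code.
public class ProximityAnimationTrigger : MonoBehaviour
{
    public Transform arCamera;         // the AR camera following the device
    public float triggerDistance = 4f; // meters, same threshold mentioned above
    public Animator animator;          // whatever component drives the metahuman

    void Update()
    {
        float distance = Vector3.Distance(arCamera.position, transform.position);
        // Play while the user is within range, pause otherwise.
        animator.enabled = distance <= triggerDistance;
    }
}
```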
We tested the whole project in the Unity simulator to detect any errors or bugs before switching to AR.
For the AR implementation, we used the AR Foundation framework provided by Unity. In particular, we created an AR Session and an AR Camera: when you start the application, the main camera of the scene becomes the camera of your smartphone.
Later, we implemented the ground plane: when you start the application, the camera detects planes in the room and you can choose where to instantiate the metahuman. Note that the metahuman is a child of the ground plane, so its position depends on the ground plane.
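For reference, tap-to-place on a detected plane with AR Foundation typically looks like the sketch below; the prefab reference and parenting logic here are placeholders, not the scripts used in this repository.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Sketch of tap-to-place with AR Foundation; the prefab reference and
// parenting logic are assumptions, not this project's actual scripts.
[RequireComponent(typeof(ARRaycastManager))]
public class TapToPlace : MonoBehaviour
{
    public GameObject metahumanPrefab;

    ARRaycastManager raycastManager;
    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Awake()
    {
        raycastManager = GetComponent<ARRaycastManager>();
    }

    void Update()
    {
        if (Input.touchCount == 0) return;

        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Raycast from the tap position against the detected planes.
        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            Pose hitPose = hits[0].pose;
            // Parent the metahuman to the plane so its position follows the plane.
            var plane = hits[0].trackable as ARPlane;
            Instantiate(metahumanPrefab, hitPose.position, hitPose.rotation,
                        plane != null ? plane.transform : null);
        }
    }
}
```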
- Open the application. The device scans your environment and tries to find a ground plane where the metahuman will be instantiated
- Select one of the four modes available
- Place the metahuman wherever you want by tapping on the display 👆
- The animation is triggered when the camera is near the metahuman, so if you are far away, move closer to it. The distance from the metahuman is shown at the top of the screen.
- Enjoy the animation! ⚡
We provide two GIFs to show what the app looks like:
In the first GIF, we instantiate the SMPL-X in a certain position and animate it when we get close to it.
In the second GIF, we follow the same steps as before, but when we move away from the object (more than 4 meters), the animation correctly stops.
- Implement more animations
- Add voice to the SMPL-X
- Use a more realistic texture
- Migrate the project to Unreal Engine
Distributed under the MIT License. See LICENSE for more information.
Francesco Laiti - GitHub - LinkedIn - UniTN email | Davide Lobba - GitHub - LinkedIn - UniTN email
This project is only for educational purposes.
We thank the MMLab and the University of Trento for the opportunity to use the OptiTrack system available at the Multisensory Interactions Lab.
We used the SMPL-X 3D human model available at https://smpl-x.is.tue.mpg.de/