Welcome to the WANDR (CVPR 2024) project repository! Thank you for your interest in our work. Feel free to open issues or email me about any suggestions or problems you encounter. This guide will help you set up and get started with the project.

This initial commit allows you to download and run our model. Code and data for training and evaluation will follow soon. Stay tuned!
- Markos Diomataris (1,2)
- Nikos Athanasiou (1)
- Omid Taheri (1)
- Xi Wang (2)
- Otmar Hilliges (2)
- Michael J. Black (1)
- (1) Max Planck Institute for Intelligent Systems, Tübingen, Germany
- (2) ETH Zürich, Switzerland
Follow these steps to set up your environment and download the necessary resources.
- Visit the SMPL-X website.
- Create an account if you don't have one.
- Navigate to the "Download" section.
- Download SMPL-X v1.1 and place it in the root directory of this project.
Run the `scripts/setup.sh` script to set up your environment:

```bash
sh scripts/setup.sh
```
The script will:

- Extract the previously downloaded SMPL-X zip file and place its contents under `./data/body_models`.
- Set up a Python virtual environment named `wandr_env`.
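If you want to sanity-check the extraction yourself, here is a minimal sketch (a hypothetical helper, not part of the repo) that looks for the usual SMPL-X `.npz` model files. It assumes the archive extracts to an `smplx/` subfolder; adjust the names and paths if your download is laid out differently.

```python
from pathlib import Path

# Assumed filenames from the SMPL-X v1.1 release; adjust if your copy differs.
EXPECTED = ["SMPLX_NEUTRAL.npz", "SMPLX_MALE.npz", "SMPLX_FEMALE.npz"]

def missing_body_models(root="data/body_models"):
    """Return the expected SMPL-X model files not found under root/smplx."""
    base = Path(root) / "smplx"
    return [name for name in EXPECTED if not (base / name).exists()]

if __name__ == "__main__":
    missing = missing_body_models()
    if missing:
        print("Missing model files:", ", ".join(missing))
    else:
        print("SMPL-X body models found.")
```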
After running the setup script, activate the virtual environment and run the demo:

```bash
source wandr_env/bin/activate
python demo.py
```
This should generate an `output.mp4` rendering of the produced motion.
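To confirm the demo ran end to end, a tiny check (hypothetical, not part of the repo) that the rendered video exists and is non-empty:

```python
from pathlib import Path

def demo_output_ok(path="output.mp4"):
    """Return True if the demo's rendered video exists and is non-empty."""
    p = Path(path)
    return p.is_file() and p.stat().st_size > 0

if __name__ == "__main__":
    print("demo output ok:", demo_output_ok())
```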
```bibtex
@inproceedings{diomataris2024wandr,
  title = {{WANDR}: Intention-guided Human Motion Generation},
  author = {Diomataris, Markos and Athanasiou, Nikos and Taheri, Omid and Wang, Xi and Hilliges, Otmar and Black, Michael J.},
  booktitle = {Proceedings IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
  year = {2024},
}
```
This repository was implemented by Markos Diomataris. For questions, contact me at markos.diomataris@tuebingen.mpg.de.