UniversalHumanoidControl

Official Implementation of the Universal Humanoid Controller in Mujoco. Supports Kinpoly (NeurIPS 2021) and EmbodiedPose (NeurIPS 2022).


UHC: Universal Humanoid Controller

TODOS:

  • Tutorial for smpl_robot
  • Runnable and trainable code for the SMPL model for motion imitation
  • Data processing code for the AMASS dataset
  • Perpetual Humanoid Controller Support (No RFC model)

News 🚩

[June 5, 2023] Since there is growing interest in using smpl_robot in Isaac Gym (e.g. in Trace & Pace), I uploaded smpl_local_robot.py, which is compatible with Isaac Gym.

[March 31, 2023] Added the implicit-shape model: implicit RFC + training with different SMPL body shapes. Added AMASS data processing code.

[February 24, 2023] Full code runnable.

[February 22, 2023] Evaluation code released.

Introduction

In this project, we develop the Universal Humanoid Controller used in our projects embodiedpose, kin_poly, and agent_design. It is a physics-based humanoid controller trained with reinforcement learning to imitate reference human motion. UHC is task-agnostic and takes only reference frames as input. This repository uses the MuJoCo simulator. The controller relies heavily on residual force control to keep the humanoid stable, and we are actively working on reducing this dependency. Here are a few highlights of the controller:

  • Supports controlling humanoids constructed from SMPL, SMPL-H, and SMPL-X models, of all genders and body shapes.
  • Causal: takes a single frame of reference motion as input.
  • Supports optimizing the humanoid's body shape parameters based on Transform2Act and agent_design.
  • Can simulate multiple humanoids in the same scene, though only as a proof of concept.
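To make the residual force control (RFC) idea above concrete, here is a minimal, hedged sketch of a per-step control computation: a stable-PD-style torque toward one reference frame, on top of which RFC would add a learned residual root force. All names, gains, and shapes here are illustrative; the actual UHC policy outputs these quantities via a learned network.

```python
# Illustrative only: a PD torque toward a single reference frame.
# In UHC, the PD targets (and the residual root force) come from
# the learned policy; kp/kd values below are made up.
def pd_torque(q_ref, q, qd, kp=500.0, kd=50.0):
    """PD-style torque pulling joint angles toward the reference."""
    return [kp * (r - a) - kd * v for r, a, v in zip(q_ref, q, qd)]

q_ref = [0.1, -0.2, 0.3]   # one frame of reference joint angles
q     = [0.0,  0.0, 0.0]   # current joint angles
qd    = [0.0,  0.0, 0.0]   # current joint velocities

tau = pd_torque(q_ref, q, qd)
# An RFC term (a residual force/torque applied at the root) would be
# added on top of these joint torques to help keep the humanoid stable.
```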

Dependencies

To create the environment, follow these instructions:

  1. Create a new conda environment and install PyTorch:
conda create -n uhc python=3.8
conda install pytorch torchvision torchaudio pytorch-cuda=11.6 -c pytorch -c nvidia # install pytorch
pip install -r requirements.txt
  2. Download and set up MuJoCo 2.1.0:
wget https://github.com/deepmind/mujoco/releases/download/2.1.0/mujoco210-linux-x86_64.tar.gz
tar -xzf mujoco210-linux-x86_64.tar.gz
mkdir ~/.mujoco
mv mujoco210 ~/.mujoco/
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:~/.mujoco/mujoco210/bin

SMPL Robot

The SMPL Robot, an adaptation of the Robot class from Transform2Act, is an automatic humanoid generation class that supports the SMPL, SMPL-H, and SMPL-X models. It creates an XML humanoid model file for the MuJoCo simulator and can generate humanoids of different genders and body shapes. It supports both capsule-based and mesh-based models. We use SMPL Robot to create SMPL humanoids on the fly when training our UHC with different body shapes. To run SMPL Robot, use:


python uhc/smpllib/smpl_robot.py
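The generator's output is an MJCF (MuJoCo XML) model. As a toy illustration of the kind of document it emits, here is a single-body sketch; the real generator derives body sizes, joints, and geoms from the SMPL shape parameters, and every name and number below is made up.

```python
# Toy MJCF sketch: one free-floating capsule body.
# The actual smpl_robot output has the full SMPL kinematic tree.
import xml.etree.ElementTree as ET

mujoco = ET.Element("mujoco", model="smpl_humanoid")
world = ET.SubElement(mujoco, "worldbody")
pelvis = ET.SubElement(world, "body", name="Pelvis", pos="0 0 0.9")
ET.SubElement(pelvis, "joint", type="free")          # floating root
ET.SubElement(pelvis, "geom", type="capsule", size="0.09 0.07")

xml_str = ET.tostring(mujoco, encoding="unicode")
```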

To generate Isaac Gym compatible humanoid models, use:

python uhc/smpllib/smpl_local_robot.py 

This model supports the mesh and non-mesh based humanoids used in Trace & Pace. There are a number of settings you can play with, such as "upright_start", which generates a humanoid that starts in an upright position facing the x-direction (the default SMPL humanoid faces upward).

You will need to download the SMPL model files from SMPL, SMPL-H, and SMPL-X, and unzip them into the data/smpl folder. Please download the v1.1.0 version, which contains the neutral model. Rename the files basicmodel_neutral_lbs_10_207_0_v1.1.0.pkl, basicmodel_m_lbs_10_207_0_v1.1.0.pkl, and basicmodel_f_lbs_10_207_0_v1.1.0.pkl to SMPL_NEUTRAL.pkl, SMPL_MALE.pkl, and SMPL_FEMALE.pkl. The file structure should look like this:


|-- data
    |-- smpl
        |-- SMPL_NEUTRAL.pkl
        |-- SMPL_MALE.pkl
        |-- SMPL_FEMALE.pkl
        |-- SMPLH_NEUTRAL.pkl
        |-- SMPLH_MALE.pkl
        |-- SMPLH_FEMALE.pkl
        |-- SMPLX_NEUTRAL.pkl
        |-- SMPLX_MALE.pkl
        |-- SMPLX_FEMALE.pkl
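The renaming step above can be scripted. This is a hedged sketch, assuming the v1.1.0 file names listed above; the demo runs against a throwaway temp directory rather than your real data/smpl folder.

```python
# Sketch: rename the downloaded SMPL v1.1.0 files to the names UHC expects.
from pathlib import Path
import tempfile

RENAMES = {
    "basicmodel_neutral_lbs_10_207_0_v1.1.0.pkl": "SMPL_NEUTRAL.pkl",
    "basicmodel_m_lbs_10_207_0_v1.1.0.pkl": "SMPL_MALE.pkl",
    "basicmodel_f_lbs_10_207_0_v1.1.0.pkl": "SMPL_FEMALE.pkl",
}

def rename_smpl_models(smpl_dir):
    smpl_dir = Path(smpl_dir)
    for src, dst in RENAMES.items():
        path = smpl_dir / src
        if path.exists():
            path.rename(smpl_dir / dst)

# Demo on a throwaway directory with empty placeholder files.
with tempfile.TemporaryDirectory() as tmp:
    for name in RENAMES:
        (Path(tmp) / name).touch()
    rename_smpl_models(tmp)
    renamed = sorted(p.name for p in Path(tmp).iterdir())
```

To use it for real, call `rename_smpl_models("data/smpl")` after unzipping the downloads there.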

Data processing for training & evaluating UHC

UHC is trained on the AMASS dataset. First, download the AMASS dataset from AMASS. Then, run the following script on the unzipped data:

python uhc/data_process/process_amass_raw.py

which dumps the data into the amass_db_smplh.pt file. Then, run

python uhc/data_process/process_amass_db.py

For processing your own SMPL data for evaluation, refer to:

python uhc/data_process/process_smpl_data.py
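For orientation, raw AMASS sequences are .npz archives whose per-frame SMPL-H pose parameters are stored at the mocap frame rate; the key names below follow the common AMASS convention but should be checked against your download, and the downsampling-by-striding shown here is only a hedged sketch of one step such processing typically involves, not the repo's actual pipeline.

```python
# Hedged sketch: downsample a motion sequence to the simulation rate.
def downsample(frames, src_fps, dst_fps=30):
    """Keep every (src_fps / dst_fps)-th frame (nearest integer stride)."""
    stride = max(1, round(src_fps / dst_fps))
    return frames[::stride]

# Synthetic stand-in for one AMASS sequence: 2 seconds at 120 fps.
seq = {
    "poses": [[0.0] * 156 for _ in range(240)],   # per-frame SMPL-H pose params
    "trans": [[0.0, 0.0, 0.9] for _ in range(240)],
    "mocap_framerate": 120.0,
}

n_frames = len(downsample(seq["poses"], seq["mocap_framerate"]))
```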

Trained models

Download the pretrained models using the download_data script:

bash download_data.sh

Evaluation

python scripts/eval_uhc.py --cfg uhc_implicit --epoch 19000 --data sample_data/amass_copycat_take5_test_small.pkl
python scripts/eval_uhc.py --cfg uhc_implicit_shape --epoch 4700 --data sample_data/amass_copycat_take5_test_small.pkl
python scripts/eval_uhc.py --cfg uhc_explicit --epoch 5000 --data sample_data/amass_copycat_take5_test_small.pkl

For computing statistics (MPJPE, success rate, etc.), use the --mode stats flag.
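As a rough illustration of what these statistics measure, here is a hedged sketch of MPJPE (mean per-joint position error) and a drift-based success criterion; the repo's own evaluation code may differ in units, joints used, and thresholds, and the 0.5 threshold below is an assumption.

```python
import math

def mpjpe(pred, gt):
    """Mean per-joint position error (same units as the inputs)."""
    dists = [math.dist(p, g) for p, g in zip(pred, gt)]
    return sum(dists) / len(dists)

def success(pred_root, gt_root, threshold=0.5):
    """A clip 'succeeds' if the root never drifts farther than
    `threshold` from the reference (threshold here is assumed)."""
    return all(math.dist(p, g) < threshold for p, g in zip(pred_root, gt_root))

# Two joints, one frame: only the first joint is off by 0.1.
pred = [[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]]
gt   = [[0.1, 0.0, 0.0], [1.0, 0.0, 0.0]]
err = mpjpe(pred, gt)   # ~0.05
ok = success(pred, gt)
```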

Training models

python scripts/train_uhc.py --cfg uhc_implicit_shape

Viewer Shortcuts

Keyboard   Function
Q          Next sample
Space      Pause
B          Hide expert
N          Separate expert and imitation
M          Hide imitation
T          Record screenshot
V          Record video

Citation

If you find our work useful in your research, please cite our papers embodiedpose, kin_poly, and agent_design.

@inproceedings{Luo2022EmbodiedSH,
  title={Embodied Scene-aware Human Pose Estimation},
  author={Zhengyi Luo and Shun Iwase and Ye Yuan and Kris Kitani},
  booktitle={Advances in Neural Information Processing Systems},
  year={2022}
}

@inproceedings{Luo2021DynamicsRegulatedKP,
  title={Dynamics-Regulated Kinematic Policy for Egocentric Pose Estimation},
  author={Zhengyi Luo and Ryo Hachiuma and Ye Yuan and Kris Kitani},
  booktitle={Advances in Neural Information Processing Systems},
  year={2021}
}

@article{Luo2022FromUH,
  title={From Universal Humanoid Control to Automatic Physically Valid Character Creation},
  author={Zhengyi Luo and Ye Yuan and Kris M. Kitani},
  journal={ArXiv},
  year={2022},
  volume={abs/2206.09286}
}

References

This repository is built on top of the following amazing repositories:

  • Part of the UHC code is from: rfc
  • SMPL models and layers are from: SMPL-X model
  • Feature extractors are from: SPIN
  • NN modules are from (khrylib): DLOW