3D-Human-Pose-Perception-from-Egocentric-Stereo-Videos (CVPR 2024)

Official PyTorch inference code of our CVPR 2024 paper, "3D Human Pose Perception from Egocentric Stereo Videos".

For any questions, please contact the first author, Hiroyasu Akada [hakada@mpi-inf.mpg.de].

[Project Page] [Benchmark Challenge]

Citation

    @inproceedings{hakada2024unrealego2,
      title = {3D Human Pose Perception from Egocentric Stereo Videos},
      author = {Akada, Hiroyasu and Wang, Jian and Golyanik, Vladislav and Theobalt, Christian},
      booktitle = {Computer Vision and Pattern Recognition (CVPR)},
      year = {2024}
    }

UnrealEgo2/UnrealEgo-RW Datasets

Download

You can download the UnrealEgo2/UnrealEgo-RW datasets from our benchmark challenge page.

Depths from SfM/Metashape

You can download the depth data from SfM/Metashape described in our paper.

Note that these depth data are different from the synthetic depth maps available on our benchmark challenge page.

Implementation

Dependencies

We tested our code with the following dependencies:

  • Python 3.9
  • Ubuntu 18.04
  • PyTorch 2.0.0
  • CUDA 11.7

Please install the other dependencies:

pip install -r requirements.txt    
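
To verify that the installed versions match the ones listed above, a quick check may help (this one-liner is just a convenience, not part of the repository):

    # Print the Python, PyTorch, and CUDA versions actually in use.
    python -c "import sys, torch; print(sys.version.split()[0], torch.__version__, torch.version.cuda)"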

Inference

Trained models

You can download our trained models. Please save them in ./log/(experiment_name).
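
For example, assuming a downloaded checkpoint for a hypothetical experiment name my_experiment (replace it with the actual experiment name of the model you downloaded):

    # "my_experiment" is a stand-in for the real experiment name;
    # the checkpoint files go under ./log/<experiment_name>/.
    mkdir -p ./log/my_experiment
    mv /path/to/downloaded/model/* ./log/my_experiment/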

Inference on UnrealEgo2 test dataset

    bash scripts/test/unrealego2_pose-qa-avg-df_data-ue2_seq5_skip3_B32_lr2-4_pred-seq_local-device_pad.sh

        --data_dir [path to the `UnrealEgoData2_test_rgb` dir]
        --metadata_dir [path to the `UnrealEgoData2_test_sfm` dir]

Please modify the two arguments above to point to your local dataset directories. The pose predictions will be saved in ./results/UnrealEgoData2_test_pose (raw and zip versions).
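
For reference, a complete run might look like the sketch below. The /datasets/... paths are placeholders for your local copies, and it is assumed here that the script forwards extra flags to the underlying entry point; if it does not, edit the --data_dir and --metadata_dir values inside the script instead.

    # Placeholder paths; adjust to where the test data is stored locally.
    bash scripts/test/unrealego2_pose-qa-avg-df_data-ue2_seq5_skip3_B32_lr2-4_pred-seq_local-device_pad.sh \
        --data_dir /datasets/UnrealEgoData2_test_rgb \
        --metadata_dir /datasets/UnrealEgoData2_test_sfm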

Inference on UnrealEgo-RW test dataset

  • Model without pre-training on UnrealEgo2

      bash scripts/test/unrealego2_pose-qa-avg-df_data-ue-rw_seq5_skip3_B32_lr2-4_pred-seq_local-device_pad.sh
    
          --data_dir [path to the `UnrealEgoData_rw_test_rgb` dir]
          --metadata_dir [path to the `UnrealEgoData_rw_test_sfm` dir]
    
  • Model with pre-training on UnrealEgo2

      bash scripts/test/unrealego2_pose-qa-avg-df_data-ue2_seq5_skip3_B32_lr2-4_pred-seq_local-device_pad_finetuning_epoch5-5.sh
    
          --data_dir [path to the `UnrealEgoData_rw_test_rgb` dir]
          --metadata_dir [path to the `UnrealEgoData_rw_test_sfm` dir]
    

Please modify the two arguments above to point to your local dataset directories. The pose predictions will be saved in ./results/UnrealEgoData_rw_test_pose (raw and zip versions).

To obtain quantitative results for your method, please follow the instructions on our benchmark challenge page and submit the zip version.
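
Before uploading, it can be worth sanity-checking the generated archive. The exact zip name below is an assumption; use whatever file the run produced under ./results:

    # List the results directory and peek inside the zip to be submitted.
    ls -lh ./results/
    unzip -l ./results/UnrealEgoData_rw_test_pose.zip | head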