AndresCasado/pergamo

Can't find 'poses'

yyyqwq opened this issue · 12 comments

Hello, after I successfully installed ExPose, PIFuHD, and Self-Correction-Human-Parsing on Ubuntu 18.04 (Windows 10 before) and ran each of their demos, I still have some issues. Could you please help me?
I extracted frames from my own video with ffmpeg and tried to get SMPL params using ExPose. The error is "KeyError: 'poses is not a file in the archive'". I found that the *.npz files don't have a 'poses' key (they contain things like 'body_pose.npy' and 'full_pose.npy' instead). So it seems that I need to use OpenPose together with ExPose and PIFuHD?

Thanks a lot!
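
For reference, here is a minimal sketch (assuming only NumPy; the output folder in the path is hypothetical) that lists which arrays one of those ExPose .npz archives actually contains, to confirm that there is no 'poses' entry:

import numpy as np

# Hypothetical path to one of the ExPose parameter archives
archive = np.load("OUTPUT_FOLDER/00001.png_008_params.npz")

# .files lists the .npy entries stored inside the archive,
# e.g. 'body_pose', 'global_orient', 'full_pose', ... but not 'poses'
print(archive.files)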

Which script is giving you that error? Can you provide the full exception stack?

@AndresCasado
hello~
(kaolin) ycb@ycb:~/github/pergamo/encoder$ python process_amass_sequence.py
['00194.png_116_params.npz', '00125.png_095_params.npz', '00002.png_016_params.npz', '00111.png_108_params.npz', '00056.png_007_params.npz', '00022.png_024_params.npz', '00203.png_117_params.npz', '00213.png_011_params.npz', '00137.png_074_params.npz', '00097.png_211_params.npz', '00100.png_099_params.npz', '00196.png_186_params.npz', '00072.png_175_params.npz', '00019.png_180_params.npz', '00178.png_033_params.npz', '00099.png_205_params.npz', '00063.png_097_params.npz', '00207.png_178_params.npz', '00166.png_015_params.npz', '00201.png_140_params.npz', '00199.png_176_params.npz', '00004.png_083_params.npz', '00150.png_139_params.npz', '00156.png_155_params.npz', '00190.png_056_params.npz', '00075.png_181_params.npz', '00103.png_170_params.npz', '00216.png_172_params.npz', '00080.png_078_params.npz', '00053.png_102_params.npz', '00172.png_148_params.npz', '00155.png_217_params.npz', '00047.png_089_params.npz', '00070.png_030_params.npz', '00145.png_096_params.npz', '00066.png_105_params.npz', '00043.png_122_params.npz', '00147.png_067_params.npz', '00146.png_219_params.npz', '00185.png_109_params.npz', '00087.png_223_params.npz', '00140.png_090_params.npz', '00064.png_153_params.npz', '00215.png_112_params.npz', '00025.png_088_params.npz', '00001.png_008_params.npz', '00227.png_013_params.npz', '00212.png_208_params.npz', '00024.png_215_params.npz', '00188.png_129_params.npz', '00226.png_201_params.npz', '00141.png_063_params.npz', '00031.png_184_params.npz', '00003.png_131_params.npz', '00112.png_162_params.npz', '00012.png_113_params.npz', '00134.png_032_params.npz', '00168.png_025_params.npz', '00143.png_027_params.npz', '00173.png_005_params.npz', '00109.png_034_params.npz', '00048.png_023_params.npz', '00060.png_147_params.npz', '00038.png_022_params.npz', '00225.png_061_params.npz', '00221.png_114_params.npz', '00224.png_055_params.npz', '00069.png_167_params.npz', '00169.png_058_params.npz', '00214.png_128_params.npz', '00164.png_121_params.npz', '00204.png_224_params.npz', '00144.png_118_params.npz', '00015.png_169_params.npz', '00071.png_070_params.npz', '00198.png_048_params.npz', '00229.png_126_params.npz', '00058.png_125_params.npz', '00159.png_229_params.npz', '00073.png_093_params.npz', '00037.png_185_params.npz', '00013.png_200_params.npz', '00030.png_051_params.npz', '00007.png_160_params.npz', '00133.png_130_params.npz', '00021.png_039_params.npz', '00014.png_177_params.npz', '00174.png_161_params.npz', '00138.png_163_params.npz', '00209.png_069_params.npz', '00211.png_021_params.npz', '00193.png_037_params.npz', '00132.png_231_params.npz', '00135.png_191_params.npz', '00096.png_079_params.npz', '00191.png_164_params.npz', '00071.png_071_params.npz', '00197.png_141_params.npz', '00023.png_199_params.npz', '00184.png_136_params.npz', '00129.png_133_params.npz', '00153.png_098_params.npz', '00163.png_189_params.npz', '00095.png_218_params.npz', '00067.png_132_params.npz', '00082.png_182_params.npz', '00206.png_165_params.npz', '00128.png_120_params.npz', '00086.png_003_params.npz', '00098.png_156_params.npz', '00117.png_196_params.npz', '00034.png_043_params.npz', '00105.png_127_params.npz', '00040.png_036_params.npz', '00008.png_207_params.npz', '00210.png_031_params.npz', '00051.png_082_params.npz', '00160.png_174_params.npz', '00180.png_004_params.npz', '00183.png_086_params.npz', '00231.png_203_params.npz', '00223.png_134_params.npz', '00055.png_168_params.npz', '00005.png_042_params.npz', '00042.png_197_params.npz', '00115.png_018_params.npz', 
'00106.png_188_params.npz', '00010.png_123_params.npz', '00208.png_077_params.npz', '00171.png_049_params.npz', '00054.png_151_params.npz', '00079.png_045_params.npz', '00033.png_080_params.npz', '00061.png_214_params.npz', '00175.png_092_params.npz', '00018.png_111_params.npz', '00039.png_054_params.npz', '00050.png_084_params.npz', '00131.png_028_params.npz', '00107.png_143_params.npz', '00167.png_142_params.npz', '00006.png_085_params.npz', '00157.png_076_params.npz', '00057.png_001_params.npz', '00152.png_000_params.npz', '00170.png_101_params.npz', '00027.png_213_params.npz', '00220.png_146_params.npz', '00113.png_232_params.npz', '00074.png_157_params.npz', '00119.png_012_params.npz', '00104.png_150_params.npz', '00026.png_065_params.npz', '00041.png_227_params.npz', '00162.png_135_params.npz', '00127.png_052_params.npz', '00139.png_053_params.npz', '00177.png_050_params.npz', '00081.png_057_params.npz', '00028.png_091_params.npz', '00192.png_094_params.npz', '00176.png_233_params.npz', '00016.png_154_params.npz', '00136.png_171_params.npz', '00124.png_046_params.npz', '00202.png_149_params.npz', '00029.png_220_params.npz', '00102.png_145_params.npz', '00179.png_059_params.npz', '00182.png_064_params.npz', '00044.png_221_params.npz', '00151.png_107_params.npz', '00120.png_204_params.npz', '00107.png_144_params.npz', '00032.png_020_params.npz', '00154.png_017_params.npz', '00195.png_212_params.npz', '00222.png_073_params.npz', '00189.png_010_params.npz', '00092.png_152_params.npz', '00011.png_104_params.npz', '00049.png_062_params.npz', '00078.png_009_params.npz', '00106.png_187_params.npz', '00052.png_210_params.npz', '00085.png_138_params.npz', '00083.png_166_params.npz', '00181.png_038_params.npz', '00065.png_081_params.npz', '00118.png_103_params.npz', '00158.png_124_params.npz', '00101.png_066_params.npz', '00108.png_029_params.npz', '00186.png_115_params.npz', '00219.png_190_params.npz', '00228.png_230_params.npz', '00149.png_194_params.npz', '00045.png_040_params.npz', '00062.png_159_params.npz', '00116.png_035_params.npz', '00059.png_006_params.npz', '00035.png_179_params.npz', '00036.png_075_params.npz', '00093.png_202_params.npz', '00068.png_183_params.npz', '00217.png_026_params.npz', '00230.png_192_params.npz', '00205.png_002_params.npz', '00088.png_222_params.npz', '00121.png_225_params.npz', '00089.png_137_params.npz', '00046.png_119_params.npz', '00187.png_014_params.npz', '00114.png_060_params.npz', '00020.png_198_params.npz', '00148.png_047_params.npz', '00165.png_173_params.npz', '00122.png_195_params.npz', '00084.png_110_params.npz', '00123.png_158_params.npz', '00077.png_068_params.npz', '00130.png_100_params.npz', '00110.png_193_params.npz', '00094.png_106_params.npz', '00076.png_228_params.npz', '00017.png_226_params.npz', '00218.png_044_params.npz', '00200.png_087_params.npz', '00126.png_072_params.npz', '00091.png_209_params.npz', '00090.png_041_params.npz', '00142.png_019_params.npz', '00009.png_216_params.npz', '00161.png_206_params.npz']
Traceback (most recent call last):
  File "process_amass_sequence.py", line 80, in <module>
    main_example()
  File "process_amass_sequence.py", line 76, in main_example
    process_sequences(directory, motion_paths)
  File "process_amass_sequence.py", line 65, in process_sequences
    process_sequence(directory, motion_path, betas, smpl_model)
  File "process_amass_sequence.py", line 30, in process_sequence
    global_orient = motion["poses"][:, 0:3]
  File "/home/ycb/anaconda3/envs/kaolin/lib/python3.7/site-packages/numpy/lib/npyio.py", line 260, in __getitem__
    raise KeyError("%s is not a file in the archive" % key)
KeyError: 'poses is not a file in the archive'
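
For context, process_amass_sequence.py reads AMASS-style archives, where all joint rotations are stored in a single 'poses' array whose first three columns are the global orientation, which is exactly what line 30 of the script slices. A minimal sketch of that access pattern (assuming an AMASS .npz; the file name is hypothetical):

import numpy as np

# AMASS archives store one 'poses' array of shape (num_frames, num_pose_params)
motion = np.load("amass_sequence.npz")
global_orient = motion["poses"][:, 0:3]  # per-frame root orientation (axis-angle)
body_pose = motion["poses"][:, 3:]       # remaining joint rotations

ExPose saves these as separate arrays ('global_orient', 'body_pose', ...), which is why the 'poses' lookup fails on its output.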

I used the command below to generate the *.npz files, and then ran process_amass_sequence.py. The error is shown above.
python demo.py --image-folder samples \
    --exp-cfg data/conf.yaml \
    --show=False \
    --output-folder OUTPUT_FOLDER \
    --save-params [True/False] \
    --save-vis [True/False] \
    --save-mesh [True/False]
It seems that I need to generate the *.npz files with the command below instead?
python inference.py --exp-cfg data/conf.yaml \
    --datasets openpose \
    --exp-opts datasets.body.batch_size B datasets.body.openpose.data_folder folder \
    --show=[True/False] \
    --output-folder OUTPUT_FOLDER \
    --save-params [True/False] \
    --save-vis [True/False] \
    --save-mesh [True/False]

My goal is to drive an SMPL model with my own video. And after reading PIFuHD's README, I found that PIFuHD may also need OpenPose keypoint JSON files (the video only produced one JSON).

@AndresCasado

After using ExPose with OpenPose, my *.npz contains: betas.npy, body_pose.npy, center.npy, expression.npy, fname.npy, focal_length_in_mm.npy, focal_length_in_px.npy, full_pose.npy, global_orient.npy, jaw_pose.npy, joints.npy, left_hand_pose.npy, proj_joints.npy, right_hand_pose.npy, sensor_width.npy, shitf_x.npy, shift_y.npy, transl.npy, vertices.npy, v_shaped.npy.
But there is still no 'poses' key when I first run process_amass_sequence.py. The error is below:

Traceback (most recent call last):
  File "process_amass_sequence.py", line 80, in <module>
    main_example()
  File "process_amass_sequence.py", line 76, in main_example
    process_sequences(directory, motion_paths)
  File "process_amass_sequence.py", line 65, in process_sequences
    process_sequence(directory, motion_path, betas, smpl_model)
  File "process_amass_sequence.py", line 30, in process_sequence
    global_orient = motion["poses"][:, 0:3]
  File "/home/ycb/anaconda3/envs/kaolin/lib/python3.7/site-packages/numpy/lib/npyio.py", line 260, in __getitem__
    raise KeyError("%s is not a file in the archive" % key)
KeyError: 'poses is not a file in the archive'

You need to use process_reconstructed_sequence to use your own data. You need to comment out the AMASS scripts in run_regression.sh and use the reconstructed ones instead.

Sorry if it was not clear. Try that and let me know if it works.

@AndresCasado I think the reason may be that I didn't convert SMPL-X to SMPL. Let me try that first and then try your solution.

@AndresCasado
I have tried your solution and found something in 'data/trained_sequence/poses/dan-005/' ('._bp.pkl' and '._enc.pkl', some pose matrix data). But do I need to train again?
I just want to do something like your video demo, shown below (left is my video and right is the result):
[image]

Sorry, but now I don't know how to get that.

PERGAMO is a method to reconstruct t-shirts and train a regressor that generates new t-shirt meshes with the same behaviour as the reconstructed ones.

The video was done with video editing software as a visualization of the project. The blue mesh for the SMPL bodies was obtained just with ExPose.

Are you trying to reconstruct t-shirts and/or regress them? If the answer is no, then this issue is not related to this project.

Yes, I am trying to do something like dressing people in t-shirts correctly.

Ok, then you need to reconstruct your t-shirts first. Please check and edit run_recons.sh and try to execute it with the datasets we provide. Once you get it to run correctly, you need to create your own datasets with the prior models (ExPose, PifuHD, etc.). Use the provided datasets as an example of the structure you need to use.

When you have that, you should be able to train a regressor for your dataset. For this step you can also use our dataset as an example first. Check and edit train_regressor.py.

Lastly, you'll be able to predict t-shirts. Check and edit run_regression.sh.

python reconstruction_script.py --dir DataDanBrown/reconstruction_input/
Tshirt template: ./data/tshirt_4424verts.obj. Exists? True
The md5 is not the same, but ok
All sequences are correct
Saving ExPose poses paths: 100%|███████████████████████████████████████████████| 217/217 [00:00<00:00, 354149.40it/s]
Saving SMPL-converted poses paths: 100%|██████████████████████████████████████| 434/434 [00:00<00:00, 1948959.25it/s]
Saving silhouettes paths: 100%|████████████████████████████████████████████████| 434/434 [00:00<00:00, 337347.65it/s]
Saving Pifu normals paths: 100%|███████████████████████████████████████████████| 217/217 [00:00<00:00, 416741.74it/s]
Traceback (most recent call last):
  File "reconstruction_script.py", line 649, in <module>
    main()
  File "reconstruction_script.py", line 645, in main
    run_sequence_loading_data(processed_sequence_information)
  File "reconstruction_script.py", line 427, in run_sequence_loading_data
    for info_dict in info_dicts:
  File "reconstruction_script.py", line 539, in process_sequence
    'smpl_params': smpl_params[frame],
KeyError: 1

I have confirmed your issue.

The problem is the regex in line 489:

# region SMPL-converted poses
smpl_params_folder = os.path.join(base_directory, sequence_name, sequence_name + '_smpl')
smpl_pattern = re.compile(r'body_.*?(\d+).*\..*_\d+\.pkl')
smpl_params = {}
for file in tqdm.tqdm(os.listdir(smpl_params_folder), desc="Saving SMPL-converted poses paths"):
    match = smpl_pattern.match(file)
    if match:
        frame = int(match[1])

The code expects the files to be named body_****.pkl, and the dataset does not follow this naming scheme. You can just delete body_ from the regex and it will work.
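
A minimal sketch of that change (the example file name below is hypothetical and only meant to show that a name without the body_ prefix still yields a frame index):

import re

# Original pattern: only matches files whose name starts with "body_"
old_pattern = re.compile(r'body_.*?(\d+).*\..*_\d+\.pkl')
# Adjusted pattern: drop the "body_" prefix so other naming schemes match too
new_pattern = re.compile(r'.*?(\d+).*\..*_\d+\.pkl')

name = "00056.png_007_params_smpl_0.pkl"   # hypothetical file name
print(old_pattern.match(name))             # None
print(new_pattern.match(name).group(1))    # '00056' -> frame 56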

Also, while testing this, I detected another issue. Please read the temporary solution in #5.

I closed this because I could run the code, but if you find another problem you can reopen it.