pre-trained point cloud models for chairs in the pc_data
jiayaozhang opened this issue · 5 comments
Hi, I am very excited about this paper.
Currently, I am at a loss since I didn't see any pre-trained models for the chair, table, and storage categories in the code/model_output/pre_* folders.
Furthermore, after downloading the pc_data following the readme.md in the pc_data folder,
I am a little confused about how to use make_shapeassembly_encodings.py, sample_partnet.py, and train_pc_encoder.py to generate the pre-trained model specifically. Would you mind giving a script showing how to do this?
Thanks a lot!
Hi!
Thanks for your interest in the paper :)
Thanks for pointing out that the pre-trained models were missing! I just pushed them to the repo, so they should now appear.
Unfortunately, for your second question, I don't currently have a single script that ties all of those scripts together, but I can walk through how you might do so; there is also a rough sketch of a driver script after the steps below.
- First, run sample_partnet.py. You will need to change the path to where you have downloaded the PartNet dataset (in line 38). The outputs will be put in a new folder called 'partnet_pc_sample'.
- Then run make_shapeassembly_encodings.py. You will need to change sa_enc (line 8) to point to the pre-trained encoder you want to use, and change the categories on lines 20 and 27 if you switch from chair. This will save all the encodings of the shapes to a folder called sa_encs (make sure to create it before running the script).
- Then run train_pc_encoder.py. You will need to change line 22 to point to the partnet_pc_sample folder created in step 1. Then you should be able to run it from the command line with something like python3 train_pc_encoder.py sa_enc {outdir}, where {outdir} is the directory the output model files should be saved to.
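If it helps, here is a minimal sketch of how those three steps could be chained into one driver script. It assumes you have already made the in-file edits described above (PartNet path in sample_partnet.py, encoder path and categories in make_shapeassembly_encodings.py, partnet_pc_sample path in train_pc_encoder.py), and it simply calls each script as a subprocess; the output folder name pc_encoder_out is just a placeholder, not something the repo defines.

```python
# Sketch of a driver that runs the three steps in order.
# Assumes the in-file path edits described above have already been made,
# and that this is run from the code/ directory of the repo.
import os
import subprocess

# Step 1: sample point clouds from PartNet -> writes partnet_pc_sample/
subprocess.run(["python3", "sample_partnet.py"], check=True)

# Step 2: encode the sampled shapes with the pre-trained ShapeAssembly encoder.
# The script writes into sa_encs/, which has to exist beforehand.
os.makedirs("sa_encs", exist_ok=True)
subprocess.run(["python3", "make_shapeassembly_encodings.py"], check=True)

# Step 3: train the point cloud encoder, using the command line from the
# walkthrough; "pc_encoder_out" is a placeholder output directory.
out_dir = "pc_encoder_out"
os.makedirs(out_dir, exist_ok=True)
subprocess.run(["python3", "train_pc_encoder.py", "sa_enc", out_dir], check=True)
```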
Let me know if you have additional questions!
Ah, you can change
from hier_execute import hier_execute
to
from ShapeAssembly import hier_execute
Thanks!
I fixed them all. :) 👍