yunhe20/D-PCC

Question about prepare_semantickitti.py

Closed this issue · 5 comments

Hello! Thanks for your great work!
I tried to run prepare_semantickitti.py to generate the KITTI train and val datasets, but the process always gets killed during pickle.dump.
I guess it's running out of memory. Have you encountered this situation, and how big is your final semantickitti_train_cube_size_12.pkl file? Thank you very much!

I also experienced the same problem!

Have you ever run prepare_shapenet.py to process the ShapeNet dataset successfully? I also encountered bugs in this process, such as 'Not a JPEG file: starts with 0x89 0x50'.

My solution takes some time.
For SemanticKITTI, I resolved it by writing separate pickles and then merging them later.
For ShapeNet, I skipped the mesh files that raise such an error.

Ok! Thank you very much!

Could you kindly describe how to skip the mesh files that have such an error? Thanks!
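One possible way (not necessarily the commenter's exact approach) is to wrap the per-file processing in a try/except, collect the failures, and continue. Incidentally, `0x89 0x50` is the PNG magic number, so the error means a PNG texture was saved with a .jpg extension. In this sketch, `process_mesh` is a hypothetical stand-in for the repo's mesh-loading code:

```python
# Hypothetical wrapper: skip meshes whose loading/decoding fails
# (e.g. "Not a JPEG file: starts with 0x89 0x50") instead of crashing.
def prepare_all(mesh_paths, process_mesh):
    """Process each path, skipping and recording any that raise decode errors."""
    results, skipped = [], []
    for path in mesh_paths:
        try:
            results.append(process_mesh(path))
        except (OSError, ValueError) as e:  # typical image/mesh decode errors
            print(f"skipping {path}: {e}")
            skipped.append(path)
    return results, skipped
```

Logging the skipped paths makes it easy to see how many models were dropped and, if desired, to re-encode the mislabeled PNG textures as real JPEGs later.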