ymxlzgy/commonscenes

dec_sdf TypeError: 'NoneType' object is not subscriptable


Hello,

thanks for sharing your work!
I was trying to train and test CommonScenes (i.e., v2_full), but I ran into some problems during training. Following the README ("--with_SDF: set to True if train v1_full, and not used in other cases"), I set with_SDF to False. However, training fails with the error 'NoneType' object is not subscriptable in VAEGAN_V2FULL.py (at sdf_candidates = dec_sdfs[np.where(dec_objs_to_scene == i)[0]]). Could you please give me any suggestions on how to solve this issue?
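For reference, here is a minimal reproduction of the error as I understand it (the arrays below are made up; only the indexing expression comes from VAEGAN_V2FULL.py). With with_SDF set to False, dec_sdfs apparently stays None, so the indexing fails:

```python
import numpy as np

# Made-up stand-ins: in the real pipeline these come from the dataloader.
dec_sdfs = None                          # no SDF grids loaded when with_SDF is False
dec_objs_to_scene = np.array([0, 0, 1])  # scene index of each decoded object

i = 0
sdf_candidates = dec_sdfs[np.where(dec_objs_to_scene == i)[0]]
# TypeError: 'NoneType' object is not subscriptable
```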

Hi, thanks for trying. Can you run the CommonScenes shell command once and tell me your output? with_SDF should be set to True. If so, I guess the README is not right, and I will correct it.
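For anyone hitting the same error, a hypothetical early check like the one below (not part of the repo) would make the failure mode clearer than the NoneType error deep inside VAEGAN_V2FULL.py; v2_full trains the shape branch on SDF grids, so dec_sdfs must be populated:

```python
def check_sdf_inputs(dec_sdfs, with_SDF: bool) -> None:
    # Hypothetical guard, not part of the repo: v2_full needs per-object SDF
    # grids, so training has to be launched with --with_SDF True.
    if not with_SDF:
        raise ValueError("v2_full requires --with_SDF True so that dec_sdfs is populated")
    if dec_sdfs is None:
        raise FileNotFoundError("SDF grids were requested but none were loaded; "
                                "check the 3D-FUTURE-SDF path")
```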

@ymxlzgy many thanks for your reply!

The usage of with_SDF is clear to me now with your help. I set with_SDF to True so that the shape branch can be trained. But another problem appears: "No such file or directory" for ".../3D-FUTURE-SDF/a7621218-bd3a-48ab-9694-9f307bb4ba1e/ori_sample_grid.h5". Could you please provide guidelines on preparing those .h5 SDF files?

First of all, thank you for your contribution of the SG-FRONT dataset! It is very helpful for 3D scene synthesis. But I am a little confused about the input data format. Following the README, I've finished the data preparation steps: 1) downloading 3D-FUTURE and 3D-FRONT, 2) preprocessing them following the ATISS guidelines, and 3) downloading SG-FRONT (without running the automatic labeling scripts, since I suppose they are not necessary, are they?). I'm not sure about the structure under your folder "/media/ymxlzgy/Data/Dataset/3D-FRONT" (i.e., the --dataset parameter). Could you please show a demo? Besides, the data pre-processed by ATISS does not seem to be used.

If I've missed anything, please let me know. Thank you so much~~

Hi,
Thanks for bearing with me. I have now corrected the README and added a link to download 3D-FUTURE-SDF. It is generated from the 3D-FUTURE meshes using tools provided by SDFusion; you can click here for more info. We set a fixed resolution for each mesh. If you want higher-resolution ones, you need to regenerate 3D-FUTURE-SDF by adjusting the hyperparameters in the scripts. As for your second question, it is actually quite a mess under my folder...haha... I haven't found time to clean it up, but I have added an example of the structure to the README for your reference. Can you check it and tell me if more problems show up?
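Once downloaded, a quick way to sanity-check one of the files is to open it with h5py. This is just a sketch: the dataset key name inside the file depends on the SDFusion preprocessing script, so list the keys first and adjust.

```python
import h5py
import numpy as np

# Path taken from the error message above; the key name is not guaranteed,
# so inspect f.keys() before relying on it.
path = "3D-FUTURE-SDF/a7621218-bd3a-48ab-9694-9f307bb4ba1e/ori_sample_grid.h5"
with h5py.File(path, "r") as f:
    print(list(f.keys()))
    key = list(f.keys())[0]
    sdf = np.asarray(f[key])
    print(sdf.shape, sdf.dtype)  # fixed-resolution grid of signed distances
```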

Best,
Guangyao

@weiyao1996 Hi, I will close this issue for now. If you still have questions, feel free to open it again or start a new issue to let me know!