scene_name.txt_sem_label.npy
Changyangli opened this issue · 4 comments
Hi,
Thanks again for releasing the source code. I have one more question about the gorilla-3d evaluator: at line 212 in test.py, evaluator.process(inputs, outputs) tries to read ground-truth labels that are expected to be saved as "scene_name.txt_sem_label.npy" files. Where can I actually get those files? It seems that prepare_data.sh does not generate such gt files either. Thanks!
Oh, thanks for your question.
I may have missed the generation of the semantic labels; my label files were generated long ago, so I did not run into this problem and forgot about this preprocessing step.
I will add the generation step as soon as possible.
Thanks!!!
If you want to run the inference, you can ignore the evaluator and run the inst_evaluator only, which performs the instance segmentation evaluation on its own.
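In case it helps, here is a minimal sketch of what that looks like in the test loop; the names (model, dataloader, evaluator, inst_evaluator) follow test.py but are assumptions here, not the exact code:
# Minimal sketch (assumed names, following test.py): skip the semantic evaluator,
# which needs the *_sem_label.npy files, and run only the instance evaluator.
import torch

model.eval()
with torch.no_grad():
    for inputs in dataloader:
        outputs = model(inputs)
        # evaluator.process(inputs, outputs)    # semantic evaluation: skipped, GT files missing
        inst_evaluator.process(inputs, outputs)  # instance evaluation runs without those files
inst_evaluator.evaluate()  # report instance segmentation results only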
I've fixed the semantic evaluation.
Please update gorilla3d (git pull origin dev) and run the following script in the dataset root (it is the last command in prepare_data.sh):
# prepare validation dataset gt
python -m gorilla3d.preprocessing.scannetv2.segmentation.prepare_data_inst_gttxt --data-split val
It will generate the semantic segmentation labels.
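For reference, here is a rough sketch of what that step produces; the *_inst_nostuff.pth filename and the (coords, colors, sem_labels, inst_labels) tuple layout are assumptions based on the usual ScanNet v2 preprocessing, not necessarily the exact gorilla3d code:
# Rough sketch of the GT-label generation, assuming prepare_data.sh has already written
# per-scene "*_inst_nostuff.pth" files; the real logic lives in
# gorilla3d.preprocessing.scannetv2.segmentation.prepare_data_inst_gttxt.
import glob
import numpy as np
import torch

for scene_file in sorted(glob.glob("val/*_inst_nostuff.pth")):
    coords, colors, sem_labels, inst_labels = torch.load(scene_file)  # assumed tuple layout
    out_file = scene_file.replace("_inst_nostuff.pth", ".txt_sem_label.npy")  # e.g. scene0011_00.txt_sem_label.npy
    np.save(out_file, np.asarray(sem_labels))  # per-point semantic ids the evaluator expects to load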
After that, you should be able to evaluate the semantic segmentation successfully. The inference script:
CUDA_VISIBLE_DEVICES=0 python test.py --config config/default.yaml --pretrain pretrain.pth --eval --semantic
Cool, thanks for fixing the issue and replying so quickly! And again, thanks for sharing this amazing repo!